Cross Team Collaboration (Part 2)

Written by: Michael Cirioli
8 min read

(Part 2 of a two-part blog post)

This is the second part of a two-part blog post on cross team collaboration (see part 1). This feature greatly simplifies connecting pipelines together, increasing automation and collaboration. In part two, we will cover how to add logic to the trigger mechanism through the use of JMESPath queries.

The power of JMESPath

Triggering builds based on matching simple event strings is a powerful tool for decoupling your upstream jobs from their downstream dependencies, but that just scratches the surface of what is possible with the help of JMESPath expressions. In the real world there may be many downstream jobs which depend on a particular artifact. Building on the fictional com.example:supply-chain-lib example from part one, let's imagine there are two downstream pipelines that depend on this jar. The first is a "bleeding edge" pipeline that should be built any time a new -SNAPSHOT version of supply-chain-lib is published. The other is your "production" pipeline, which should only be built when a non-SNAPSHOT version of supply-chain-lib has been released. Ideally, you should not have to update your triggers every time the upstream job changes its version number. Both cases can be handled with a different type of eventTrigger: jmespathQuery.

...but first, some background

JMESPath is a query language for searching within JSON objects, used by tools such as the AWS and OpenShift CLIs to filter their verbose output. To understand how it can be used to solve our real-world scenario, we must first take a look at what those event messages really look like under the hood.

Behind the scenes, pipeline events are represented using a JSON data structure. A simpleEvent is really just a convenient way to create a JSON object with a simple event value. For instance, the message from our previous example:

publishEvent simpleEvent('com.example:supply-chain-lib:1.1-SNAPSHOT:jar')

actually results in the following JSON object being published with the event:

{
    "event": "com.example:supply-chain-lib:1.1-SNAPSHOT:jar",
    "source": {
        "type": "JenkinsTeamBuild",
        "buildInfo": {
            "build": 14,
            "job": "supply-chain-lib",
            "jenkinsUrl": "https://zoolander.beescloud.com/teams-supply-chain/",
            "instanceId": "b5d5e0e9de1f35d9e2d3815265e069d4",
            "team": "supply-chain"
        }
    }
}

There is no inherent schema required for an event message, aside from the fact that it must be a valid JSON object and that Jenkins reserves the "source" key as an event attribute. That said, there are a couple of conventions to be aware of. First, since a message must be a valid JSON object, our simple message must be part of a name/value pair. Because we only supply the event value in a simpleEvent, Jenkins automatically binds it to an attribute named "event". As you will see in a bit, it is also possible to publish your own custom JSON events with any schema of your choice. It is recommended (but not required) that you include an "event" attribute with a string value in order to support downstream projects/dependencies that prefer to use a simpleMatch as a trigger. In addition, Jenkins adds supplemental information about the job that published the event under the "source" attribute. This attribute name is reserved by Jenkins, and it will be overwritten if you include it at the root level of your own schema.

Triggering events using a JMESPath query

Now that we have a better understanding of what an event message actually looks like, we can begin to see how to write a query that triggers based on certain attributes and values. For our 'bleeding edge' pipeline, we would like to trigger a build any time an event for a SNAPSHOT version of supply-chain-lib is published. To do that, we will use the jmespathQuery event trigger:

eventTrigger jmespathQuery("contains(event,'com.example:supply-chain-lib:') && contains(event,'-SNAPSHOT')")

Let's break this down so we can understand how it works.

  • eventTrigger - this is just our normal way of beginning a pipeline trigger

  • jmespathQuery - new pipeline syntax that says you want to use a query for your trigger condition

  • contains(event,'com.example:supply-chain-lib:') && contains(event,'-SNAPSHOT') - this is the actual query we want to run against the JSON event message. The contains function tests whether an element contains a substring. It takes two arguments: the first is the path of the attribute we want to check, and the second is the string we want to test for. For our 'bleeding edge' build, we want to listen for SNAPSHOT builds (any version) of the supply-chain-lib artifact, so we test for the presence of two strings, "com.example:supply-chain-lib:" and "-SNAPSHOT", and combine the two checks with the && operator.

    • Important - since the query itself is wrapped in double quotes, string literals inside the JMESPath query must use single quotes!

The complete declarative pipeline code for this job looks like this:

pipeline {
    agent any
    triggers {
        eventTrigger jmespathQuery("contains(event,'com.example:supply-chain-lib:') && contains(event,'-SNAPSHOT')")
    }
    stages {
        stage('build') {
            steps {
                echo 'a SNAPSHOT build of com.example:supply-chain-lib was seen!'
            }
        }
    }
}

Similarly, for our production build, we can use a similar query, but check that the event does not contain '-SNAPSHOT':

pipeline {
    agent any
    triggers {
        eventTrigger jmespathQuery("contains(event,'com.example:supply-chain-lib:') && !contains(event,'-SNAPSHOT')")
    }
    stages {
        stage('build') {
            steps {
                echo 'a production (non-SNAPSHOT) build of com.example:supply-chain-lib was seen!'
            }
        }
    }
}

These examples are just the tip of the iceberg when it comes to the power of JMESPath queries. We will explore a few more examples below, but please check out http://jmespath.org/ for full documentation and examples. The site also has a handy sandbox feature where you can play around with queries using your own JSON data!

The final piece - complex JSON events

As we saw in the last section, you can use a jmespathQuery to create an intelligent trigger for a simpleEvent. It is also possible to create more complex events with a schema of your own design by using a jsonEvent, which takes a JSON string as its argument:

publishEvent jsonEvent('{"event":"com.example:supply-chain-lib:1.1-SNAPSHOT:jar"}')

You might recognize that the JSON in this example looks very much like the JSON message from our simpleEvent example, and you'd be right! Because all event messages are represented as JSON, the above jsonEvent is functionally equivalent to this simpleEvent:

publishEvent simpleEvent('com.example:supply-chain-lib:1.1-SNAPSHOT:jar')

Furthermore, you can trigger on this event using a simpleMatch. Recall that Jenkins transforms a simpleEvent into its JSON representation by assigning the value of the string argument to an attribute named "event" when the message is published. As long as your jsonEvent has this attribute with a string value, downstream teams will be able to use a simpleMatch. It is generally recommended to do this as a best practice when creating JSON events so that downstream teams have as much flexibility as possible when creating their triggers.
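The equivalence is easy to see by parsing the jsonEvent payload. This stdlib-only Python sketch shows that the parsed JSON carries the same top-level "event" attribute that Jenkins generates automatically for a simpleEvent:

```python
import json

# The string argument passed to jsonEvent above.
payload = '{"event": "com.example:supply-chain-lib:1.1-SNAPSHOT:jar"}'

# Parsing yields a top-level "event" attribute -- the same attribute Jenkins
# creates for simpleEvent('com.example:supply-chain-lib:1.1-SNAPSHOT:jar'),
# so a downstream simpleMatch on that string fires for either publisher.
print(json.loads(payload)["event"])
```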

Now that we understand the basics, let's build upon our supply-chain-lib example to create a more complex scenario. Instead of building a single artifact, the Supply Chain team now maintains a complex Maven build which produces a number of artifacts using several distinct pipelines. For example, the following pipeline publishes a JSON event when a new -SNAPSHOT build of the supplyChain.war succeeds:

pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                echo 'building com.example:supplyChain:1.2-SNAPSHOT:war'
            }
        }
        stage('notify') {
            steps {
                echo 'a new build of com.example:supplyChain:1.2-SNAPSHOT:war succeeded!'
                publishEvent jsonEvent('''
                {
                    "event": "com.example:supply-chain-webapp:1.2-SNAPSHOT:war",
                    "mavenArtifacts": [
                        {
                            "groupId": "com.example",
                            "artifactId": "supply-chain-tools",
                            "version": "2.1-SNAPSHOT",
                            "type": "jar"
                        },
                        {
                            "groupId": "com.example",
                            "artifactId": "supply-chain-lib",
                            "version": "1.0",
                            "type": "jar"
                        },
                        {
                            "groupId": "com.example",
                            "artifactId": "supplyChain",
                            "version": "1.0-SNAPSHOT",
                            "type": "war"
                        }
                    ]
                }
                ''')
            }
        }
    }
}

Additionally, remember that Jenkins will add extra information about the job that published this event under the "source" attribute. Armed with that knowledge, we can start to write JMESPath queries for our downstream jobs!

Example 1

The Supply Chain team wants to trigger a build when a new build of supplyChain.war (either a -SNAPSHOT or a production version) is available from their internal pipeline.

The JSON that is published for this event is:

{
    "event": "com.example:supply-chain-webapp:1.2-SNAPSHOT:war",
    "mavenArtifacts": [
        {
            "groupId": "com.example",
            "artifactId": "supply-chain-tools",
            "version": "2.1-SNAPSHOT",
            "type": "jar"
        },
        {
            "groupId": "com.example",
            "artifactId": "supply-chain-lib",
            "version": "1.0",
            "type": "jar"
        },
        {
            "groupId": "com.example",
            "artifactId": "supplyChain",
            "version": "1.0-SNAPSHOT",
            "type": "war"
        }
    ],
    "source": {
        "type": "JenkinsTeamBuild",
        "buildInfo": {
            "build": 9,
            "job": "supplyChain_internal",
            "jenkinsUrl": "https://zoolander.beescloud.com/teams-supplychainteam/",
            "instanceId": "27d1b07e4739d2db0dc99e5ef8a84b98",
            "team": "SupplyChainTeam"
        }
    }
}

The trigger would look like:

eventTrigger jmespathQuery("contains(mavenArtifacts[*].artifactId, 'supplyChain') && contains(source.buildInfo.job, 'supplyChain_internal')")

This query looks through the mavenArtifacts array for an object whose artifactId equals "supplyChain", and also checks that the event originated from the "supplyChain_internal" job. When both conditions are true, the downstream pipeline is triggered.

Conclusion

In this two-part blog post, we have learned how cross team collaboration can reduce handoffs by increasing automation and collaboration between teams. JMESPath gives users granular control over when pipelines are triggered. As a best practice, be mindful of where in your pipeline you publish your events; for example, don't publish before your build has completed successfully. Also, a pipeline can publish multiple events, but it can have no more than one trigger. Now is the time to implement cross team collaboration in your organization.
