Using the Pipeline Plugin to Accelerate Continuous Delivery -- Part 3

Written by: apemberton

Scaling Your Pipeline

As you build more of your DevOps pipelines with Pipeline, your needs will grow more complex. The CloudBees Jenkins Platform helps scale Pipeline for these more advanced uses.

Checkpoints

One powerful aspect of the CloudBees extensions to Pipeline is the checkpoint syntax. Checkpoints allow capturing the workspace state so it can be reused as a starting point for subsequent runs:

checkpoint 'Functional Tests Complete'

Checkpoints are ideal after a long-running portion of your pipeline has completed, for example a robust functional test suite.
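As a hedged sketch of how a checkpoint might be placed after such a stage (the repository URL, the Maven tool name 'm3' and the stash name are illustrative assumptions, and the checkpoint step itself requires the CloudBees extensions):

stage 'functional-tests'
node {
     // illustrative repository and Maven tool name ('m3'); adjust to your environment
     git 'https://github.com/example/todo-api.git'
     withEnv(["PATH+MAVEN=${tool 'm3'}/bin"]) {
          sh 'mvn verify'
     }
     // stash the tested workspace so a restarted run does not depend on this node's files
     stash includes: '**', name: 'tested-source'
}
// a run restarted from this checkpoint resumes here, skipping the long test stage above
checkpoint 'Functional Tests Complete'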

Pipeline Templates

The CloudBees Jenkins Platform has a robust Template feature. CloudBees Jenkins Platform users can create templated build steps, jobs, folders and publishers. Since Pipelines are a new job type, authors can create Pipeline templates so that similar pipelines can simply reuse the same Pipeline job template. More information on Templates is available on the CloudBees website:

https://cloudbees.com/products/cloudbees-jenkins-platform/enterpriseedition/features/templates-plugin

Tying It Together: Sample Pipeline

The following pipeline is an example tying together several of the Pipeline features we learned earlier. While not exhaustive, it provides a basic but complete pipeline that will help jump-start your pipeline development:

stage 'build'
node {
     git 'https://github.com/cloudbees/todo-api.git'
     withEnv(["PATH+MAVEN=${tool 'm3'}/bin"]) {
          sh "mvn -B –Dmaven.test.failure.ignore=true clean package"
     }
     stash excludes: 'target/', includes: '**', name: 'source'
}
stage 'test'
parallel 'integration': {
     node {
          unstash 'source'
          withEnv(["PATH+MAVEN=${tool 'm3'}/bin"]) {
               sh "mvn clean verify"
          }
     }
}, 'quality': {
     node {
          unstash 'source'
          withEnv(["PATH+MAVEN=${tool 'm3'}/bin"]) {
               sh "mvn sonar:sonar"
          }
     }
}
stage 'approve'
timeout(time: 7, unit: 'DAYS') {
     input message: 'Do you want to deploy?', submitter: 'ops'
}
stage name: 'deploy', concurrency: 1
node {
     unstash 'source'
     withEnv(["PATH+MAVEN=${tool 'm3'}/bin"]) {
          sh "mvn cargo:deploy"
     }
}

Docker with Pipeline

The Docker Pipeline plugin exposes a docker global variable that provides a DSL for common Docker operations, requiring only a Docker client on the executor running the steps (use a label in your node step to target a Docker-enabled agent).
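For instance, a minimal sketch of targeting such an agent (the label 'docker' is an assumption about how your Docker-enabled agents are labeled):

node('docker') {
     // these steps run on an agent labeled 'docker', where a Docker client is available
     sh 'docker version'
}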

By default, the docker global variable connects to the local Docker daemon. You may use the docker.withServer step to connect to a remote Docker host. The image step provides a handle to a specific Docker image and allows executing several other image-related steps, including the image.inside step. The inside step will start the specified container and run a block of steps in that container:

docker.image('maven:3.3.3-jdk8').inside('-v ~/.m2/repo:/m2repo') {
     sh 'mvn -Dmaven.repo.local=/m2repo clean package'
}

When the steps are complete, the container will be stopped and removed. There are many more features of the Docker Pipeline plugin; additional steps are outlined in the Docker example at the end of this blog post.
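As a hedged sketch of a few of those additional steps (the remote host address, credential IDs, registry URL, repository and image name below are placeholders, not values from this post), docker.withServer and docker.withRegistry can wrap an image build and push:

node('docker') {
     // illustrative repository; it is assumed to contain a Dockerfile at its root
     git 'https://github.com/example/my-app.git'
     docker.withServer('tcp://docker.example.com:2376', 'docker-server-credentials') {
          docker.withRegistry('https://registry.example.com', 'registry-credentials') {
               def image = docker.build('example/my-app')
               image.push('latest')
          }
     }
}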

Extending Pipeline

Like all Jenkins features, Pipeline relies on Jenkins' extensible architecture, allowing developers to extend Pipeline's features.

Plugin Capability

There are a large number of existing plugins for Jenkins. Many of these plugins integrate with Pipeline as build steps, wrappers and so on. Plugin maintainers must ensure their plugins are Pipeline-compatible, and the community has documented the steps to ensure compatibility. More details on plugin development and Pipeline compatibility are in the plugin developer guide: https://github.com/jenkinsci/workflow-plugin/blob/controller/COMPATIBILITY.md#plugindeveloper-guide

Custom DSL

Beyond compatibility, plugin maintainers can also add specific Pipeline DSL for their plugins' behavior. The community has documented the steps to take to add plugin-specific DSL. One example is the Credentials Binding plugin, which contributes the withCredentials syntax.
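As a short, hedged illustration of the syntax that plugin contributes (the credential ID 'deploy-token' and the target URL are placeholders for values defined in your own Jenkins instance):

node {
     withCredentials([[$class: 'StringBinding', credentialsId: 'deploy-token', variable: 'TOKEN']]) {
          // the secret is exposed to this block as $TOKEN and masked in the console log
          sh 'curl -H "Authorization: Bearer $TOKEN" https://deploy.example.com/api/release'
     }
}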

Full Syntax Reference Card

Following is a full Jenkins Pipeline syntax reference card. Of course, as you add plugins or as plugins are updated, new Pipeline script elements will become available in your environment. The Pipeline Snippet Generator and UI will automatically add these and any associated help text so you know how to use them!

The reference card covers the following categories: Basics, Advanced, File System, Flow Control and Docker.


Using the Pipeline Plugin to Accelerate Continuous Delivery — Part 1
Using the Pipeline Plugin to Accelerate Continuous Delivery — Part 2
Using the Pipeline Plugin to Accelerate Continuous Delivery — Part 3
