A Docker Delivery Pipeline for You to Try Out

Written by: Michael Neale

I am speaking at DockerCon next month, and wanted a little self-contained demo for people to try out.

In this case I set up a simple pipeline (really a bunch of jobs that depend on each other with triggers, plus a delivery pipeline view) - it involves building and handling a Docker image (testing, promoting and so on). I also wanted to show other ways Docker can be used as part of a build. Basically: a bunch of pre-configured jobs and things, set up and ready to go. To start the pipeline, kick off "App Build and Test", then flick to the pipeline view tab.

As quite a few plugins and configuration steps are involved in getting Jenkins to this point, I wanted to bundle it all up as a one-liner people could run - like a pre-made Jenkins distribution - using Docker, of course.

That one-liner is all you need to get going: browse to localhost:8080 and you are good to go - no more configuration or installation, the jobs will already be set up. **
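The exact command isn't reproduced here, but a one-liner of this sort looks roughly like the following - the image name is a placeholder, not the actual demo image:

```shell
# Run the pre-baked Jenkins demo image (placeholder name - substitute the real one).
# --privileged is needed here because the image runs Docker inside Docker (see ** below).
docker run --privileged -p 8080:8080 michaelneale/jenkins-demo
```

Publishing port 8080 is what makes the Jenkins UI reachable at localhost:8080.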

The aim here is to show a fictional pipeline (most of the jobs do very little; one of them uses the docker-build-publish plugin). Building and publishing Docker images as "an application" is an appropriate use of Docker. In this case the Dockerfile is central to defining the build - everything revolves around it (even if another script is used to bootstrap the process).
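Under the hood, a build-and-publish job boils down to the standard Docker workflow. A sketch, assuming a Dockerfile in the current directory and a hypothetical repository name:

```shell
# Build the image from the repo's Dockerfile and tag it
# ("example/app" is a hypothetical repository name).
docker build -t example/app:1.0 .
docker tag example/app:1.0 example/app:latest

# Promote it by pushing to the registry.
docker push example/app:1.0
```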

You may also use Docker as a containment mechanism for builds - currently probably one of its more popular uses. As it is quite a new technology, it takes a while before people roll it out into production, but in the meantime it can serve very well as a means to get a known environment for running predictable builds. See the job called "build inside a container" for a simple example of this (it uses the repo's Dockerfile to create an environment to run the build in).
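A minimal sketch of the "build inside a container" idea - create an image from the repo's Dockerfile, then run the actual build step inside it (the image name and build command here are illustrative):

```shell
# Create a known, repeatable build environment from the repo's Dockerfile...
docker build -t app-build-env .

# ...then run the build inside it, mounting the workspace into the container.
docker run --rm -v "$PWD":/src -w /src app-build-env make test
```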

Have a look and let me know what you think.

I am fascinated by Docker as a delivery mechanism for server software - it is quite a time saver. In this case I was able to start from a Jenkins/Docker "trusted build", set up Jenkins how I wanted, and then cp the "jenkins_home" out of the running container into a new image built on a base image I maintain (michaelneale/jenkins-docker-executors). This is a neat, repeatable way to package up setups that you know will work when people start them up.
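That packaging trick can be sketched roughly like this - the container name and final image tag are illustrative:

```shell
# Copy the configured jenkins_home out of the running container...
docker cp configured-jenkins:/var/jenkins_home ./jenkins_home

# ...then bake it into a new image on top of the base image.
cat > Dockerfile <<'EOF'
FROM michaelneale/jenkins-docker-executors
ADD jenkins_home /var/jenkins_home
EOF
docker build -t my-jenkins-demo .
```

Anyone who runs the resulting image gets the same pre-configured Jenkins, which is what makes the setup repeatable.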


** This actually runs Docker inside Docker. You can also run it by "bind mounting" in the Docker socket - a more production-like way of using Docker, if you need it. It is quite reasonable to bind mount the Docker socket and let a process inside a container control the Docker daemon on the host.
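Bind mounting the socket looks like this - the process inside the container then talks to the host's Docker daemon rather than a nested one (the image name is a placeholder):

```shell
# Mount the host's Docker socket so the containerized Jenkins
# can drive the host daemon instead of running Docker-in-Docker.
docker run -p 8080:8080 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  michaelneale/jenkins-demo
```

Note that anything with access to that socket effectively has root on the host, so this is a trust decision as much as a convenience.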

@michaelneale (twitter)
