This is the first in a series of blog posts about how the combination of Jenkins, Docker and continuous delivery practices can greatly accelerate software delivery pipelines, and with them, the pace of innovation.
Historically, we have seen waves of innovation hit the information technology industry. Typically, these waves have happened separately in the areas of infrastructure (mainframe to distributed to virtual), application architecture (monolithic to client-server to n-tier web) and process/methodology (ITIL, for example). But if you look around, you will see that right now we are in the midst of not just another wave in one of these areas, but a complete transformation that encompasses all three at once. We are watching the infrastructure space be completely disrupted by lightweight container technology (currently best represented by Docker). We are seeing application architectures move to a distributed microservices model, allowing value-added business logic to be quickly added or changed in order to better serve the end-user. Additionally, we are seeing bold new concepts such as “collaboration,” “failing is okay, just fail fast” and “over-communicate with feedback” take hold across IT organizations through the hot methodology trends of DevOps and continuous delivery (CD). The really interesting part is that these three waves are feeding on each other and amplifying the ultimate effect on IT: the ability to provide more value faster to the business/consumer/user.
In this series of posts, we will discuss how two of these areas (Docker and CD) are coming together to accelerate the innovation possible in microservices-based applications. This radical change in IT tooling and process has the potential to have a huge impact on all of us. Executing continuous delivery on applications running in Docker containers will let enterprise IT achieve the exponential growth in innovation that we have seen in consumer and mobile applications over the past five years.
Docker, Jenkins and Continuous Delivery
In just two years, Docker has grown from nothing to more than 100,000 “Dockerized” applications and close to 1,000 contributors. It has heavily rocked the IT boat and turned the wave of microservices-based application architectures into a new reality. Lightweight container technologies - particularly Docker - are rapidly changing the way we build, deliver and update software. Docker brings a new level of simplicity to defining and creating applications or services by encapsulating them in containers. That simplicity lets developers and operations personnel use Docker containers as a common currency, eliminating a source of friction between development and operations. While Docker is an amazing success story, the way people use Docker in their development and delivery processes is still very much a work in progress. Companies using Docker are discovering how it fits within their own environments, as well as how to use Docker and Jenkins together most effectively across their software delivery pipelines.
Meanwhile, since its start a decade ago, Jenkins has defined what continuous integration (CI) is all about and is now experiencing tremendous acceleration in adoption as the market moves towards DevOps and continuous delivery (CD). Both of these trends are all about process automation, shared goals and the versioning of everything from code to infrastructure configuration. The “automation of everything” plays to a core strength of Jenkins: its ability to automatically orchestrate any number of processes utilizing any number of technologies, such as Docker, throughout the software lifecycle. Jenkins is currently known to actively run on at least 120,000 clusters and 300,000 servers, and offers more than 1,000 plugins providing comprehensive integration with third-party systems.
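To make the orchestration idea concrete, here is a minimal, hypothetical sketch of a Jenkins Pipeline that runs its build and tests inside a Docker container and then packages the result as a Docker image. The image name, stage names and shell commands are illustrative placeholders, not details from this post:

```groovy
// Jenkinsfile (Declarative Pipeline) - a hypothetical sketch of a
// Docker-backed delivery pipeline; image names and commands are placeholders.
pipeline {
    // Run every stage inside a disposable container, so the build
    // environment is versioned and reproducible alongside the code.
    agent { docker { image 'node:18-alpine' } }
    stages {
        stage('Build') {
            steps { sh 'npm ci' }       // reproducible dependency install
        }
        stage('Test') {
            steps { sh 'npm test' }     // fail fast on broken changes
        }
        stage('Package') {
            steps {
                // Bake the tested application into its own image,
                // ready to be promoted through the delivery pipeline.
                sh 'docker build -t myorg/myapp:${BUILD_NUMBER} .'
            }
        }
    }
}
```

Checking a file like this into the application repository is one common way teams combine Jenkins, Docker and CD: the same container image defines the build environment everywhere, and every commit flows through the same automated stages.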
As the company helping to drive Jenkins adoption across enterprises, CloudBees has been in the middle of the excitement and discovery process around how to best leverage containers. This experience has informed us as to how Jenkins can better enable people to take advantage of Docker, and how to transform the excitement in the industry and our hard-won discoveries into concrete features in Jenkins that make the end-to-end software delivery process faster, more predictable, more manageable and drama-free. That’s what developers and operations people want.
We will take a look at the state of the Docker + Jenkins world today through a series of blog posts. We will cover the following topics:
- Overview of what the combination of Jenkins + Docker + CD practices bring to the table for software delivery
- How people are actually using Jenkins + Docker + CD, together, to speed up software delivery
- What CloudBees and the Jenkins community have done to make Jenkins the obvious choice to use with Docker as the foundation for CI and CD practices
Finally, we’ll discuss what remains to be done to complete the vision of a world in which Jenkins and Docker, together, bring software development and delivery processes to an entirely new level.
This blog post is authored by the following CloudBees executives:
- Sacha Labourey, CEO
- Dan Juengst, senior director, product marketing
- Steve Harris, advisor
Read the entire series:
- Jenkins, Docker and DevOps: The Innovation Catalysts — Part 1
- Jenkins, Docker and DevOps: The Innovation Catalysts — Part 2
- Jenkins, Docker and DevOps: The Innovation Catalysts — Part 3
- Jenkins, Docker and DevOps: The Innovation Catalysts — Part 4