Jenkins with Docker, Docker-Compose & Docker Swarm TUTORIAL

This tutorial will give you a complete understanding of Jenkins Integration with Docker, Docker-Compose, and Docker Swarm with simple examples:

Jenkins is an open-source automation server for CI/CD that enables developers to build, test, and deploy their software. Docker is an open platform for creating, running, and deploying applications in containers.

Docker definitions:

  • Image: a read-only template that contains instructions on how to create containers.
  • Container: an instance of a Docker image that contains the code and its dependencies.
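The image/container relationship can be seen directly on the command line. A minimal sketch, assuming Docker is installed and the daemon is running (the public tomcat image is used purely as an example):

```shell
docker pull tomcat:9.0                  # fetch a read-only image (the template)
docker run -d --name web1 tomcat:9.0    # container #1, instantiated from that image
docker run -d --name web2 tomcat:9.0    # a second, independent container from the same image
docker ps                               # both containers are listed; `docker images` lists the image once
```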

In this article, we will look at how to create Jenkins pipelines to do the following:

  • Use Jenkins to automate the build of a Docker image and push the image to Docker Hub.
  • Run Jenkins with Docker Compose to run multiple containers using YAML files.
  • Use Jenkins and Docker stack to deploy an application stack to a swarm that consists of multiple nodes as a cluster.
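The examples in this article use freestyle jobs, but the first goal can also be sketched as a declarative pipeline using the Docker Pipeline plugin. This is an illustrative sketch only; the agent label, image name, and credentials ID below are assumptions, not values from this tutorial:

```groovy
pipeline {
    agent { label 'docker-agent' }                       // assumed agent label
    environment {
        IMAGE = 'your-dockerhub-user/your-app:latest'    // assumed image name
    }
    stages {
        stage('Build image') {
            steps {
                script { dockerImage = docker.build(env.IMAGE) }
            }
        }
        stage('Push image') {
            steps {
                script {
                    // 'dockerhub-creds' is an assumed Jenkins credentials ID
                    docker.withRegistry('https://index.docker.io/v1/', 'dockerhub-creds') {
                        dockerImage.push()
                    }
                }
            }
        }
    }
}
```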

Jenkins with Docker, Docker-Compose & Docker Swarm

Jenkins Integration With Docker, Docker-Compose And Docker Swarm (Stack Deploy)

We will typically follow the below workflow for Jenkins and Docker integration.

workflow for Jenkins and Docker integration

Install Docker Plugins in Jenkins

To get started with the integration, install the below 2 plugins in Jenkins. The Docker plugin is a Jenkins cloud plugin that lets Jenkins dynamically provision Docker containers as build agents. You will see step by step how to make use of them.

  • Docker plugin
  • Docker Compose Build Step Plugin

Installing the Docker plugins in Jenkins

Set Up Jenkins Slave Nodes on AWS EC2 VMs

If the Jenkins master is on Ubuntu/RHEL and the slave nodes are also on Ubuntu/RHEL, you will need to copy the master's public key (under ~/.ssh/) to the slave node's .ssh/authorized_keys file.

Step #1: Log in to the Jenkins MASTER machine and create an SSH key pair.

Use the following command to create a key pair:

# ssh-keygen
# cat .ssh/

Command to create the key-pair

Copy the content and log in to the slave node. Add the copied content to authorized_keys.

# vi .ssh/authorized_keys

From the master machine, SSH to the slave node using the below command. It will ask you to accept the SSH fingerprint; type yes and press Enter. You should now be able to SSH into the slave node.

ssh userid@EC2VM-IP-Address
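The manual steps above can also be sketched non-interactively. The key file name below is a placeholder, and the commented ssh-copy-id line (which appends the public key to the slave's authorized_keys for you) assumes password login to the slave is possible once:

```shell
# generate an RSA key pair non-interactively (empty passphrase) into ./jenkins_master_key
ssh-keygen -t rsa -b 4096 -N "" -f ./jenkins_master_key -q

# the .pub half is what belongs in the slave's ~/.ssh/authorized_keys;
# ssh-copy-id appends it for you:
#   ssh-copy-id -i ./jenkins_master_key.pub userid@EC2VM-IP-Address
cat ./jenkins_master_key.pub
```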

Step #2: Create a new Slave Node

Go to Manage Jenkins -> Manage nodes and clouds

Click on + New Node.

New Node

Click on Create.

Provide a Name and enter the Remote root directory of the slave machine.

Configure Node

Add the Host IP of the Slave machine (EC2 VM) and Add credentials.

Add the Host IP

Add credentials

The username is ubuntu, which is the login of the AWS EC2 VM. In the private key field, add the Jenkins master machine's private key. You can find the private key in the ~/.ssh/id_rsa file.

For example, if the Jenkins master is on Windows, the file is at C:\users\<UserName>\.ssh\id_rsa

Ubuntu Program

Next, set the Host Key Verification Strategy: select the option Known hosts file Verification Strategy. Click on Save.

Host Verification Key Strategy

Launch the node and you should see a successfully connected message.

Launch the node

Configure Docker Container as Jenkins Build Slave

To configure the docker container as a build slave we will need a machine with Docker installed to perform docker build, launch containers, and push to Docker Hub.

The Docker Remote API should be configured and enabled so that Jenkins can communicate with the machine on which Docker is installed. This should be done before configuring the agent.

I am using an AWS EC2 Ubuntu VM with Docker installed.

#1) Log in to the EC2 VM and open the docker service file /lib/systemd/system/docker.service

Search for ExecStart and replace that line with the following:

ExecStart=/usr/bin/dockerd -H tcp:// -H unix:///var/run/docker.sock
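Editing /lib/systemd/system/docker.service directly works, but that file can be overwritten when the docker package is upgraded. An alternative sketch is a systemd drop-in override; it assumes the conventional unencrypted API port 2376 used by the curl checks further down (exposing the API without TLS should only be done on a trusted network):

```shell
# create a drop-in that overrides only ExecStart
sudo mkdir -p /etc/systemd/system/docker.service.d
sudo tee /etc/systemd/system/docker.service.d/override.conf <<'EOF'
[Service]
ExecStart=
ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock
EOF
```

The empty `ExecStart=` line is required by systemd to reset the original value before setting a new one. After creating the drop-in, reload and restart the Docker service as shown in the next step.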

#2) Reload and restart the docker service

$ sudo systemctl daemon-reload

$ sudo service docker restart

#3) Check and Validate the API by executing the following curl commands

curl http://localhost:2376/version

curl http://<EC2VM-IP>:2376/version

Step #1: Go to Manage Jenkins -> Manage Nodes and Cloud

Manage Nodes and Cloud

Step #2: Click on Configure Clouds in the left panel

Manage Nodes

Step #3: Click on Add a new cloud -> Docker

Configure Clouds

Click on Docker Cloud details

Step #4: Add Name and URI

Configure CLouds Docker

You can use the “Test connection” to test if Jenkins is able to connect to the Docker host.

Step #5: Add Docker Agent templates

Click on Docker Agent templates

Docker Agent templates

Note the label provided. This will be provided later in the Jenkins job.

Click on Registry Authentication and enter Docker Hub credentials.

Docker Hub credentials

Select the Connect method -> Connect with SSH.

Connect with SSH

For the SSH key, select use configured SSH credentials. This was configured in the earlier section.

Click on Save.

Create Jenkins job

Create a Jenkins freestyle job and enter the label expression for the project to run and configure the GitHub repo.

Create Jenkins job1

Ensure the Git executables are configured under Manage Jenkins -> Global Tool Configuration

Global Tool Configuration

Next, add the build step Build/Publish Docker Image

  • Under Location for Dockerfile, enter a dot (.). The Dockerfile, which contains the commands to build the image, is at the root of the GitHub repository.
  • Under Cloud, select the Docker cloud that was added in the previous step.
  • Enter the image name.
  • Select the checkbox Push image and select the Registry credentials (Docker Hub credentials).

select the Registry credentials

Click on Save, then trigger a build.

Console Output

Once the build completes, verify in the Docker Hub registry that the image has been pushed.

Suggested Read =>> Jenkins Jobs, Types of Jenkins Jobs, Configuring SCM

Jenkins and Docker Compose Integration

Docker Compose is used to define and run multiple containers with a single command. Docker Compose files are written in YAML, and all the services (containers) can be started and stopped with the following commands:

docker-compose up -d   # start all services in detached mode

docker-compose down    # stop and remove the services

Install docker-compose

Use the following command:

sudo curl -L "$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose

The command instructs the system to save the file in the /usr/local/bin/ directory, under the name docker-compose. Make the downloaded file executable by changing the file permissions with:

sudo chmod +x /usr/local/bin/docker-compose

Run the command to check the version of docker compose

$ docker-compose --version

Run the Command

Dockerfile and docker-compose.yml


FROM centos
MAINTAINER Vasu Niranjan
RUN mkdir /opt/tomcat
WORKDIR /opt/tomcat
RUN curl -O 
RUN tar -xvzf apache-tomcat-9.0.71.tar.gz
RUN mv apache-tomcat-9.0.71/* /opt/tomcat/
# point yum at the baseurl instead of the retired mirrorlist entries
RUN sed -i 's/mirrorlist/#mirrorlist/g' /etc/yum.repos.d/CentOS-*
RUN sed -i 's|#baseurl=|baseurl=|g' /etc/yum.repos.d/CentOS-*
RUN yum update -y
RUN yum install java -y
RUN java -version
WORKDIR /opt/tomcat/webapps
ADD ["target/HelloWorld-Maven.war", "/opt/tomcat/webapps"]
CMD ["/opt/tomcat/bin/catalina.sh", "run"]

docker-compose.yml file

version: '3.8'
services:
  web:
    build: .
    ports:
      - "9000:8080"
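Assuming the Dockerfile above and this compose file sit in the same directory, the service can be exercised locally as a quick sanity check. The context path HelloWorld-Maven below is an assumption based on the WAR file name:

```shell
docker-compose up -d                              # build the image and start the service
docker-compose ps                                 # the tomcat container should be Up, publishing 9000->8080
curl -I http://localhost:9000/HelloWorld-Maven/   # assumed context path from the WAR name
docker-compose down                               # stop and remove the containers
```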

Jenkins Job Using Docker Compose Plugin

Let’s look at adding a Build step in Jenkins to call the docker-compose.yml file and start all the services defined in the file as part of the Docker Compose Command. The docker-compose.yml file is present in the root of the GitHub repository.

The rest of the freestyle job configuration for SCM definition remains the same as in the previous section.

Docker Compose Build Setup

Click on Save and trigger a build. Once done, look at the console output and the services running in the docker host.

Save and trigger a build

Run the docker ps command to look at the services as defined in the docker-compose.yml file.

Run docker ps

Also Read =>> Jenkins Security, Authentication, Authorization

Jenkins and Docker Swarm Stack Deploy

Docker Swarm is a container orchestration tool that helps in creating and deploying Docker nodes as a cluster. Each node of a swarm runs a Docker daemon, and the nodes interact using the Docker API.

In a swarm, containers are launched via services, where a service is a group of containers created from the same image. There should be at least one node in the swarm before any service can be deployed on it.

In Docker Swarm, there are 2 types of nodes:

  • Manager Node: Handles all of the cluster management tasks.
  • Worker Node: Receives and executes tasks from the manager node.
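Before a Jenkins job can deploy a stack, the cluster itself has to exist. Creating a 3-node swarm is typically a sketch like the following (IP addresses and tokens are placeholders):

```shell
# on the manager node
docker swarm init --advertise-addr <Manager-IP>

# the init output prints a 'docker swarm join --token ...' command;
# run it on each worker node, e.g.:
#   docker swarm join --token <worker-token> <Manager-IP>:2377

# back on the manager, verify that all three nodes have joined
docker node ls
```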

A service contains tasks that are executed on the manager and worker nodes. In this section, we will look at how Jenkins with docker stack deploy can be used to deploy multiple services (containers) across different machines. The services run as part of the stack can also be scaled across multiple replicas.

Docker stack makes use of a YAML file to deploy multiple services. In this example, I am using the below docker-compose.yml file and a 3 node cluster.

version: '3.8'
services:
  web:
    image: vniranjan1972/hworld_tomcat:V1
    ports:
      - "9000:8080"

In the YAML file, I am using a pre-built image, since the build option is not supported by docker stack deploy. The Dockerfile is the same as in the Docker Compose section above.

In Jenkins, use an Execute shell Build step and add the below commands:

docker login -u "vniranjan1972" -p "<Docker Hub Token>"   # authenticate to Docker Hub
docker build -t vniranjan1972/hworld_tomcat:V1 .          # build the image from the Dockerfile
docker push vniranjan1972/hworld_tomcat:V1                # publish it so every node can pull it
sleep 5
docker stack deploy -c docker-compose.yml HWorld          # deploy the stack named HWorld
docker service ls                                         # list the services in the stack
docker node ls                                            # list the nodes in the swarm
docker service scale HWorld_web=3                         # scale the web service to 3 replicas
docker service ps HWorld_web                              # show which node runs each replica

Save the Job and trigger a build. The job is run on the manager node.

Console Output


Ubuntu Program

As you can see, the service is deployed across the nodes in the cluster.


In this article, we have tried to show the simplicity of automating Docker builds, publishing the images to Docker Hub, running multiple containers as a single service on a single host using Docker Compose, and lastly running multiple containers on multiple hosts using Docker Swarm stack deploy.

Further Read => How to Download and Install Jenkins

We hope developers find it useful in their Docker build and deployment automation.