Day 2 - Jenkins Hands-On Guide: Learning Jenkins with 30+ Scenarios and End-to-End Project
Welcome to the Jenkins Hands-On Guide! 🎉 Today, we’ll dive deep into Jenkins scenarios, covering a wide range of use cases — from basic configurations to advanced pipelines, cloud integrations, and even an end-to-end CI/CD project that deploys an application on AWS. This guide will help anyone become proficient in Jenkins by following step-by-step instructions for each scenario.
By the end of this hands-on guide, you’ll be able to confidently handle Jenkins configurations, troubleshoot real-world issues, and deploy applications end-to-end using Jenkins.
Environment Setup for Hands-On Jenkins Scenarios
For these scenarios, we’ll be using an AWS EC2 instance as our Jenkins server. Make sure you have set up your environment as outlined in the previous section. The Jenkins server should be accessible at http://<EC2-Public-IP>:8080.
Beginner Level: 10 Scenarios to Get Started with Jenkins
Scenario 1: Installing Plugins in Jenkins
Objective: Install commonly used Jenkins plugins like Git, Docker Pipeline, and Blue Ocean.
Steps:
Go to Manage Jenkins > Manage Plugins.
Click on Available and search for the plugins: Git, Docker Pipeline, and Blue Ocean.
Select the plugins and click Install without restart.
Go to Installed and verify that the plugins are active.
Scenario 2: Creating a Simple Freestyle Job
Objective: Create a basic Freestyle job that prints a message to the console.
Steps:
Go to New Item.
Enter Simple-Job as the name and select Freestyle Project.
Under Build > Execute Shell, enter:
echo "Hello, Jenkins!"
Click Save and Build Now.
Check the Build History for the output.
Scenario 3: Scheduling a Jenkins Job Using Cron Expressions
Objective: Schedule a job to run every day at 6:00 AM using Jenkins’ build triggers.
Steps:
Go to Configure on the job you created.
Under Build Triggers, select Build periodically.
Enter the cron expression:
H 6 * * *
Save the configuration.
Verify that the job runs automatically at the specified time.
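For reference, a Jenkins cron expression has five fields: minute, hour, day of month, month, and day of week. The H token tells Jenkins to pick a stable hashed value for that field, spreading jobs out so they don't all fire at the same instant. A few illustrative patterns (these examples are a sketch, not from the scenario itself):

```
H 6 * * *      # once per day, at a hashed minute between 6:00 and 6:59
H/15 * * * *   # roughly every 15 minutes
H 2 * * 1-5    # once per weekday, around 2 AM
```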
Scenario 4: Using Git in a Jenkins Freestyle Job
Objective: Configure a Jenkins job to pull code from a GitHub repository.
Steps:
Go to Configure on a new or existing job.
Under Source Code Management, select Git.
Enter the repository URL: https://github.com/yourusername/repository.git
Save and run the build to check out the code from GitHub.
Scenario 5: Parameterized Jenkins Job
Objective: Create a job that takes user input (e.g., branch name) before execution.
Steps:
Go to New Item and select Freestyle Project.
Under This build is parameterized, add a String Parameter named BRANCH.
Under Source Code Management, use $BRANCH as the branch name.
Save and run the job, providing the branch name as input.
Scenario 6: Backup and Restore Jenkins Configuration
Objective: Backup Jenkins configurations and restore them to a new Jenkins instance.
Steps:
Go to the Jenkins home directory:
cd /var/lib/jenkins
Create a backup:
tar -czf jenkins_backup.tar.gz .
To restore, copy the backup file to a new Jenkins instance and extract it in the Jenkins home directory.
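The backup step above can be wrapped in a small script that produces timestamped archives, so older backups are never overwritten. A minimal sketch, assuming the default Jenkins home at /var/lib/jenkins (the function name backup_jenkins and the /backups destination are illustrative, not part of the original guide):

```shell
#!/bin/sh
# backup_jenkins SRC DEST: archive SRC into DEST as a timestamped tar.gz
# and print the path of the archive that was created.
backup_jenkins() {
    src="$1"
    dest="$2"
    stamp=$(date +%Y%m%d-%H%M%S)
    archive="${dest}/jenkins_backup_${stamp}.tar.gz"
    tar -czf "$archive" -C "$src" .
    echo "$archive"
}

# Typical usage on the Jenkins server (run as a user that can read the home dir):
# backup_jenkins /var/lib/jenkins /backups
```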
Scenario 7: Setting Up Jenkins Job to Poll SCM
Objective: Automatically build a job whenever there is a new commit in GitHub.
Steps:
Go to Configure on the job.
Under Build Triggers, select Poll SCM.
Enter:
H/5 * * * *
Save and make a new commit in the repository to trigger a build.
Scenario 8: Configuring Email Notifications in Jenkins
Objective: Configure Jenkins to send email notifications on build status.
Steps:
Install the Email Extension Plugin.
Go to Configure on the job and add Post-build Actions > Email Notification.
Enter your email address and SMTP settings.
Scenario 9: Jenkins Pipeline Job Using Declarative Syntax
Objective: Create a simple pipeline job that builds and deploys a sample project.
Steps:
Go to New Item and select Pipeline.
Under Pipeline Definition, enter:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building the project...'
            }
        }
        stage('Test') {
            steps {
                echo 'Running tests...'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying the project...'
            }
        }
    }
}
Save and run the pipeline to see the staged execution.
Scenario 10: Jenkins Job to Archive Artifacts
Objective: Archive build artifacts and make them available for download.
Steps:
Go to Configure on the job.
Under Post-build Actions, select Archive the artifacts.
Specify the files to archive (e.g., *.jar, *.war).
Save and build the job.
Verify that the artifacts are available in the build history.
Intermediate Level: Jenkins Hands-On Scenarios
Scenario 11: Jenkins Multibranch Pipeline Setup
Objective: Set up a Jenkins Multibranch Pipeline job to automatically build multiple branches in a GitHub repository and detect new branches dynamically.
Steps:
Create a GitHub Repository with Multiple Branches:
Create a repository (e.g., multibranch-pipeline-demo) and add multiple branches (main, develop, feature1, feature2).
Add a Jenkinsfile to each branch defining the pipeline stages.
Create a Multibranch Pipeline Job in Jenkins:
Go to New Item and select Multibranch Pipeline.
Enter the project name as Multibranch-Demo.
Under Branch Sources, add the GitHub repository URL.
Add credentials if needed, click Save, and then click Scan Repository.
Verify Automatic Branch Detection:
Jenkins will automatically scan for all branches containing a Jenkinsfile and create separate jobs for each branch.
Go to the Jenkins dashboard and view the builds for each branch.
Create a New Branch in GitHub and Verify:
Create a new branch (feature3) in GitHub.
Jenkins should automatically detect the new branch and build it.
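The scenario assumes a Jenkinsfile in each branch but does not show one. A minimal declarative sketch that works for this setup (the echoed message is illustrative; BRANCH_NAME is provided automatically by multibranch jobs):

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // env.BRANCH_NAME is set automatically for multibranch pipeline jobs
                echo "Building branch: ${env.BRANCH_NAME}"
            }
        }
    }
}
```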
Scenario 12: Setting Up a Jenkins Shared Library
Objective: Use Jenkins Shared Libraries to define reusable pipeline code for multiple projects.
Steps:
Create a Shared Library Repository:
Create a new GitHub repository named jenkins-shared-library.
Inside the repo, create the folder structure: vars/ and src/.
Add a file named vars/common.groovy with a simple reusable function:

def sayHello(name) {
    echo "Hello, ${name}"
}
Configure the Shared Library in Jenkins:
Go to Manage Jenkins > Configure System.
Scroll to Global Pipeline Libraries and add a new library.
Enter the name (e.g., shared-lib), select Modern SCM, and provide the GitHub repository URL.
Use the Shared Library in a Jenkinsfile:
Create a new project repository and add the following Jenkinsfile:

@Library('shared-lib') _
pipeline {
    agent any
    stages {
        stage('Test Shared Library') {
            steps {
                common.sayHello('DevOps Engineer')
            }
        }
    }
}
Run the Pipeline:
Create a new Pipeline Job in Jenkins and run the pipeline.
Verify that the shared library function is called successfully.
Scenario 13: Deploying a Dockerized Application Using Jenkins
Objective: Build a Docker image using Jenkins and push it to DockerHub.
Steps:
Create a Simple Node.js Application:
Clone the DevOps-Zero-to-Hero repository.
Navigate to the docker-app directory.
Create a Dockerfile:
Create a file named Dockerfile with the following content:

FROM node:14
WORKDIR /app
COPY . .
RUN npm install
CMD ["node", "app.js"]
Create a Jenkins Pipeline for Docker Build:
Use the following Jenkinsfile:

pipeline {
    agent any
    environment {
        DOCKER_IMAGE = "mydockerrepo/app:latest"
    }
    stages {
        stage('Checkout') {
            steps {
                git 'https://github.com/yourusername/docker-app.git'
            }
        }
        stage('Build Docker Image') {
            steps {
                script {
                    docker.build("${DOCKER_IMAGE}")
                }
            }
        }
        stage('Push to DockerHub') {
            steps {
                script {
                    docker.withRegistry('https://registry.hub.docker.com', 'dockerhub-credentials') {
                        docker.image("${DOCKER_IMAGE}").push()
                    }
                }
            }
        }
    }
}
Run the Jenkins Pipeline:
Create a new Pipeline job, paste the Jenkinsfile script, and build the pipeline.
Verify that the Docker image is pushed to DockerHub.
Scenario 14: Blue-Green Deployment Using Jenkins Pipelines
Objective: Implement Blue-Green Deployment using Jenkins for zero-downtime releases.
Steps:
Set Up Two Separate Kubernetes Environments:
- Create two Kubernetes namespaces (blue and green).
Create Kubernetes YAML Files:
blue-deployment.yaml (a spec.selector is required by apps/v1 and has been added):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: blue-deployment
  namespace: blue
spec:
  replicas: 2
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
        - name: app-container
          image: mydockerrepo/app:blue
          ports:
            - containerPort: 8080
green-deployment.yaml:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: green-deployment
  namespace: green
spec:
  replicas: 2
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
        - name: app-container
          image: mydockerrepo/app:green
          ports:
            - containerPort: 8080
Create a Jenkins Pipeline for Blue-Green Deployment:
pipeline {
    agent any
    stages {
        stage('Deploy to Blue') {
            steps {
                script {
                    sh 'kubectl apply -f blue-deployment.yaml'
                }
            }
        }
        stage('Switch Traffic to Green') {
            steps {
                script {
                    sh 'kubectl apply -f green-deployment.yaml'
                }
            }
        }
    }
}
Test and Verify the Deployment:
- Verify that the application is first deployed to the blue environment and that traffic then switches to the green environment.
Scenario 15: Jenkins Integration with Ansible
Objective: Use Jenkins to automate infrastructure configuration and deployments using Ansible.
Steps:
Set Up Ansible on the Jenkins Server:
Install Ansible on the Jenkins server:
sudo yum install ansible -y
Create an Ansible Playbook:
Create a playbook (deploy.yaml) to install Nginx on a remote server:

---
- hosts: webserver
  become: yes
  tasks:
    - name: Install Nginx
      yum:
        name: nginx
        state: present
Create a Jenkins Job for Ansible Playbook Execution:
Go to New Item and select Freestyle Project.
Under Build Environment, check Provide Node & Label Parameter Plugin.
Under Build, add an Execute Shell step with the following command:
ansible-playbook -i /path/to/inventory deploy.yaml
Run the Jenkins Job:
Create an inventory file listing the remote server IP addresses under [webserver].
Run the Jenkins job and verify that Nginx is installed on the target servers.
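For reference, a minimal inventory file for this scenario might look like the following (the IP addresses and SSH user are placeholders, not from the guide):

```ini
[webserver]
192.0.2.10 ansible_user=ec2-user
192.0.2.11 ansible_user=ec2-user
```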
Verify the Configuration:
Check the remote server by running:
curl http://<server-ip>
The Nginx welcome page should be displayed.
Scenario 16: Using Jenkins with Kubernetes for Deployment
Objective: Use Jenkins to deploy a Dockerized application to a Kubernetes cluster.
Steps:
Set Up a Kubernetes Cluster:
- Create a Kubernetes cluster using Minikube or AWS EKS.
Create a Jenkins Pipeline for Kubernetes Deployment:
Create a Jenkinsfile for deploying the application:

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                git 'https://github.com/yourusername/app-repo.git'
            }
        }
        stage('Build Docker Image') {
            steps {
                script {
                    docker.build("mydockerrepo/app:latest")
                }
            }
        }
        stage('Push to DockerHub') {
            steps {
                script {
                    docker.withRegistry('https://registry.hub.docker.com', 'dockerhub-credentials') {
                        docker.image("mydockerrepo/app:latest").push()
                    }
                }
            }
        }
        stage('Deploy to Kubernetes') {
            steps {
                script {
                    kubernetesDeploy(configs: 'k8s-deployment.yaml', kubeConfig: '<kubeconfig>')
                }
            }
        }
    }
}
Create a Kubernetes Deployment YAML File:
k8s-deployment.yaml:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp-deployment
spec:
  replicas: 3
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
        - name: app-container
          image: mydockerrepo/app:latest
          ports:
            - containerPort: 8080
Run the Jenkins Pipeline:
Jenkins will build the Docker image, push it to DockerHub, and deploy it to the Kubernetes cluster.
Verify the deployment using:
kubectl get pods
Check the Application:
- Access the application using the service IP or Ingress.
Scenario 17: Jenkins Backup and Restore Using S3
Objective: Automate Jenkins backup and restore using AWS S3 for disaster recovery.
Steps:
Install the AWS CLI on Jenkins:
sudo yum install aws-cli -y
Create an S3 Bucket for Backups:
- Go to the AWS Console and create an S3 bucket (e.g., jenkins-backup-bucket).
Create a Jenkins Job for S3 Backup:
Go to New Item and select Freestyle Project.
Under Build, add an Execute Shell step:
aws s3 sync /var/lib/jenkins s3://jenkins-backup-bucket
Create a Restore Job in Jenkins:
Go to New Item and create another Freestyle Project.
Under Build, add the following command:
aws s3 sync s3://jenkins-backup-bucket /var/lib/jenkins
Verify Backup and Restore:
- Make changes in Jenkins, run the backup job, delete the changes, and restore using the restore job.
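The same sync can also run on a schedule outside Jenkins. A hedged sketch of a crontab entry that keeps date-stamped copies (the bucket name matches the scenario; the 2 AM schedule and date-based prefix are illustrative, and note that % must be escaped in crontab):

```
# m h dom mon dow  command
0 2 * * * aws s3 sync /var/lib/jenkins s3://jenkins-backup-bucket/$(date +\%F)
```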
Scenario 18: Jenkins Integration with Terraform for IaC
Objective: Use Jenkins to automate infrastructure provisioning using Terraform.
Steps:
Install Terraform on the Jenkins Server:
sudo yum install -y yum-utils
sudo yum-config-manager --add-repo https://rpm.releases.hashicorp.com/AmazonLinux/hashicorp.repo
sudo yum install terraform -y
Create a Simple Terraform Configuration:
provider "aws" {
  region = "us-east-1"
}

resource "aws_s3_bucket" "jenkins-bucket" {
  bucket = "jenkins-demo-bucket"
  acl    = "private"
}
Create a Jenkins Pipeline for Terraform:
Create a Jenkinsfile:

pipeline {
    agent any
    stages {
        stage('Terraform Init') {
            steps {
                sh 'terraform init'
            }
        }
        stage('Terraform Plan') {
            steps {
                sh 'terraform plan -out=plan.out'
            }
        }
        stage('Terraform Apply') {
            steps {
                sh 'terraform apply plan.out'
            }
        }
    }
}
Run the Jenkins Pipeline:
- Jenkins will initialize Terraform, create a plan, and apply it to provision the S3 bucket.
Verify the Infrastructure:
- Go to the AWS Console and check that the S3 bucket is created.
Scenario 19: Running Jenkins on Docker
Objective: Run Jenkins in a Docker container to simplify environment management.
Steps:
Install Docker on the Server:
sudo yum install docker -y
sudo service docker start
Pull the Jenkins Docker Image:
docker pull jenkins/jenkins:lts
Run Jenkins in a Docker Container:
docker run -p 8080:8080 -p 50000:50000 --name myjenkins -d jenkins/jenkins:lts
Access Jenkins:
- Go to http://<Server-IP>:8080 and complete the initial setup.
Persist Jenkins Data:
Create a volume to persist Jenkins data:
docker run -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home --name myjenkins -d jenkins/jenkins:lts
Verify Jenkins is Running:
- Check the container logs and ensure Jenkins is up and running.
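The docker run command with a named volume can equivalently be captured in a Compose file, which is easier to version-control. A sketch assuming Docker Compose is installed (the file mirrors the flags used above):

```yaml
# docker-compose.yml
services:
  jenkins:
    image: jenkins/jenkins:lts
    container_name: myjenkins
    ports:
      - "8080:8080"
      - "50000:50000"
    volumes:
      - jenkins_home:/var/jenkins_home

volumes:
  jenkins_home:
```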
Advanced Jenkins Hands-On Scenarios
Scenario 20: Setting Up Jenkins on a Kubernetes Cluster
Objective: Deploy Jenkins on a Kubernetes cluster and configure Jenkins agents for distributed builds.
Steps:
Set Up a Kubernetes Cluster:
- Use Minikube, K3s, or a managed service like AWS EKS or GKE.
Create a Namespace for Jenkins:
kubectl create namespace jenkins
Create a Persistent Volume and Persistent Volume Claim:
jenkins-volume.yaml (note: a PersistentVolume is cluster-scoped, so it takes no namespace):

apiVersion: v1
kind: PersistentVolume
metadata:
  name: jenkins-pv
spec:
  capacity:
    storage: 5Gi
  accessModes:
    - ReadWriteOnce
  hostPath:
    path: "/data/jenkins-volume/"
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: jenkins-pvc
  namespace: jenkins
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 5Gi
Create the PV and PVC:
kubectl apply -f jenkins-volume.yaml
Create a Jenkins Deployment YAML File:
jenkins-deployment.yaml:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: jenkins-deployment
  namespace: jenkins
spec:
  replicas: 1
  selector:
    matchLabels:
      app: jenkins
  template:
    metadata:
      labels:
        app: jenkins
    spec:
      containers:
        - name: jenkins
          image: jenkins/jenkins:lts
          ports:
            - containerPort: 8080
          volumeMounts:
            - name: jenkins-data
              mountPath: /var/jenkins_home
      volumes:
        - name: jenkins-data
          persistentVolumeClaim:
            claimName: jenkins-pvc
Create the Jenkins deployment:
kubectl apply -f jenkins-deployment.yaml
Create a Jenkins Service for Access:
jenkins-service.yaml:

apiVersion: v1
kind: Service
metadata:
  name: jenkins-service
  namespace: jenkins
spec:
  type: NodePort
  ports:
    - port: 8080
      targetPort: 8080
      nodePort: 30000
  selector:
    app: jenkins
Create the service:
kubectl apply -f jenkins-service.yaml
Access Jenkins:
- Go to http://<Kubernetes-Node-IP>:30000 to access Jenkins.
Configure Jenkins Kubernetes Plugin:
Go to Manage Jenkins > Configure System.
Add the Kubernetes plugin and configure the Kubernetes cloud settings to allow Jenkins to spin up dynamic agents on the cluster.
Verify Jenkins Builds with Kubernetes Agents:
Create a new pipeline job and configure it to use the Kubernetes agent.
Run the job and verify that Jenkins dynamically creates a new pod for the build.
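With the Kubernetes plugin configured, a pipeline can request a dynamic pod agent declaratively. A minimal sketch (the container image and inline pod spec are illustrative assumptions):

```groovy
pipeline {
    agent {
        kubernetes {
            // Inline pod template; the plugin adds the jnlp container automatically
            yaml '''
apiVersion: v1
kind: Pod
spec:
  containers:
    - name: build
      image: maven:3.8-openjdk-11
      command: ['sleep']
      args: ['infinity']
'''
        }
    }
    stages {
        stage('Build') {
            steps {
                container('build') {
                    sh 'mvn -version'
                }
            }
        }
    }
}
```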
Scenario 21: Implementing Jenkins as Code Using Job DSL
Objective: Use the Jenkins Job DSL plugin to define jobs as code, making it easier to manage and version control Jenkins jobs.
Steps:
Install the Job DSL Plugin:
- Go to Manage Jenkins > Manage Plugins and install the Job DSL Plugin.
Create a New Freestyle Job:
Go to New Item and create a Freestyle Project named Job-DSL-Example.
Under Build, add a Process Job DSLs step.
Enter the following DSL script:
job('example-job') {
    description('This is an example job created using Job DSL')
    scm {
        git('https://github.com/yourusername/your-repo.git')
    }
    triggers {
        scm('H/15 * * * *')
    }
    steps {
        shell('echo "Building the project..."')
    }
}
Run the Job:
Run the Job-DSL-Example job.
Go back to the Jenkins dashboard and verify that a new job named example-job has been created.
Verify the Job Configuration:
Go to example-job and check the SCM configuration and build steps.
Modify the DSL script, re-run the job, and observe changes in the created job.
Scenario 22: Jenkins Pipeline for Building and Deploying a Java Web Application
Objective: Create a Jenkins pipeline to build, test, and deploy a Java web application using Maven.
Steps:
Set Up a Sample Java Application:
- Use the Spring PetClinic repository as the sample application.
Create a Jenkinsfile for the Maven Build:

pipeline {
    agent any
    tools {
        maven 'Maven 3.6.3'
    }
    stages {
        stage('Checkout') {
            steps {
                git 'https://github.com/spring-projects/spring-petclinic.git'
            }
        }
        stage('Build') {
            steps {
                sh 'mvn clean package'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying application...'
                // Deployment steps can be added here
            }
        }
    }
}
Create a New Pipeline Job in Jenkins:
Go to New Item and create a Pipeline Project.
Enter the Jenkinsfile content and run the job.
Verify the Build and Test Results:
Check the console output for the mvn test results.
Validate the packaged .jar file in the target directory (Spring PetClinic is a Spring Boot project and produces a .jar).
Add Deployment Steps:
- Extend the Jenkinsfile to include deployment to Tomcat or a cloud provider.
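As one concrete option for the Deploy stage, a packaged WAR can be pushed to a Tomcat instance through its manager text API. A hedged sketch (the Tomcat host, the tomcat-creds credentials ID, and the artifact name are assumptions; the Tomcat user must have the manager-script role):

```groovy
stage('Deploy to Tomcat') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'tomcat-creds',
                                          usernameVariable: 'TC_USER',
                                          passwordVariable: 'TC_PASS')]) {
            // Upload the packaged artifact via Tomcat's manager text API
            sh '''
              curl -u "$TC_USER:$TC_PASS" \
                   --upload-file target/app.war \
                   "http://tomcat-host:8080/manager/text/deploy?path=/app&update=true"
            '''
        }
    }
}
```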
Scenario 23: Using Jenkins Groovy Scripting for Custom Pipelines
Objective: Use Groovy scripting to create complex Jenkins pipelines with conditional logic and shared variables.
Steps:
Create a Groovy Script in Jenkinsfile:
Use a Jenkinsfile with Groovy scripting:

def notifySlack(buildStatus) {
    if (buildStatus == 'SUCCESS') {
        echo 'Build was successful'
    } else {
        echo 'Build failed'
    }
}

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                script {
                    git 'https://github.com/yourusername/sample-repo.git'
                }
            }
        }
        stage('Build') {
            steps {
                script {
                    def buildResult = sh(script: 'mvn clean package', returnStatus: true)
                    if (buildResult != 0) {
                        currentBuild.result = 'FAILURE'
                    }
                }
            }
        }
    }
    post {
        always {
            script {
                notifySlack(currentBuild.result)
            }
        }
    }
}
Run the Jenkins Job:
- Verify the build results and custom notification logic.
Scenario 24: Jenkins with AWS CodeBuild and CodeDeploy
Objective: Use Jenkins to trigger AWS CodeBuild for building the application and AWS CodeDeploy for deploying it to EC2 instances.
Steps:
Create a Buildspec File for AWS CodeBuild:
Create a buildspec.yml file in your project root:

version: 0.2
phases:
  install:
    runtime-versions:
      java: corretto11
  build:
    commands:
      - echo "Building the application"
      - mvn clean package
artifacts:
  files:
    - target/*.jar
Create a Jenkins Pipeline with AWS Integration:
Use the following Jenkinsfile:

pipeline {
    agent any
    stages {
        stage('Trigger CodeBuild') {
            steps {
                script {
                    awsCodeBuild projectName: 'codebuild-project-name'
                }
            }
        }
        stage('Deploy to EC2 Using CodeDeploy') {
            steps {
                script {
                    awsCodeDeploy applicationName: 'CodeDeployApp',
                                  deploymentGroupName: 'CodeDeployGroup',
                                  s3Location: 's3://my-bucket/app-latest.zip'
                }
            }
        }
    }
}
Verify the Build and Deployment:
Check the AWS CodeBuild console to verify that the build is triggered.
Go to the CodeDeploy console to view deployment status.
Test the Application on EC2:
- SSH into your EC2 instance and verify that the application is deployed.
Scenario 25: Jenkins Integration with AWS ECS and EKS
Objective: Use Jenkins to build a Docker image, push it to Amazon ECR, and deploy the application to AWS ECS and EKS.
Steps:
Create an ECR Repository:
- Go to the AWS Console and create a repository named my-app-repo.
Create a Jenkinsfile for Docker Build and Push:
pipeline {
    agent any
    environment {
        AWS_ACCOUNT_ID = 'your-aws-account-id'
        ECR_REPO_NAME  = 'my-app-repo'
        IMAGE_TAG      = 'latest'
        REGION         = 'us-east-1'
    }
    stages {
        stage('Docker Build') {
            steps {
                script {
                    docker.build("${AWS_ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/${ECR_REPO_NAME}:${IMAGE_TAG}")
                }
            }
        }
        stage('Docker Push') {
            steps {
                script {
                    docker.withRegistry("https://${AWS_ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com") {
                        docker.image("${AWS_ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/${ECR_REPO_NAME}:${IMAGE_TAG}").push()
                    }
                }
            }
        }
        stage('Deploy to EKS') {
            steps {
                sh 'kubectl apply -f k8s-deployment.yaml'
            }
        }
    }
}
Create a Kubernetes Deployment YAML File:
k8s-deployment.yaml (kubectl does not expand the ${...} placeholders; substitute them, e.g. with envsubst, before applying):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp-deployment
spec:
  replicas: 2
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
        - name: app-container
          image: ${AWS_ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/${ECR_REPO_NAME}:${IMAGE_TAG}
          ports:
            - containerPort: 8080
Run the Jenkins Pipeline:
- Jenkins will build the Docker image, push it to ECR, and deploy it to the EKS cluster.
Access the Application:
- Use the Kubernetes service or ingress to access the deployed application.
Scenario 26: Implementing Jenkins HA (High Availability) Using Kubernetes
Objective: Set up Jenkins in a High Availability (HA) configuration using Kubernetes to ensure fault tolerance and scalability.
Steps:
Set Up a Shared File System for Jenkins Home Directory:
- Use AWS EFS, GCP Filestore, or NFS as the shared storage for the Jenkins home directory.
Create a Jenkins Master and Agent Deployment:
jenkins-master-deployment.yaml:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: jenkins-master
  namespace: jenkins
spec:
  replicas: 2
  selector:
    matchLabels:
      app: jenkins
      role: master
  template:
    metadata:
      labels:
        app: jenkins
        role: master
    spec:
      containers:
        - name: jenkins
          image: jenkins/jenkins:lts
          ports:
            - containerPort: 8080
          volumeMounts:
            - name: jenkins-home
              mountPath: /var/jenkins_home
      volumes:
        - name: jenkins-home
          persistentVolumeClaim:
            claimName: jenkins-pvc
Create a Jenkins Agent Deployment:
jenkins-agent-deployment.yaml:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: jenkins-agent
  namespace: jenkins
spec:
  replicas: 3
  selector:
    matchLabels:
      app: jenkins
      role: agent
  template:
    metadata:
      labels:
        app: jenkins
        role: agent
    spec:
      containers:
        - name: jnlp
          image: jenkins/inbound-agent:latest
          args:
            - -url
            - "http://jenkins-master.jenkins.svc.cluster.local:8080"
            - ${JENKINS_SECRET}
            - ${JENKINS_NAME}
Set Up Load Balancer for Jenkins:
Create a service for the Jenkins master with type LoadBalancer:

apiVersion: v1
kind: Service
metadata:
  name: jenkins-service
  namespace: jenkins
spec:
  type: LoadBalancer
  ports:
    - port: 8080
      targetPort: 8080
  selector:
    app: jenkins
    role: master
Access Jenkins and Verify HA:
Check the load balancer URL and observe Jenkins' high availability configuration.
Bring down one Jenkins master pod to verify failover.
Scenario 27: Jenkins Parallel and Sequential Stages with Advanced Groovy Scripting
Objective: Use advanced Groovy scripting to implement parallel and sequential stages in a single Jenkins pipeline.
Steps:
Create a Jenkinsfile with Parallel and Sequential Stages:
pipeline {
    agent any
    stages {
        stage('Build and Test in Parallel') {
            parallel {
                stage('Build') {
                    steps {
                        echo 'Building...'
                        sh 'mvn clean install'
                    }
                }
                stage('Unit Test') {
                    steps {
                        echo 'Running Unit Tests...'
                        sh 'mvn test'
                    }
                }
            }
        }
        stage('Integration Testing') {
            steps {
                echo 'Running Integration Tests...'
                sh 'mvn verify'
            }
        }
    }
}
Run the Jenkins Pipeline:
- Verify that the build and test stages are executed in parallel, while the integration test runs sequentially afterward.
Visualize the Pipeline:
- Use Jenkins Blue Ocean to visualize the parallel and sequential stages.
Let's build a complete end-to-end project using Jenkins, integrating various tools like Git, Docker, Kubernetes, and AWS. This will cover an entire CI/CD pipeline from code commit to deployment, ensuring zero downtime with advanced deployment strategies.
Project Overview:
We’ll build a Node.js Application, use Jenkins to create a CI/CD pipeline, integrate with Docker for containerization, use Kubernetes for orchestration, and deploy the final application to an AWS EKS Cluster.
Step-by-Step Guide:
1. Create a Sample Node.js Application
Create a GitHub Repository:
Name the repository complete-ci-cd-pipeline.
Initialize it with a README and a .gitignore file for Node.js.
Create the Application Code:
Clone the repository locally:
git clone https://github.com/yourusername/complete-ci-cd-pipeline.git
Navigate to the cloned repository:
cd complete-ci-cd-pipeline
Create a new file named app.js:

const http = require('http');
const port = process.env.PORT || 3000;

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('Hello, World! This is a CI/CD Pipeline project using Jenkins, Docker, and Kubernetes.');
});

server.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
Create a package.json file:

{
  "name": "complete-ci-cd-pipeline",
  "version": "1.0.0",
  "description": "Node.js project for CI/CD Pipeline",
  "main": "app.js",
  "scripts": {
    "start": "node app.js"
  },
  "author": "Your Name",
  "license": "ISC",
  "dependencies": {
    "express": "^4.17.1"
  }
}
Add and commit the changes:
git add .
git commit -m "Initial commit with Node.js application"
git push origin main
2. Create a Dockerfile for Containerization
Create a new file named Dockerfile in the project root:

# Use the official Node.js base image
FROM node:14

# Set the working directory inside the container
WORKDIR /app

# Copy the package.json and install dependencies
COPY package.json .
RUN npm install

# Copy the rest of the application code
COPY . .

# Expose the application port
EXPOSE 3000

# Start the application
CMD ["npm", "start"]
Test the Docker Build Locally:
Run the following commands to build and test the Docker image locally:
docker build -t complete-ci-cd-pipeline .
docker run -p 3000:3000 complete-ci-cd-pipeline
Open your browser and go to http://localhost:3000. You should see the message:
Hello, World! This is a CI/CD Pipeline project using Jenkins, Docker, and Kubernetes.
3. Create a Jenkins Pipeline for CI/CD
Create a Jenkinsfile in the project root with the following content:

pipeline {
    agent any
    environment {
        DOCKERHUB_CREDENTIALS = credentials('dockerhub-credentials')
        ECR_REPO_NAME  = 'complete-ci-cd-pipeline'
        AWS_ACCOUNT_ID = 'your-aws-account-id'
        REGION         = 'us-east-1'
        IMAGE_TAG      = 'latest'
    }
    stages {
        stage('Checkout') {
            steps {
                git 'https://github.com/yourusername/complete-ci-cd-pipeline.git'
            }
        }
        stage('Build Docker Image') {
            steps {
                script {
                    docker.build("${AWS_ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/${ECR_REPO_NAME}:${IMAGE_TAG}")
                }
            }
        }
        stage('Push Docker Image') {
            steps {
                script {
                    docker.withRegistry("https://${AWS_ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com") {
                        docker.image("${AWS_ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/${ECR_REPO_NAME}:${IMAGE_TAG}").push()
                    }
                }
            }
        }
        stage('Deploy to Kubernetes') {
            steps {
                kubernetesDeploy(configs: 'k8s/deployment.yaml', kubeConfig: '<kubeconfig-path>')
            }
        }
    }
    post {
        success {
            echo 'CI/CD Pipeline executed successfully!'
        }
        failure {
            echo 'Pipeline failed. Please check the logs.'
        }
    }
}
Create a deployment.yaml File for Kubernetes Deployment:
Create a folder named k8s and add a deployment.yaml file with the following content (substitute the ${...} placeholders before applying):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: ci-cd-deployment
spec:
  replicas: 3
  selector:
    matchLabels:
      app: ci-cd-app
  template:
    metadata:
      labels:
        app: ci-cd-app
    spec:
      containers:
        - name: ci-cd-app
          image: ${AWS_ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/${ECR_REPO_NAME}:${IMAGE_TAG}
          ports:
            - containerPort: 3000
Set Up Jenkins:
Create a Pipeline Job in Jenkins and link it to the Jenkinsfile in the GitHub repository.
Make sure you have the Docker and Kubernetes plugins installed in Jenkins.
Run the Jenkins Pipeline:
Run the pipeline and watch the stages execute: Checkout, Build Docker Image, Push Docker Image, and Deploy to Kubernetes.
Jenkins will build the Docker image, push it to AWS ECR, and deploy the application to your Kubernetes cluster.
Verify the Deployment:
Use the following commands to verify that the application is running in your Kubernetes cluster:
kubectl get deployments
kubectl get pods
Access the application using the service IP or Ingress.
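The project defines a Deployment but no Service, so there is nothing to route traffic to the pods yet. A sketch of a Service that exposes the app (the file name k8s/service.yaml and the service name are assumptions; the selector matches the labels in deployment.yaml):

```yaml
# k8s/service.yaml (illustrative)
apiVersion: v1
kind: Service
metadata:
  name: ci-cd-service
spec:
  type: LoadBalancer   # suits EKS; use NodePort for local clusters
  selector:
    app: ci-cd-app
  ports:
    - port: 80
      targetPort: 3000
```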
4. Implement Blue-Green Deployment with Jenkins
Extend the Kubernetes Deployment YAML File:
- Add configurations for blue and green deployments using Kubernetes services.
Modify the Jenkinsfile to Handle Blue-Green Strategy:
- Use environment variables to determine which environment is active and switch between blue and green deployments.
Add Steps in Jenkinsfile for Traffic Switching:
- Use Jenkins to automate traffic switching between the environments, ensuring zero-downtime deployments.
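One common way to implement the switch is to keep a single Service and repoint its label selector from the blue deployment to the green one. A hedged sketch of such a stage (the service name and version label are illustrative assumptions):

```groovy
stage('Switch Traffic') {
    steps {
        // Repoint the shared Service's selector from blue to green;
        // existing connections drain while new traffic hits green pods
        sh '''
          kubectl patch service ci-cd-service \
            -p '{"spec":{"selector":{"app":"ci-cd-app","version":"green"}}}'
        '''
    }
}
```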
5. Conclusion: Jenkins End-to-End Project
Congratulations! 🎉 You’ve built an end-to-end CI/CD pipeline using Jenkins that involves building a Docker image, pushing it to AWS ECR, and deploying it to a Kubernetes cluster using Jenkins pipelines.
This project covers essential DevOps practices and showcases how Jenkins integrates with Docker, Kubernetes, and AWS to create a complete CI/CD workflow.
For more insights and updates, connect with me on LinkedIn.
Day 2 Conclusion
Congratulations on completing Day 2 of our 15-Day DevOps Interview Preparation Series! Today, we covered a comprehensive set of Jenkins scenarios, starting from beginner to advanced topics. We explored Jenkins pipelines, integration with Docker, Kubernetes, AWS services, and more complex use cases.
Key Takeaways:
Basic Configurations: Plugin management, backup, scheduling jobs, and freestyle projects.
Intermediate Scenarios: Multibranch pipelines, Blue-Green deployments, Jenkins shared libraries.
Advanced Use Cases: AWS integrations, Kubernetes HA, Groovy scripting, and parallel pipelines.
In Day 3, we’ll dive into Ansible — exploring scenarios for configuration management, orchestration, and automating cloud environments!