Follow Agile with Jenkins CI/CD on AWS


With the advancement of cloud computing, the demand for DevOps has increased in parallel, and the Agile software development process has strengthened its footprint. Agile breaks a project into several stages, involving constant collaboration with stakeholders and continuous improvement and iteration at every stage. To make this process faster, the demand for automated Jenkins CI/CD pipelines grows day by day; manual testing is error-prone and eventually slows the process down. In this article, we will shed some light on Jenkins CI/CD on AWS.

But, why Jenkins?

Here are the reasons why you should use a Jenkins pipeline:

Jenkins is one of the most popular open-source CI tools on the market. It supports multiple SCMs and many other third-party applications through its plugins. A Jenkins pipeline is implemented as code, which allows multiple users to edit and execute the pipeline process. Pipelines are also robust: if your server undergoes an unforeseen restart, the pipeline will automatically resume.

Jenkins Pipeline (or simply “Pipeline”) is a suite of plugins which supports implementing and integrating continuous delivery pipelines into Jenkins. A continuous delivery pipeline is an automated expression of your process for getting software from version control right through to your users and customers.

Jenkins works well on AWS and with AWS: it is available on the AWS Marketplace, widely documented, and well integrated. Additionally, Jenkins plugins are available for a number of AWS services.

What is Jenkins Pipeline?

Let’s first understand what a Jenkins Pipeline is. In simple words, a pipeline is a set of actions that Jenkins executes in order to test, build, and deploy your application to your desired environment. You can define stages per branch, run commands in parallel, define environment variables, and much more.
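To make that concrete, here is a minimal declarative Pipeline sketch. The stage names, environment variable, and shell commands are placeholders for illustration, not part of any real project:

```groovy
// Minimal declarative Pipeline: one agent, an environment variable,
// and two stages executed in order.
pipeline {
  agent any
  environment {
    APP_ENV = 'staging'   // hypothetical variable
  }
  stages {
    stage('Build') {
      steps {
        sh 'echo building in $APP_ENV'
      }
    }
    stage('Test') {
      steps {
        sh 'echo running tests'
      }
    }
  }
}
```

The full Jenkinsfile later in this article follows the same structure, just with more stages and parallel branches.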

Jenkins on AWS…

In this article, we will see three simple examples of Jenkins CI/CD on AWS.

Deploying a static web application into AWS S3.

Traditional deployment on top of Amazon EC2.

Containerized deployment that leverages AWS ECS.

Deploying a static web application into AWS S3.

In order to deploy the static web application to AWS S3, we first need to set up the build environment. I will use my local Linux desktop as the build server. If you prefer, you can launch an EC2 instance and make it your Jenkins server. To set that up, you may want to follow my article “AWS EC2 – All that you need to know for a good start with AWS”, which will assist you in creating an EC2 instance. Once your instance is created, the steps are the same as mine.

Connect to your instance and run the commands below to install and configure Jenkins. We will also install the required packages during this setup. If you are using your local system as the Jenkins server, like me, you should follow the same steps.

  • Update your system and repositories
$ sudo yum update -y
$ sudo yum install -y docker git
  • Configure yum for Jenkins and install the Jenkins package
$ sudo wget -O /etc/yum.repos.d/jenkins.repo https://pkg.jenkins.io/redhat-stable/jenkins.repo
$ sudo rpm --import https://pkg.jenkins.io/redhat-stable/jenkins.io.key
$ sudo yum install jenkins -y
  • Add the jenkins user to the root group. To allow Jenkins to build your Docker images, you also need to add the jenkins user to the docker group:
$ sudo usermod -aG root jenkins 
$ sudo usermod -aG docker $USER
$ sudo usermod -aG docker jenkins
  • Add the Jenkins and Docker services to start on boot and start both services:
# Add Jenkins and Docker to startup
$ sudo chkconfig jenkins on
$ sudo chkconfig docker on
$ sudo chkconfig --list | grep -i jenkins
$ sudo chkconfig --list | grep -i docker

# Start Jenkins and Docker as a service
$ sudo service jenkins start
$ sudo service docker start

Configure Jenkins to run on a non-standard port. In my case, I configured it on 8090.

Here is how you can customize the running port of Jenkins.

# For http:
$ java -jar jenkins.war --httpPort='****'
# For https
$ java -jar jenkins.war --httpsPort='****'

**** must be replaced with the actual port value.
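If you installed Jenkins from the RPM package as above, you can make the port change permanent instead of passing it on the command line each time. On such installs the service reads its settings from /etc/sysconfig/jenkins; a sketch (adjust the path if your distribution differs):

```shell
# Persist the HTTP port for the Jenkins service (RPM-based installs)
sudo sed -i 's/^JENKINS_PORT=.*/JENKINS_PORT="8090"/' /etc/sysconfig/jenkins
sudo service jenkins restart
```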

You can now access Jenkins using that port. If you have installed Jenkins on your local machine, type http://localhost:port_number in your browser to access Jenkins. For me, it is http://localhost:8090/ since I configured Jenkins on port 8090 and didn’t configure SSL. If you want, you can configure SSL.

If you are using AWS EC2 for your Jenkins server, you have two options.

Option 1: Allow your port on Security Group of instance so that you can access your Jenkins server. However, I won’t recommend that.

Option 2: Redirect the traffic using iptables.

$ sudo iptables -I INPUT 1 -p tcp --dport 8090 -j ACCEPT
$ sudo iptables -I INPUT 1 -p tcp --dport 80 -j ACCEPT
$ sudo iptables -A PREROUTING -t nat -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 8090
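Note that iptables rules added this way are lost on reboot. On Amazon Linux you can persist them with the iptables service wrapper; a sketch (other distributions use iptables-save directly):

```shell
# Save the current rules so they are restored at boot (Amazon Linux / RHEL-style)
sudo service iptables save
# Equivalent on distributions without the service wrapper:
# sudo iptables-save | sudo tee /etc/sysconfig/iptables
```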

Similarly, for SSL, use port 443 instead of 80. However, for SSL you also need to configure Jenkins itself. Let me know in the comments or by e-mail if you would like the steps for SSL integration with Jenkins; I will create a separate article for that.

At this point you should be able to see the Jenkins home page using the public DNS name of your EC2 instance.

When you first access Jenkins, you need to unlock it by retrieving the initial admin password.

You can get the initial password by executing the command below:

$ sudo cat /var/lib/jenkins/secrets/initialAdminPassword

Let us now install required plugins and create an admin user.

Install Jenkins Plugins

At the Jenkins home page, on the left menu select Manage Jenkins -> Manage Plugins, select the Available tab, and search for the following plugin:

  • Pipeline AWS – AWS Integration
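If you prefer the command line, the same plugin can be installed with the Jenkins CLI jar, which every Jenkins server serves at /jnlpJars/jenkins-cli.jar. A sketch, using the port and admin credentials from this setup (replace <password> with your own):

```shell
# Download the CLI jar from the running Jenkins and install the Pipeline AWS plugin
wget http://localhost:8090/jnlpJars/jenkins-cli.jar
java -jar jenkins-cli.jar -s http://localhost:8090/ -auth admin:<password> \
  install-plugin pipeline-aws -restart
```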

Create the Pipeline AWS Jenkins Credential

At the Jenkins home page, on the left menu click Credentials -> System, select the global scope, and again in the left menu click Add Credentials:

You need to provide AWS IAM credentials so that the Jenkins Pipeline AWS plugin can access your S3 bucket. In the credential form, insert the Access Key ID and the Secret Access Key of an IAM user you created for this purpose.

You can define an ID that will be used in the Jenkinsfile configuration, as in the following example:

withAWS(region:'<your-bucket-region>', credentials:'<Jenkins-Credential-ID-AWS>') {
  // AWS-related steps run here with the given credentials
}

Jenkins-Credential-ID-AWS is the ID of the credential you’ve just created.

You will now need to create a Jenkinsfile.

For your convenience, I am pasting a sample Jenkinsfile below.

pipeline {
  agent {
    docker {
      image 'node:10-alpine'
      args '-p 20001-20100:3000'
    }
  }
  environment {
    CI = 'true'
    HOME = '.'
    npm_config_cache = 'npm-cache'
  }
  stages {
    stage('Install Packages') {
      steps {
        sh 'npm install'
      }
    }
    stage('Test and Build') {
      parallel {
        stage('Run Tests') {
          steps {
            sh 'npm run test'
          }
        }
        stage('Create Build Artifacts') {
          steps {
            sh 'npm run build'
          }
        }
      }
    }
    stage('Deploy to S3') {
      parallel {
        stage('Staging') {
          when {
            branch 'staging'
          }
          steps {
            withAWS(region:'<your-bucket-region>', credentials:'<AWS-Staging-Jenkins-Credential-ID>') {
              s3Delete(bucket: '<bucket-name>', path:'**/*')
              s3Upload(bucket: '<bucket-name>', workingDir:'build', includePathPattern:'**/*')
            }
            mail(subject: 'Staging Build', body: 'New Deployment to Staging', to: '')
          }
        }
        stage('Production') {
          when {
            branch 'master'
          }
          steps {
            slackSend channel: "#<channel>", message: "Deployment Starting: ${env.JOB_NAME}, build number ${env.BUILD_NUMBER} (<${env.BUILD_URL}|Open>)"
            echo 'Deploying to RDG AWS s3 bucket.'
            withAWS(region:'<your-bucket-region>', credentials:'<AWS-Production-Jenkins-Credential-ID>') {
              s3Delete(bucket: '<bucket-name>', path:'**/*')
              s3Upload(bucket: '<bucket-name>', workingDir:'build', includePathPattern:'**/*')
            }
            mail(subject: 'Production Build', body: 'New Deployment to Production', to: '')
          }
        }
      }
    }
  }
  post {
    success {
      slackSend channel: "#<channel>", color: "good", message: "Deployment Complete: ${env.JOB_NAME}, build number ${env.BUILD_NUMBER} (<${env.BUILD_URL}|Open>)"
    }
    failure {
      slackSend channel: "#<channel>", color: "danger", message: "Deployment Failed: ${env.JOB_NAME}, build number ${env.BUILD_NUMBER} (<${env.BUILD_URL}|Open>)"
    }
  }
}

Your Jenkinsfile has three clear stages:

  1. Install Packages
  2. Test & Build
    1. Create Build Artifacts
    2. Run Tests
  3. Deployment to S3
    1. Staging
    2. Production

The Pipeline above uses a Docker agent to test the application and to create the final build files. The deployment stage removes the files from a given S3 bucket, uploads the new build files, and sends notifications via Slack and e-mail to inform that a new deployment has completed successfully.

One thing to note: the agent definition uses a port range [args '-p 20001-20100:3000'] to avoid the job failing in case a given port is already in use.

At this step, your Jenkins pipeline is ready. Now, you need to integrate it with Git to complete the automation.

Setting up Git Webhook

A webhook is nothing but an HTTP POST callback. It allows Jenkins to subscribe to repository events such as pushes.

The webhook sends a POST request to your Jenkins instance to notify it that an action was triggered (PR, push, commit, …), so that it runs a new job over the code changes.

To configure the Git webhook, go to your repository Settings, select Webhooks in the left menu, and click the Add webhook button.

Set the Payload URL to your Jenkins_URL/github-webhook/ (in my case: http://localhost:8090/github-webhook/) and the Content type to application/json.

Select the individual events that you want to trigger a new Jenkins build. I recommend selecting Pushes and Pull requests.

You should see a similar message after a PR is created and it passes Jenkins validation!
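You can also check the endpoint manually by simulating the POST that GitHub sends. A hedged sketch, using the local Jenkins URL from this setup (with the GitHub plugin active, Jenkins should answer with an HTTP 200):

```shell
# Simulate a GitHub "ping" event against the Jenkins webhook endpoint
curl -i -X POST http://localhost:8090/github-webhook/ \
  -H "Content-Type: application/json" \
  -H "X-GitHub-Event: ping" \
  -d '{"zen": "test", "hook_id": 1}'
```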

The last part left is to create an S3 bucket and configure it for static website hosting.

Create and configure an S3 bucket for static web hosting

  1. Sign in to the AWS Management Console and open the Amazon S3 console.
  2. In the Bucket name list, choose the name of the bucket that you want to enable static website hosting for.
  3. Choose Properties.
  4. Choose Static website hosting
  5. Choose Use this bucket to host a website
  6. Set the Index document to your entry file, for example index.html
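The same console steps can be done with the AWS CLI, assuming it is configured with credentials for your account (<bucket-name> is the bucket created above):

```shell
# Enable static website hosting and set index.html as the index document
aws s3 website s3://<bucket-name>/ --index-document index.html
# Optionally verify the configuration
aws s3api get-bucket-website --bucket <bucket-name>
```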

If you want to use your custom domain name, you need to create a second bucket and, under Static website hosting, select the Redirect requests option. That option allows you to define the domain that will be used to access the hosting bucket. Remember that this secondary bucket will use an HTTP 301 status code to redirect the requests.

However, you can access your website without a custom domain name; you just need to copy the bucket’s website endpoint.
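The website endpoint follows a predictable pattern, so you can also construct the URL yourself. A sketch with hypothetical bucket and region values (note that a few older regions use a dot instead of the dash before the region):

```shell
# Build the S3 website endpoint URL for a bucket/region pair
BUCKET="my-bucket"      # hypothetical bucket name
REGION="us-east-1"      # hypothetical region
ENDPOINT="http://${BUCKET}.s3-website-${REGION}.amazonaws.com"
echo "$ENDPOINT"
```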

So far we’ve seen the first example of Jenkins CI/CD on AWS. We will explore the other two examples in the next articles.

Thanks for reading! Please stay tuned.

If you find this article useful, feel free to share it. If you face any issues while following it, feel free to contact me; I will be happy to assist with your queries.


  1. ‘Personal API Tokens’. The GitHub Blog, 17 May 2013.
  2. ‘Pipeline’. Accessed 28 Jan. 2019.
  3. ‘Pipeline as Code’. Accessed 28 Jan. 2019.
  4. ‘CI/CD Pipeline with Jenkins on AWS’ by Edmar Barros.