How to use Jenkins to build automatically on Ubuntu

Jenkins is an open source automation server that allows you to build pipelines to automate the process of building, testing, and deploying applications. In this guide, you will implement basic workflows to speed up the continuous integration and continuous delivery (CI/CD) process.

Before you begin##

sudo apt-get update && sudo apt-get upgrade

Note that this tutorial is written for non-root users. Commands that require elevated privileges are prefixed with sudo.

Initial assumptions##

This guide is aimed at DevOps professionals and therefore assumes:

  1. The local workstation will be used for development and testing.
  2. A Linode will be used as the remote Jenkins server.
  3. Both will use Ubuntu 16.04.
  4. Jenkins will be used mainly through the newer [Blue Ocean](https://jenkins.io/projects/blueocean/) web interface.
  5. Both the workstation and the remote Linode need Docker installed in advance. For detailed instructions, refer to our [How to Install Docker and Pull Images for Container Deployment](https://www.linode.com/docs/applications/containers/how-to-install-docker-and-pull-images-for-container-deployment/) guide.
  6. For the purposes of this guide, only the Jenkins master server is used.
  7. You will need a GitHub account; similar workflows are available for Bitbucket and GitLab.
  8. You will also need a Docker Hub account, or an account with a similar registry.

Understand how Jenkins works##

Before automating the workflow, it is necessary to understand the basic CI/CD process. The following figure illustrates this:

The most basic process consists of three stages: build, test, and deploy. Every time a change is made in the distributed version control system, an automation loop is triggered on the Jenkins server. The complete set of instructions for running the process, the Jenkinsfile, is located in the root directory of the source repository. This single file tells the server what to do, when to do it, and how to perform those tasks.

Write a Node.js application example##

As mentioned in the previous section, the automated process starts with a commit to the version control system.

Create a new repository in GitHub. This guide uses a simple Node.js application to demonstrate how Jenkins Pipelines work. Select the corresponding .gitignore, and don't forget to initialize it with a README:

Clone the new repository to the local workstation:

 git clone git@github.com:<GITHUB_USERNAME>/jenkins-guide.git

Open your favorite text editor and create the file app.js in the root directory of the repository. Add the following content:

~/jenkins-guide/app.js

'use strict';

const express = require('express');
const app = express();

// Server connection
const PORT = 9000;
const HOST = '0.0.0.0';

// Application content
const os = ['Windows', 'macOS', 'Linux'];

// Web Server
app.get('/', function(req, res) {
  res.json(os);
});

// Console output
app.listen(PORT, HOST);
console.log(`Running on http://${HOST}:${PORT}`);

This application uses the Express web server to serve a single JSON output to the browser on port 9000. Next, save test.js to the same location, the root of the repository.

~/jenkins-guide/test.js

var supertest = require("supertest");
var should = require("should");

var server = supertest.agent("http://nodeapp-dev:9000");

// Unit Test
describe("Webapp Status", function() {

    // Test 1 - HTTP status
    it("Expect HTTP status 200", function(done) {
        server
        .get("/")
        .expect("Content-type", /text/)
        .expect(200)
        .end(function(err, res) {
            res.status.should.equal(200);
            done();
        });
    });

    // Test 2 - Control Tests
    it("Mocha Control Test", function(done) {
        (1).should.be.exactly(1).and.be.a.Number();
        done();
    });
});

This is a simplified test suite using supertest and should. It only has two tests: the first one checks the HTTP status, and it is expected to be 200. The second one is not a real test, but a control that always passes.

This example will use two Docker containers, one for app.js using Express and the other for the test suite using Mocha. Each image has its own folder, which contains the corresponding Dockerfile and package.json.

~/jenkins-guide/express-image/Dockerfile

FROM node:6-alpine

# Create a server directory
RUN mkdir -p /home/node/app
WORKDIR /home/node/app

# Install server dependencies
COPY /express-image/package.json /home/node/app
RUN npm install

# Copy node Application
COPY app.js /home/node/app

# Open port
EXPOSE 9000

CMD ["npm","start"]

This image runs app.js by default when it starts. You can think of it as the "dockerized" version of the web application.

{" name":"express-image","version":"1.0.0","description":"Example Node Application","author":"Your name","repository":{"type":"git","url":"git+https://github.com/<YOUR_USERNAME>/<REPOSITORY_NAME>.git"},"license":"ISC","scripts":{"start":"node app.js"},"dependencies":{"express":"^4.13.3"}}

~/jenkins-guide/test-image/Dockerfile

FROM node:6-alpine

# Create reports directory
RUN mkdir -p /JUnit

# Create a server directory
RUN mkdir -p /home/node/tests
WORKDIR /home/node/tests

# Install app dependencies
COPY /test-image/package.json /home/node/tests
RUN npm install

# Copy test source
COPY test.js /home/node/tests

EXPOSE 9000

CMD ["npm","test"]

This image creates the reports folder and installs the dependencies from its package.json. On startup, it runs the Mocha tests.
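The test image's package.json is not reproduced here in full; a plausible version (the dependency versions and description are assumptions) that matches the mochaFile configuration described below would be:

```
{
  "name": "test-image",
  "version": "1.0.0",
  "description": "Example Mocha test image",
  "scripts": {
    "test": "mocha test.js --reporter mocha-junit-reporter --reporter-options mochaFile=/JUnit/reports.xml"
  },
  "dependencies": {
    "mocha": "^4.0.1",
    "mocha-junit-reporter": "^1.15.0",
    "should": "^13.1.3",
    "supertest": "^3.0.0"
  }
}
```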

This JSON file lists all the required dependencies, including mocha-junit-reporter, which Jenkins uses to store test results. Note that the test script is configured with the mochaFile option pointing at the reports folder created in the image's Dockerfile. Your final project layout will be similar to:

**Note:** The folder structure and the use of two Docker containers are unusual, but they are used here for teaching purposes, to demonstrate Jenkins Pipeline features.

Run your application manually##

Before starting the real automation process, you first need to understand what is going to be automated.
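The manual workflow can be sketched with the commands below. This is a sketch, not part of the pipeline itself: it assumes you are in the repository root with Docker installed, and it reuses the image and container names from the Dockerfiles above.

```shell
# Build both images from the repository root
docker build -f express-image/Dockerfile -t nodeapp-dev:trunk .
docker build -f test-image/Dockerfile -t test-image:latest .

# Start the application container, then run the test container against it
docker run --name nodeapp-dev --network="bridge" -d -p 9000:9000 nodeapp-dev:trunk
docker run --name test-image -v "$PWD":/JUnit --network="bridge" \
    --link=nodeapp-dev -d -p 9001:9000 test-image:latest

# Check the application output and the generated test report
curl http://localhost:9000/
cat reports.xml

# Clean up the containers when finished
docker stop nodeapp-dev test-image
docker system prune -f
```

These are the same shell commands the Jenkinsfile will run for you later in this guide.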

You have just completed the entire build, test and deploy process of this fictional web application. Now it's time to automate.

Install Jenkins and Blue Ocean##

Jenkins provides many installation options:

Install Jenkins###

Using the packages maintained by the Jenkins project allows you to run a newer version than the one included in the distribution's package manager.

  1. Download and add the repository key for the current stable version of Jenkins: wget -q -O - https://pkg.jenkins.io/debian-stable/jenkins.io.key | sudo apt-key add -
  2. Add the new repository to your sources.list: sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
  3. Update your system and install Jenkins: sudo apt update && sudo apt install jenkins
  4. Now that Jenkins is installed, grant its user permission to run Docker commands: sudo usermod -aG docker jenkins
  5. Controlling the daemon is simple: run sudo service jenkins with start, stop, restart, or status. Start the service to check your installation: sudo service jenkins start
  6. If everything works, enable the service at boot: sudo systemctl enable jenkins
  7. Restart your server from the Linode Manager to apply these changes.

Warning: Establishing security parameters for a remote Jenkins installation is beyond the scope of this guide. However, pay attention to these key points that need to be addressed in a production environment:

Setting up Jenkins###

Scripted and declarative pipeline syntax##

Jenkins provides two different options for Jenkinsfile syntax: scripted and declarative.

Both support continuous delivery and Jenkins plugins. The scripted syntax is based on the Groovy programming environment, so it is more flexible and complete. The declarative syntax, on the other hand, "was created to offer a simpler and more opinionated syntax for authoring Jenkins Pipeline," so it is better suited for everyday automated builds. You can learn more in the syntax comparison section of the Jenkins documentation.
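As a rough illustration (a sketch only, not from the original guide), the same single build step could be expressed in each syntax like this:

```groovy
// Scripted pipeline: plain Groovy, maximum flexibility
node {
    stage('Build') {
        sh 'docker build -f express-image/Dockerfile -t nodeapp-dev:trunk .'
    }
}
```

```groovy
// Declarative pipeline: a fixed, more opinionated structure
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'docker build -f express-image/Dockerfile -t nodeapp-dev:trunk .'
            }
        }
    }
}
```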

This guide uses the declarative syntax to illustrate Jenkins processes, because it is designed to be easier to implement and understand.

Jenkinsfile structure##

The declarative pipeline syntax is very intuitive. The most basic layout is similar to the layout shown below:
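A minimal declarative skeleton might look like the following (a sketch: the stage name is a placeholder, and the steps shown are ones used later in this guide):

```groovy
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello from the pipeline'
            }
        }
    }
    post {
        always {
            echo 'This block runs after the stages complete'
        }
    }
}
```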

Code blocks are delimited by curly braces ({ and }) and no semicolons are used. Each statement must be on its own line. The steps you perform are the core of the Jenkinsfile. Some common steps are sh, echo, junit, archiveArtifacts, retry, timeout, and deleteDir, all of which appear in the pipeline later in this guide.

All of these operations can be performed inside the agent, or you can instruct Jenkins to perform any of them remotely via SSH. As you can see, the automation possibilities are endless. In a simple scenario, a single pipeline that executes its stages sequentially is enough to achieve the desired final state, but you can also define pipelines that run stages in parallel when needed. For more information about Jenkins declarative pipeline syntax, refer to the official documentation.

Start using Pipelines##

From here, you can obtain the following valuable information: 1) your build number, 2) the console output of each step, 3) a selector to choose a stage for further analysis, 4) tabs with information about commits, test results, and stored artifacts, 5) a control to replay your build, 6) a visual pipeline editor, and 7) your pipeline settings.

Use Jenkins to automate the whole process###

The Jenkinsfile template uses a very basic pipeline structure with only three stages. You can customize it with as many stages as needed. The final pipeline structure is determined by the complexity of the project and the development guidelines you must follow. Now that you know the Node.js example, you can design a pipeline that automates each stage. For the purposes of this guide, the final pipeline should:

  - Build both Docker images in parallel.
  - Run the test suite and publish a trunk image to Docker Hub.
  - Deploy a production image, but only for changes on the master branch.
  - Save the JUnit report and application artifacts.
  - Clean up containers and the workspace to avoid conflicts in future builds.

Submit changes to Pipeline###

First, edit the Jenkinsfile and paste the following pipeline. Replace <DockerHub Username> with your own Docker Hub username.

~/jenkins-guide/Jenkinsfile

pipeline {
    environment {
        DOCKER = credentials('docker-hub')
    }
    agent any
    stages {
// Building your Test Images
        stage('BUILD') {
            parallel {
                stage('Express Image') {
                    steps {
                        sh 'docker build -f express-image/Dockerfile \
                        -t nodeapp-dev:trunk .'
                    }
                }
                stage('Test-Unit Image') {
                    steps {
                        sh 'docker build -f test-image/Dockerfile \
                        -t test-image:latest .'
                    }
                }
            }
            post {
                failure {
                    echo 'This build has failed. See logs for details.'
                }
            }
        }
// Performing Software Tests
        stage('TEST') {
            parallel {
                stage('Mocha Tests') {
                    steps {
                        sh 'docker run --name nodeapp-dev --network="bridge" -d \
                        -p 9000:9000 nodeapp-dev:trunk'
                        sh 'docker run --name test-image -v $PWD:/JUnit --network="bridge" \
                        --link=nodeapp-dev -d -p 9001:9000 \
                        test-image:latest'
                    }
                }
                stage('Quality Tests') {
                    steps {
                        sh 'docker login --username $DOCKER_USR --password $DOCKER_PSW'
                        sh 'docker tag nodeapp-dev:trunk <DockerHub Username>/nodeapp-dev:latest'
                        sh 'docker push <DockerHub Username>/nodeapp-dev:latest'
                    }
                }
            }
            post {
                success {
                    echo 'Build succeeded.'
                }
                unstable {
                    echo 'This build returned an unstable status.'
                }
                failure {
                    echo 'This build has failed. See logs for details.'
                }
            }
        }
// Deploying your Software
        stage('DEPLOY') {
            when {
                branch 'master'  //only run these steps on the master branch
            }
            steps {
                retry(3) {
                    timeout(time: 10, unit: 'MINUTES') {
                        sh 'docker tag nodeapp-dev:trunk <DockerHub Username>/nodeapp-prod:latest'
                        sh 'docker push <DockerHub Username>/nodeapp-prod:latest'
                        sh 'docker save <DockerHub Username>/nodeapp-prod:latest | gzip > nodeapp-prod-golden.tar.gz'
                    }
                }
            }
            post {
                failure {
                    sh 'docker stop nodeapp-dev test-image'
                    sh 'docker system prune -f'
                    deleteDir()
                }
            }
        }
// JUnit reports and artifacts saving
        stage('REPORTS') {
            steps {
                junit 'reports.xml'
                archiveArtifacts(artifacts: 'reports.xml', allowEmptyArchive: true)
                archiveArtifacts(artifacts: 'nodeapp-prod-golden.tar.gz', allowEmptyArchive: true)
            }
        }
// Doing containers clean-up to avoid conflicts in future builds
        stage('CLEAN-UP') {
            steps {
                sh 'docker stop nodeapp-dev test-image'
                sh 'docker system prune -f'
                deleteDir()
            }
        }
    }
}

This complete Jenkinsfile is written in declarative syntax. If you read it carefully, you will notice that it describes the same process used during the application deployment in the previous section. The following sections analyze the Jenkinsfile in more detail.

Agent and environment variables###

The first block defines a globally available environment variable, DOCKER. You can tell it applies globally because it is inside the pipeline block but outside any stage block. Next comes the agent statement, which means Jenkins can use any (server) agent.

~/jenkins-guide/Jenkinsfile

pipeline {
    environment {
        DOCKER = credentials('docker-hub')
    }
    agent any

The DOCKER definition is done through the credentials function. This allows you to use confidential login information without including it in the Jenkinsfile. To configure this key pair:

In this example's pipeline, DOCKER = credentials('docker-hub') creates two environment variables, DOCKER_USR and DOCKER_PSW, which can be used to log in to your Docker Hub account.
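For illustration only (the values are placeholders, not real credentials; in a real pipeline Jenkins injects these variables automatically from the credential with ID docker-hub), a sh step sees something equivalent to:

```shell
# Placeholder values; never hard-code real secrets like this.
DOCKER_USR="example-user"
DOCKER_PSW="example-password"

# The pipeline's Quality Tests stage then uses them like this:
echo "docker login --username $DOCKER_USR --password $DOCKER_PSW"
```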

Build phase###

The first thing you will notice about the parallel code block is that it is self-explanatory: it runs its sub-stages in parallel. This is very useful for building the two Docker images with the same shell commands used before. Each image is built in its own step, each within an independent stage.

~/jenkins-guide/Jenkinsfile

// Building your Test Images
        stage('BUILD') {
            parallel {
                stage('Express Image') {
                    steps {
                        sh 'docker build -f express-image/Dockerfile \
                        -t nodeapp-dev:trunk .'
                    }
                }
                stage('Test-Unit Image') {
                    steps {
                        sh 'docker build -f test-image/Dockerfile \
                        -t test-image:latest .'
                    }
                }
            }
            post {
                failure {
                    echo 'This build has failed. See logs for details.'
                }
            }
        }

After the parallel stages are closed, you encounter the post conditional. post means the definition applies to the entire BUILD stage. In this case, only the failure condition is set, so it runs only if any part of the BUILD stage fails. Configuring the different communication tools Jenkins provides is beyond the scope of this guide.

Testing phase###

Parallel execution is also used during the test phase:

~/jenkins-guide/Jenkinsfile

// Performing Software Tests
        stage('TEST') {
            parallel {
                stage('Mocha Tests') {
                    steps {
                        sh 'docker run --name nodeapp-dev --network="bridge" -d \
                        -p 9000:9000 nodeapp-dev:trunk'
                        sh 'docker run --name test-image -v $PWD:/JUnit --network="bridge" \
                        --link=nodeapp-dev -d -p 9001:9000 \
                        test-image:latest'
                    }
                }
                stage('Quality Tests') {
                    steps {
                        sh 'docker login --username $DOCKER_USR --password $DOCKER_PSW'
                        sh 'docker tag nodeapp-dev:trunk <DockerHub Username>/nodeapp-dev:latest'
                        sh 'docker push <DockerHub Username>/nodeapp-dev:latest'
                    }
                }
            }
            post {
                success {
                    echo 'Build succeeded.'
                }
                unstable {
                    echo 'This build returned an unstable status.'
                }
                failure {
                    echo 'This build has failed. See logs for details.'
                }
            }
        }

The Mocha Tests stage starts the two images and runs the automated tests; the resulting reports.xml file is saved to the Jenkins workspace. The Quality Tests stage, on the other hand, publishes the trunk version of your application to Docker Hub. It first issues a Docker login command (using the predefined credentials), then retags the image and pushes it.

Again, there is a post code block, but this time it has notifications for success, unstable, and failure. Remember that you can run any code here, not just notifications.

Deployment phase###

A different type of block is introduced in this stage: when. As the name implies, this clause is executed only when a condition is met. In this example, the code runs only when a change to the master branch is detected; commits to other branches will not trigger this step of the pipeline.

Within the steps, you can optionally configure the retry and timeout parameters. The example above shows a nested usage: the timeout for the image-building process is 10 minutes, and up to three retries are attempted if the timer expires.

The post block is designed to clean up in the event of a failure. No notifications have been set for this stage.

Reporting and cleanup phase###

The last two stages of the pipeline are relatively simple. The junit statement allows Jenkins to use the reports.xml file generated by your Mocha image, and the archiveArtifacts command saves the report and application files to a persistent location. By default, that location is inside JENKINS_HOME: /var/lib/jenkins/jobs/<REPOSITORY>/branches/master/builds/lastStableBuild. If needed, you can configure a custom location in Jenkins's general settings.

Working with branches###

It is time to commit the complete Jenkinsfile to the Jenkins server and trigger a run of the new pipeline. In order to test the when block discussed earlier, the changes will be pushed to a different branch.

Configure automatic triggers###

You can set Jenkins to scan your repository regularly. To do this, click the gear icon on the Pipeline view again, and then click Configuration. Look for the Scan Repository Triggers section and check the box to scan periodically if not otherwise run. You can choose any interval; for this example, one minute is selected.

Failing tests (unstable pipeline)###

So far, everything should work as expected without errors. But what happens when you encounter an error?

~/jenkins-guide/app.js

// Web Server
app.get('/ERROR', function(req, res) {
  res.json(os);
});

A failing stage###

Now, introduce an error in the BUILD stage:

~/jenkins-guide/express-image/package.json

"dependencies": {
  "express-ERROR": "^4.13.3"
}

  1. Scroll down and check the errors:

  2. Fix the error in express-image/package.json.

Merge Pull Requests###

Merge the trunk branch into master. This triggers a run of the entire pipeline, including the deployment stage:

git checkout master
git merge trunk
git push origin master

Beyond the Blue Ocean dashboard##

The Blue Ocean interface is still under development, which means that many aspects of Jenkins are not yet managed by the new interface. Here are some of the most common screens.

  1. Click the gear icon to enter the repository menu. There, click Status in the left sidebar. You will see your branches and some general information:

  2. If you click the master branch, you will see a more detailed dashboard:

From this view, you can see a lot of useful information, such as logs, artifacts, changes, test result trends, and more.

The road ahead##

This guide introduces the basic automated workflow of Jenkins and Blue Ocean, but there are many things you can do. Just to name a few possibilities:

More information##

For additional information on this topic, you may want to refer to the following resources. Although these are provided in the hope that they are useful, please note that we cannot guarantee the accuracy or timeliness of externally hosted materials.


Reference: https://www.linode.com/docs/development/ci/automate-builds-with-jenkins-on-ubuntu/
