Jenkins is an open source automation server that allows you to build pipelines to automate the process of building, testing, and deploying applications. In this guide, you will implement basic workflows to speed up the continuous integration and continuous delivery (CI/CD) process.
Note that this tutorial is written for a non-root user. Commands that require elevated privileges are prefixed with `sudo`.

Update your system:

```
sudo apt-get update && sudo apt-get upgrade
```
This guide is aimed at DevOps professionals and therefore assumes a working knowledge of the tools involved:
Before automating the workflow, it is necessary to understand the basic CI/CD process. The following figure illustrates this:
The most basic process consists of three stages: build, test, and deploy. Every time a change is made in the distributed version control system, an automation loop is triggered on the Jenkins server. The complete set of instructions for running the process, the `Jenkinsfile`, is located in the root directory of the source repository. This single file tells the server what to do, when to do it, and how to perform those tasks.
As mentioned in the previous section, the automated process starts with a commit to the version control system.
Create a new repository on GitHub. This guide uses a simple Node.js application to demonstrate how Jenkins pipelines work. Select the corresponding `.gitignore`, and don't forget to initialize it with a `README`:
Clone the new repository to the local workstation:
```
git clone git@github.com:<GITHUB_USERNAME>/jenkins-guide.git
```
Open your favorite text editor and create the file `app.js` in the root directory of the repository. Add the following content:
~/jenkins-guide/app.js

```javascript
'use strict';

const express = require('express');
const app = express();

// Server connection
const PORT = 9000;
const HOST = '0.0.0.0';

// Application content
const os = ['Windows', 'macOS', 'Linux'];

// Web Server
app.get('/', function (req, res) {
  res.json(os);
});

// Console output
app.listen(PORT, HOST);
console.log(`Running on http://${HOST}:${PORT}`);
```
This application uses the Express web server to deliver a single JSON output to the browser on port 9000. Next, save the following as `test.js` in the same location, the root of the repository.
~/jenkins-guide/test.js

```javascript
var supertest = require("supertest");
var should = require("should");

var server = supertest.agent("http://nodeapp-dev:9000");

// Unit Test
describe("Webapp Status", function () {

  // Test 1 - HTTP status
  it("Expect HTTP status 200", function (done) {
    server
      .get("/")
      .expect("Content-type", /text/)
      .expect(200)
      .end(function (err, res) {
        res.status.should.equal(200);
        done();
      });
  });

  // Test 2 - Control Tests
  it("Mocha Control Test", function (done) {
    (1).should.be.exactly(1).and.be.a.Number();
    done();
  });
});
```
This is a simplified test suite using `supertest` and `should`. It has only two tests: the first checks that the HTTP status is 200, as expected; the second is not a real test, but a control that always passes.
This example uses two Docker containers: one for `app.js` using Express, and another for the test suite using Mocha. Each image has its own folder containing the corresponding `Dockerfile` and `package.json`.
Save the following `Dockerfile` and `package.json` in the `express-image` directory.

~/jenkins-guide/express-image/Dockerfile

```dockerfile
FROM node:6-alpine

# Create a server directory
RUN mkdir -p /home/node/app
WORKDIR /home/node/app

# Install server dependencies
COPY /express-image/package.json /home/node/app
RUN npm install

# Copy node Application
COPY app.js /home/node/app

# Open port
EXPOSE 9000

CMD ["npm","start"]
```
This image runs `app.js` by default when it starts. You can think of it as the "dockerized" version of the web application.
Save the corresponding `package.json`, which is copied into the new image at build time:

~/jenkins-guide/express-image/package.json

```json
{
  "name": "express-image",
  "version": "1.0.0",
  "description": "Example Node Application",
  "author": "Your name",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/<YOUR_USERNAME>/<REPOSITORY_NAME>.git"
  },
  "license": "ISC",
  "scripts": {
    "start": "node app.js"
  },
  "dependencies": {
    "express": "^4.13.3"
  }
}
```
Save the following `Dockerfile` in the `test-image` directory:

~/jenkins-guide/test-image/Dockerfile

```dockerfile
FROM node:6-alpine

# Create reports directory
RUN mkdir -p /JUnit

# Create a server directory
RUN mkdir -p /home/node/tests
WORKDIR /home/node/tests

# Install app dependencies
COPY /test-image/package.json /home/node/tests
RUN npm install

# Copy test source
COPY test.js /home/node/tests

EXPOSE 9000

CMD ["npm","test"]
```
This image creates the reports folder and installs the dependencies from `package.json`. At startup, it performs the Mocha tests.

This JSON file contains all the required dependencies, including `mocha-junit-reporter`, which Jenkins needs to store the test results. Note that the test script is configured with the `mochaFile` option to use the report folder specified in the image's `Dockerfile`. Your final project layout will be similar to:
**Note:** The folder structure and the use of two Docker containers are unusual, but they are used here for instructive purposes to demonstrate Jenkins Pipeline features.
Before starting the actual automation, you first need to understand what is going to be automated.
Run the `nodeapp-dev` container first. The `--network` flag is used to avoid conflicts with other container networks. Note that port 9000 is opened and the `-d` flag runs the container in detached mode. Once started, open a browser and visit `http://localhost:9000` to check:

```
sudo docker run --name nodeapp-dev --network="bridge" -d -p 9000:9000 nodeapp-dev:trunk
```

Now run the `test-image` container. In order to communicate with `nodeapp-dev`, it is very important to use the same network along with the `--link` flag. Notice that the container's `JUnit` report folder is mounted at the current repository root; this is necessary for `reports.xml` to be written on the host. Run the container in interactive mode with the `-it` flag so the results are printed to `stdout`:

```
sudo docker run --name test-image -v $PWD:/JUnit --network="bridge" --link=nodeapp-dev -it -p 9001:9000 test-image:latest npm run mocha
```

Remove the `test-image` container and run it again in detached mode to test the `JUnit` output. The `reports.xml` file should be saved afterward:

```
sudo docker rm -f test-image
sudo docker run --name test-image -v $PWD:/JUnit --network="bridge" --link=nodeapp-dev -d -p 9001:9000 test-image:latest
```

If necessary, stop both containers:

```
sudo docker stop test-image nodeapp-dev
```

You have just completed the entire build, test, and deploy process for this fictional web application. Now it's time to automate it.
Jenkins provides several installation options:

- Download the `jenkins.war` file. This is a fast and effective solution that works with very few prerequisites, but it is harder to maintain and update.
- Use the packages maintained by the Jenkins project. This gives you a newer version than the one included in your distribution's package manager.
Add the Jenkins repository to your `sources.list`:

```
sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
```

You can manage the Jenkins service with `sudo service jenkins` followed by `start`, `stop`, `restart`, or `status`. Start the service to check your installation:

```
sudo service jenkins start
```

Note: by adding the `jenkins` user to the Docker group, you are technically granting it `root` permissions.

Jenkins provides two different options for `Jenkinsfile` syntax:
Both support continuous delivery and Jenkins plugins. Scripted syntax is based on the Groovy programming environment, so it is more complete; declarative syntax, on the other hand, "was created to offer a simpler and more opinionated syntax for authoring Jenkins Pipeline" and is therefore well suited for everyday automated builds. You can learn more in the syntax comparison section of the Jenkins documentation.
This guide will use Declarative syntax to illustrate the Jenkins process, because its design is easier to implement and understand.
The declarative pipeline syntax is very intuitive. The most basic layout is similar to the layout shown below:
- `pipeline`: all Jenkinsfiles should start with this statement at the top. It marks the beginning of a new pipeline.
- `agent`: defines the working environment, usually a Docker image. The `any` statement indicates that the pipeline can use any available agent.
- `stages`: this block is a collection of `stage` directives.
- `stage`: groups one or more `steps`. You can use as many stages as you need, which is very useful when working on a complex model that requires detailed debugging of each stage.
- `steps`: here you define your actions. A stage can group many steps, and each step is usually linked to one specific task/command.

Code blocks are delimited by curly braces (`{` and `}`), and no semicolons are used. Each statement must be on its own line. The steps you perform are the core of the `Jenkinsfile`. Some common steps are:
All these operations can be performed inside the agent, or you can instruct Jenkins to perform any of them remotely via SSH. As you can see, the automation possibilities are endless. In a simple scenario, a single pipeline that executes its stages sequentially is enough to achieve the desired final state, but you can define pipelines to run in parallel when needed. For more information about Jenkins's declarative pipeline syntax, please refer to the official documentation.
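To make the common step types concrete, here is a small sketch of a stage that combines `echo` and `sh` with nested `retry` and `timeout` options; the commands and the script name are illustrative and not part of the original guide:

```groovy
stage('Example') {
    steps {
        // Print a message to the build log
        echo 'Starting the example stage'
        // Run an arbitrary shell command on the agent
        sh 'uname -a'
        // Retry a flaky operation up to three times,
        // giving up on any single attempt that exceeds one minute
        retry(3) {
            timeout(time: 1, unit: 'MINUTES') {
                sh './deploy-script.sh'
            }
        }
    }
}
```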
First create a `Jenkinsfile` in the `jenkins-guide` directory on your workstation. This is just a template, but it contains all the code needed to start the pipeline:

~/jenkins-guide/Jenkinsfile

```groovy
pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        echo 'This is the Build Stage'
      }
    }
    stage('Test') {
      steps {
        echo 'This is the Testing Stage'
      }
    }
    stage('Deploy') {
      steps {
        echo 'This is the Deploy Stage'
      }
    }
  }
}
```

From here, you can obtain valuable information: 1) your build number, 2) the console output of each step, 3) the stage selected for further analysis, 4) tabs with information about commit changes, test results, and stored artifacts, 5) a replay of your build, 6) a visual editor for the pipeline, 7) your pipeline settings.
The `Jenkinsfile` template uses a very basic pipeline structure with only three stages. You can customize it with as many stages as needed. The final pipeline structure is determined by the complexity of the project and the development guidelines you must follow. Now that you know the Node.js example, you know how to design a pipeline that automates each stage. For the purposes of this guide, the final pipeline should:
Build stage

- Create the two images; if an error occurs, abort any further testing or deployment.
- Notify the corresponding department in case of failure.

Test stage

- Execute the automated Mocha test suite.
- Publish the `nodeapp-dev` image for easy distribution and manual quality testing.
- Notify the corresponding department of the automated test results: success, unstable (any automated test failed), or failure of the whole stage.

Deploy stage

- Run only when the commit is on the `master` branch and the test stage completed successfully.
- Change the image tag before publishing.
- Deploy the dockerized application to Docker Hub.
- Save a compressed "golden" image for further distribution.

Report stage

- Save the `JUnit` file `reports.xml` for detailed analysis.
- Save the compressed `nodeapp-prod-golden.tar.gz` image to a persistent location.

Cleanup stage

- Stop all containers.
- Prune the system.
- Clean up the Jenkins workspace.
First edit the `Jenkinsfile` and paste the following pipeline into it. Replace `<DockerHub Username>` with your own information.
~/jenkins-guide/Jenkinsfile

```groovy
pipeline {
    environment {
        DOCKER = credentials('docker-hub')
    }
    agent any
    stages {
    // Building your Test Images
        stage('BUILD') {
            parallel {
                stage('Express Image') {
                    steps {
                        sh 'docker build -f express-image/Dockerfile \
                        -t nodeapp-dev:trunk .'
                    }
                }
                stage('Test-Unit Image') {
                    steps {
                        sh 'docker build -f test-image/Dockerfile \
                        -t test-image:latest .'
                    }
                }
            }
            post {
                failure {
                    echo 'This build has failed. See logs for details.'
                }
            }
        }
    // Performing Software Tests
        stage('TEST') {
            parallel {
                stage('Mocha Tests') {
                    steps {
                        sh 'docker run --name nodeapp-dev --network="bridge" -d \
                        -p 9000:9000 nodeapp-dev:trunk'
                        sh 'docker run --name test-image -v $PWD:/JUnit --network="bridge" \
                        --link=nodeapp-dev -d -p 9001:9000 \
                        test-image:latest'
                    }
                }
                stage('Quality Tests') {
                    steps {
                        sh 'docker login --username $DOCKER_USR --password $DOCKER_PSW'
                        sh 'docker tag nodeapp-dev:trunk
```
This complete Jenkinsfile is written using declarative syntax. If you read it carefully, you will notice that it describes the same process used during the manual deployment in the previous section. The rest of this section analyzes the Jenkinsfile in more detail.
The first block defines a globally available environment variable, `DOCKER`. You can tell it applies globally because it sits inside the pipeline block but outside the stages block. Next comes the `agent` statement; `any` means Jenkins can use any available (server) agent.
~/jenkins-guide/Jenkinsfile

```groovy
pipeline {
    environment {
        DOCKER = credentials('docker-hub')
    }
    agent any
```
The `DOCKER` definition is made through the credentials feature. This allows you to use confidential login information without including it in the Jenkinsfile. To configure this key pair, save a credential in Jenkins with the ID `docker-hub`. Once saved, the credential can be used anywhere in the pipeline.

In this example's pipeline, `DOCKER = credentials('docker-hub')` creates two environment variables, `DOCKER_USR` and `DOCKER_PSW`, which can be used to log in to your Docker Hub account.
The first thing you will notice about the `parallel` code block is that it is self-explanatory: it runs its sub-stages in parallel. This is very useful for building the two Docker images with the same shell commands used earlier. Each image is declared in its own step, each part of an independent stage.
~/jenkins-guide/Jenkinsfile

```groovy
// Building your Test Images
stage('BUILD') {
    parallel {
        stage('Express Image') {
            steps {
                sh 'docker build -f express-image/Dockerfile \
                -t nodeapp-dev:trunk .'
            }
        }
        stage('Test-Unit Image') {
            steps {
                sh 'docker build -f test-image/Dockerfile \
                -t test-image:latest .'
            }
        }
    }
    post {
        failure {
            echo 'This build has failed. See logs for details.'
        }
    }
}
```
After the parallel stages are closed, you encounter the `post` condition. `post` here means the definition applies to the entire `BUILD` stage. In this case, only the `failure` condition is set, so it runs only if any part of the `BUILD` stage fails. Configuring the different communication tools Jenkins provides is beyond the scope of this guide.
Parallel execution is also used during the test phase:
~/jenkins-guide/Jenkinsfile

```groovy
// Performing Software Tests
stage('TEST') {
    parallel {
        stage('Mocha Tests') {
            steps {
                sh 'docker run --name nodeapp-dev --network="bridge" -d \
                -p 9000:9000 nodeapp-dev:trunk'
                sh 'docker run --name test-image -v $PWD:/JUnit --network="bridge" \
                --link=nodeapp-dev -d -p 9001:9000 \
                test-image:latest'
            }
        }
        stage('Quality Tests') {
            steps {
                sh 'docker login --username $DOCKER_USR --password $DOCKER_PSW'
                sh 'docker tag nodeapp-dev:trunk
```
The `Mocha Tests` stage starts both images and executes the automated tests; the resulting `reports.xml` file is saved to the Jenkins workspace. The `Quality Tests` stage, on the other hand, publishes the `trunk` version of your application to Docker Hub. It first issues a Docker login command (using the predefined credentials), then changes the image tag and pushes it.
Again, you have a `post` code block, but this time it has notifications for successful completion, instability, and failure. Remember, you can run any code here, not just notifications.
A different type of block is introduced at this stage: `when`. As the name implies, this clause executes only when a certain condition is met. In this example, the code runs only when a change to the master branch is detected. Commits to other branches will not trigger this step of the pipeline.
Within the steps, you can optionally configure the `retry` and `timeout` parameters. Our example shows a nested usage: the image-building process has a timeout of 10 minutes, and up to three retries are performed if the timer expires.
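The deploy stage itself is cut off from the listing reproduced earlier, so as a hedged sketch, a `DEPLOY` stage combining `when`, `retry`, and `timeout` as just described might look like this (the image tags and the `<DockerHub Username>` placeholder are illustrative):

```groovy
stage('DEPLOY') {
    // Run only for commits on the master branch
    when {
        branch 'master'
    }
    steps {
        // Retry up to three times, aborting any attempt
        // that takes longer than 10 minutes
        retry(3) {
            timeout(time: 10, unit: 'MINUTES') {
                sh 'docker tag nodeapp-dev:trunk <DockerHub Username>/nodeapp-prod:latest'
                sh 'docker push <DockerHub Username>/nodeapp-prod:latest'
                sh 'docker save <DockerHub Username>/nodeapp-prod:latest | gzip > nodeapp-prod-golden.tar.gz'
            }
        }
    }
}
```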
The `post` block of this stage is designed to clean up in the event of a failure. No notifications are set for this stage.
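As an illustrative sketch (not the guide's exact code), such a failure-cleanup `post` block could look like this, reusing the container names from earlier in the guide:

```groovy
post {
    failure {
        // Stop the containers left over from the failed run
        sh 'docker stop nodeapp-dev test-image || true'
        // Remove unused Docker data
        sh 'docker system prune -f'
        // Clean the Jenkins workspace
        deleteDir()
    }
}
```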
The last two stages of the pipeline are comparatively simple. The `junit` step allows Jenkins to use the `reports.xml` file generated by your Mocha image, and the `archiveArtifacts` command saves the report and application files to a persistent location. By default, that location is `JENKINS_HOME/var/lib/jenkins/jobs/<REPOSITORY>/branches/master/builds/lastStableBuild`. If needed, you can configure a custom location in Jenkins's general settings.
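Based on that description, a sketch of such a reporting stage might look like this (the stage name is illustrative; the file names come from earlier in the guide):

```groovy
stage('REPORTS') {
    steps {
        // Publish the Mocha test results to Jenkins
        junit 'reports.xml'
        // Keep the report and the compressed golden image as build artifacts
        archiveArtifacts(artifacts: 'reports.xml', allowEmptyArchive: true)
        archiveArtifacts(artifacts: 'nodeapp-prod-golden.tar.gz', allowEmptyArchive: true)
    }
}
```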
It is time to commit the completed Jenkinsfile to the Jenkins server and trigger a run of the new pipeline. In order to test the `when` block discussed earlier, the changes will be pushed to a different branch.
The `DEPLOY` stage was skipped, which is expected.

You can set Jenkins to scan your repository periodically. To do this, click the gear icon on the Pipeline view again, then click Configuration. Among the many options, look for Scan Repository Triggers and check the "Periodically if not otherwise run" box. You can choose any interval; for this example, one minute is used.
So far, everything should work as expected without errors. But what happens when you encounter an error?
Edit `app.js` on the local workstation. In the server code, change the root address `/` to `/ERROR`. This will cause a 404 error (page not found) on the `express` server, so the test will fail.

~/jenkins-guide/app.js

```javascript
// Web Server
app.get('/ERROR', function (req, res) {
  res.json(os);
});
```
Revert the change to `app.js` and save the file. Next, an error will be raised in the `BUILD` stage.
Edit `express-image/package.json`. Change the name of the Express package to `express-ERROR` to simulate a typing error:

~/jenkins-guide/express-image/package.json

```json
"dependencies": {
  "express-ERROR": "^4.13.3"
}
```
Click the `BUILD` stage in the Pipeline view, then click the shell script to see the console output. Afterward, fix the error in `express-image/package.json`.
Merge the `trunk` branch into `master`. This will trigger a run of the entire pipeline, including the deploy stage:

```
git checkout master
git merge trunk
git push origin master
```
The Blue Ocean interface is still under development, which means that many aspects of Jenkins are not managed by the new interface. Here are some of the most common screens.
If you check the `master` branch, you will see a more detailed dashboard. From this view, you can see much useful information, such as logs, artifacts, changes, trends in test results, and so on.
This guide introduces the basic automated workflow of Jenkins and Blue Ocean, but there are many things you can do. Just to name a few possibilities:
The `post` section (or any other part) can benefit from useful built-in features such as email, Slack, or HipChat notifications. As usual, you decide what triggers a notification: a successful build, a failed build, a change, or a custom condition.

You can also use different `agents` for different `stages`, for example one for database tasks, one for compiling code, one for webapp updates, and so on.

For additional information on this topic, you may want to refer to the following resources. Although these are provided in the hope that they are useful, please note that we cannot guarantee the accuracy or timeliness of externally hosted materials.
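As one example, a notification hook in a `post` block might be sketched as follows; this assumes an SMTP server is configured in Jenkins, and the address and wording are illustrative:

```groovy
post {
    failure {
        // Email the team when the build fails
        // (requires Jenkins's email settings to be configured)
        mail to: 'team@example.com',
             subject: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
             body: "See ${env.BUILD_URL} for details."
    }
}
```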
Reference: "https://www.linode.com/docs/development/ci/automate-builds-with-jenkins-on-ubuntu/"