
CI/CD with Jenkins pipeline on Google Kubernetes Engine

We have almost 50 developers working with different app development technologies to create, test and ship apps for our demanding clients. With multiple git pushes and merge requests every hour, we need a fast and optimized flow. Automating CI/CD across multiple technologies, clouds and other deployment targets is critical for us. Here is how we use the powerful Jenkins Pipeline together with a Shared Pipeline Library, Jenkins and Gitlab.

Our backend runs mainly on a Kubernetes cluster as NodeJS, mongodb, mysql and some php containers. Frontends built with react, middleman and other technologies need to be tested quickly and deployed to dedicated webservers, Google Storage Buckets or ftp webhosting, depending on the client's needs. The Android and iOS teams need fast and reliable CI/CD as well, and all teams need a way to build and test merge requests. On top of that, there is always something in the infrastructure that needs to be automated, so having a scripted automation platform for the DevOps team is a number one priority.

Jenkins Pipeline

Some time ago, we used Jenkins freestyle jobs to check out a repo, build it with simple bash snippets in the console and deploy it to some other server with SSH, rsync or something similar.

Then Jenkins Pipeline was released. Jenkins Pipeline is a suite of plugins which supports implementing and integrating continuous delivery pipelines into Jenkins - an alternative to the old freestyle jobs.

With Jenkins Pipeline, all you need to do is create a Pipeline or Multibranch Pipeline job and specify the Pipeline as code. The code can be part of the job definition or live directly as a Jenkinsfile in the project's repository.

Example Jenkinsfile:

#!groovy  
  
node('nodejs') {  
  
    currentBuild.result = "SUCCESS"  
  
    try {  
  
       stage('Checkout'){  
          checkout scm  
       }  
  
       stage('Build'){  
          sh 'npm install'  
       }  
  
       stage('Test'){  
         env.NODE_ENV = "test"  
         sh 'npm run ci-test'  
       }  
  
       stage('Deploy'){  
         sh 'docker build -t myapp . && docker run -d myapp'  
       }  
  
       stage('Cleanup'){  
         sh 'npm prune'  
         sh 'rm node_modules -rf'  
         slackNotify channel: '#ci-nodejs', message: 'pipeline successful: myapp'  
       }  
    }  
    catch (err) {  
  
        currentBuild.result = "FAILURE"  
        slackNotify channel: '#ci-nodejs', message: 'pipeline failed: myapp'  
        throw err  
    }  
  
}

The pipeline has several stages, which are then nicely displayed thanks to the stage visualization plugin. What the pipeline does is: build the node app with npm (sh means a shell call), test it, build a docker image and run it, clean up the workspace and notify Slack about the result. slackNotify is a Slack plugin call (plugins need to support the Pipeline syntax in order to be callable as a nice function with a Map of parameters).

(Screenshot: Jenkins pipeline stage view)
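By the way, these plugin steps are plain Groovy function calls that collect named parameters into a Map. A quick sketch using the official Slack plugin's slackSend step (slackNotify in our example above follows the same pattern):

slackSend channel: '#ci-nodejs', color: 'good', message: 'pipeline successful: myapp'

// the named parameters are just sugar for an explicit Map:
slackSend([channel: '#ci-nodejs', color: 'good', message: 'pipeline successful: myapp'])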

Jenkins Pipeline is very powerful - you can program it to do pretty much anything you want. And since it uses the Groovy language, you can use Java libraries in your pipeline.
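For example (just a sketch - classes like java.text.SimpleDateFormat may need script approval when the Groovy sandbox is enabled), you can call a plain Java class directly from a Pipeline step:

node {
    stage('Tag build') {
        // Groovy runs on the JVM, so any Java class available to Jenkins can be used
        def stamp = new java.text.SimpleDateFormat('yyyyMMdd-HHmmss').format(new java.util.Date())
        currentBuild.displayName = "myapp-${stamp}"
        echo "Tagged build as myapp-${stamp}"
    }
}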

Shared Pipeline Library

As we work on a lot of projects, we would end up with a lot of similar code or even the exact same Jenkinsfile across all the nodejs and other repositories. Imagine that a simple change in the slackNotify function call syntax, or in the syntax of some cli client, would mean changing the pipeline code in all repositories and branches! This actually happened recently with the gcloud cli client - they deprecated the old syntax and changed gcloud docker push to gcloud docker -- push.
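With a shared library, such a call is wrapped in a single function, so when Google changed the CLI only one line in one repository had to change. A rough sketch (the helper name gcloudDockerPush is made up for illustration):

// a shared library step wrapping the gcloud call (hypothetical helper gcloudDockerPush)
def call(String image) {
    // old, deprecated syntax: sh "gcloud docker push ${image}"
    sh "gcloud docker -- push ${image}"
}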

For our use case, we don't want to have the pipeline logic in the project repositories. The only thing we want to keep in the repository is the pipeline configuration.

This can be easily done using a Jenkins Shared Pipeline Library - a Groovy git repository with the following folder structure:

(root)
+- src                     # Groovy source files
|   +- org
|       +- foo
|           +- Bar.groovy  # for org.foo.Bar class
+- vars
|   +- foo.groovy          # for global 'foo' variable
|   +- foo.txt             # help for 'foo' variable
+- resources               # resource files (external libraries only)
|   +- org
|       +- foo
|           +- bar.json    # static helper data for org.foo.Bar
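Every file under vars/ becomes a global step that any Jenkinsfile loading the library can call directly. A minimal sketch of vars/foo.groovy from the tree above (the body is made up):

// vars/foo.groovy - exposes a global 'foo' step
def call(String name = 'world') {
    echo "Hello, ${name}!"
}

// in a Jenkinsfile it is then simply called as:
// foo('Jenkins')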

This shared pipeline repo is checked out every time a Pipeline job starts and you can load and use it in your Jenkinsfile, even specifying a branch:

@Library('my-shared-library@development') _

Or load it implicitly (this can be set in the Jenkins global configuration).

Configuration Jenkinsfile

Using the Shared Pipeline Library, the Jenkinsfile in the project's repository can look like this:

PipelineNodejs{  
  
  // MODIFY  
  projectName = 'node-template'  
  slackChannel = '#ci-nodejs'  
  appName = 'api' // microservice name, unique in project  
  cloudProject = [development: 'kube-dev-cluster', master: 'kube-prod-cluster']  
  buildCommand = 'npm install && npm run postinstall'  
  
  // MODIFY ONLY IF YOU KNOW WHAT YOU ARE DOING  
  nodeImage = 'node:5.12.0'  
  nodeEnv = "-e NODE_PATH=./app:./config"  
  nodeTestEnv = '-e NODE_ENV=test -e NODE_PATH=./app:./config'  
  namespace = [development: "${projectName}-development", master: "${projectName}-master"]  
}  

Pipeline example

The Jenkinsfile in the previous section calls the function PipelineNodejs, a global step defined in the Shared Pipeline Library in vars/PipelineNodejs.groovy. A small excerpt of its logic:

def call(body) {  
  // evaluate the body block, and collect configuration into the object  
  def config = [:]  
  body.resolveStrategy = Closure.DELEGATE_FIRST  
  body.delegate = config  
  body()  
  
  properties([disableConcurrentBuilds()])  
  def agent = config.agent ?: 'nodejs'  
  
  node(agent) {  
  
    def workspace = pwd()  
  
    ... code omitted for brevity ...  
  
    stage('Test') {  
      if (config.runTests){  
  
        docker.image(config.nodeImage).inside(config.nodeTestEnv) {  
          sh "npm run ci-test"  
        }  
        echo "npm run ci-test finished. currentBuild.result=${currentBuild.result}"  
  
        ... code omitted for brevity ...  
  
        if (currentBuild.result == 'UNSTABLE') {  
          throw new RuntimeException("Tests failed")  
        }  
      }  
      else {  
        echo 'Tests skipped'  
      }  
... code omitted for brevity ...  

Check out the other pipelines for Android, iOS, React, Middleman and others in our shared pipeline library repository (linked below).

Merge request builder

For the merge request flow with Gitlab, we can make a pipeline job that uses the Gitlab integration plugin. The same can also be done for PRs in Github.

An example merge request builder job for iOS looks like this:

env.CHANGELOG_PATH = "outputs/changelog.txt"  
env.SLACK_CHANNEL = "ci-merge-requests"  
env.FASTLANE_SKIP_UPDATE_CHECK = 1  
env.FASTLANE_DISABLE_COLORS = 1  
env.CHANGELOG = ""  
  
node('ios') {  
      
   // notifyBuild() expects a reason string, filled in when something fails
   def reason = ''

   try {
         
       gitlabBuilds(builds: ["carthage", "pods", "test", "build ipa"]) {  
             
            def gemfileExists = fileExists 'Gemfile'  
              
            fastlane = "fastlane"  
  
            if (gemfileExists) {  
                fastlane = "bundle exec fastlane"  
            }  
              
            stage('Checkout') {  
                println env.dump()  
                withCredentials([string(credentialsId: 'jenkins-gitlab-credentials', variable: 'credentials')]) {  
                  checkout changelog: true,   
                           poll: true,   
                           scm: [$class: 'GitSCM',   
                                  branches: [[name: "origin/${env.gitlabSourceBranch}"]],   
                                  doGenerateSubmoduleConfigurations: false,   
                                  extensions: [[$class: 'WipeWorkspace'],   
                                  [$class: 'PreBuildMerge',   
                                    options: [fastForwardMode: 'FF',   
                                              mergeRemote: 'origin',   
                                              mergeStrategy: 'default',   
                                              mergeTarget: "${env.gitlabTargetBranch}"]  
                                  ]  
                                 ],   
                           submoduleCfg: [],   
                           userRemoteConfigs: [[name: 'origin', credentialsId: credentials, url: env.gitlabSourceRepoSshUrl ]]]  
                }      
            }  
      
            stage('Prepare') {  
                sh("security unlock -p ${MACHINE_PASSWORD} ~/Library/Keychains/login.keychain")  
                  
                if (gemfileExists) {   
                    sh "bundle install --path ~/.bundle"   
                }  
            }  
          
            gitlabCommitStatus("carthage") {  
                stage('Carthage') {  
                    sh(fastlane + " cart")  
                }  
            }  
             
            gitlabCommitStatus("pods") {  
                stage('Pods') {  
                    sh(fastlane + " pods")  
                }  
            }  
              
            gitlabCommitStatus("test") {  
                stage('Test') {  
                    sh(fastlane + ' test type:unit')  
                    junit allowEmptyResults: true, testResults: 'fastlane/test_output/report.junit'      
                }  
            }  
              
            gitlabCommitStatus("build ipa") {     
                stage('Build IPA') {  
                    sh(fastlane + " beta")  
                }  
            }     
              
        }      
        currentBuild.result = 'SUCCESS'  
    }  
    catch (e) {
        currentBuild.result = "FAILURE"
        reason = e.toString()  // keep the failure reason for the notification below
        throw e
    }
    finally {
        notifyBuild(currentBuild.result, reason)
    }
}  

Mind the gitlabBuilds and gitlabCommitStatus steps - they visualize the Pipeline stages in Gitlab and allow or forbid merging the MR. This one job can be used globally for all iOS merge requests: set up a webhook from your repo to this Jenkins job and you are all done.

The important (and ugly) part is the scm checkout step, which merges the source and target branch. The env data (env.gitlabSourceBranch and env.gitlabTargetBranch) are parsed from the Gitlab webhook.

You can easily set up rebuilding of the merge request when a new push is made to the source or target branch. Another cool feature is rerunning the pipeline on a comment in the MR (we use the phrase rebuild pls and Jenkins automatically rebuilds the job!).

Job DSL and seed jobs

Thanks to the Job DSL plugin, we can dynamically generate Jenkins Pipeline jobs via the Job DSL API.

A job that generates other jobs is called a seed job. An example of generating the iOS merge request job:

String scriptPath = "jobs/gitlab"  
String jobSuffix = "merge-request-builder"  
String mCommentTrigger = "rebuild pls"  
  
def gitlabOn = {  
  it / 'properties' / 'com.dabsquared.gitlabjenkins.connection.GitLabConnectionProperty' {  
    'gitLabConnection'('gitlab')  
  }  
}  
  
def platform = 'ios'  
def jobName = "$platform-$jobSuffix"  
  
pipelineJob(jobName) {  
  concurrentBuild(false)  
  configure gitlabOn  
  definition {  
    cps {  
      script(readFileFromWorkspace("${scriptPath}/${jobName}.groovy"))  
      sandbox()  
    }  
  }  
  
  triggers {  
    gitlabPush {  
      buildOnMergeRequestEvents(true)  
      buildOnPushEvents(false)  
      enableCiSkip(true)  
      setBuildDescription(true)  
      commentTrigger(mCommentTrigger)  
      rebuildOpenMergeRequest('both')  
      skipWorkInProgressMergeRequest(false)  
    }  
  }  
}  

Testing the production pipeline

Now we sort of need to build, test and deploy our pipeline. CI/CD for the CI/CD Pipeline. Yo dawg, I heard you like Inception.

Place a simple Jenkinsfile in the Shared Pipeline Library itself and use it to seed the jobs from the previous section.

// use a Pipeline class src/cz/ackee/Pipeline.groovy rather than the global func  
def pipeline = new cz.ackee.Pipeline()  
  
node {  
  
  properties([  
    disableConcurrentBuilds()  
  ])  
  
  pipeline.checkoutScm()  
  
  // set additional envvars and config  
  pipeline.setEnv()  
  
  stage('seed jobs') {  
    withCredentials([string(credentialsId: 'jenkins-gitlab-credentials', variable: 'gitlabCredentials')]) {  
      jobDsl targets: 'jobs/**/seed.groovy',  
             additionalParameters: [credentials: gitlabCredentials]  
    }  
  }  
  
  stage('test') {  
    // run basic tests on pipeline like mysql reporting, slack, gcloud, kubectl and gitlab integrations checking    
    pipeline.envTest()  
   
    // test Node.js Pipeline  
    def obj = build job: '../node-template/master/', propagate: false  
    if(obj.result == 'FAILURE') throw new RuntimeException("Node.js Pipeline test failed.")  
  
    // test React Pipeline  
    obj = build job: '../react-template/master', propagate: false  
    if(obj.result == 'FAILURE') throw new RuntimeException("React Pipeline test failed.")  
  
    // test Middleman Pipeline  
    obj = build job: '../middleman-template/master', propagate: false  
    if(obj.result == 'FAILURE') throw new RuntimeException("Middleman Pipeline test failed.")  
  
   }  
  
  stage('lint') {  
    //TODO: add groovy lint  
  }  
}  

Other fun stuff you could do with the Shared Pipeline Library

Anything you want. Really.

Links

Check out the other pipelines for Nodejs, Android, iOS, React, Middleman and others:

https://github.com/AckeeDevOps/jenkins-pipeline-library


Marek Bartík
DevOps Engineer
