
Is Jenkins still the only choice in 2019? What about Gitlab CI?

As we’ve shared in our previous DevOps posts, we mainly use Jenkins for our common CI and CD tasks. Jenkins is still the industry standard nowadays; there are heaps of resources, tutorials and Stack Overflow threads about (almost) every conceivable issue. I wrote “almost” back there, I know. Well, sometimes you have to dive into Java code and figure out what the hell that XYZ plugin actually does. And things can get messy. But enough of the complaints.

So why are we experimenting with another CI / CD tool when we are pretty happy with Jenkins? Because that’s the nature of all technology enthusiasts! That’s what moves the world forward! And this is who we are – app development enthusiasts.

A bit of history

Our Jenkins toolchain is built around a shared Jenkins library with thousands of lines of Groovy code. The main idea is to write the parameterized pipeline once and then just call it from a Jenkinsfile like this:

PipelineNodejs{  
  projectName = 'project123'  
  slackChannel = '#ci-project123'  
  appName = 'api'  
  cloudProject = [development: 'project1', master: 'project3', stage: 'project2']  
  buildCommand = 'npm install'  
  
  nodeImage = 'node:8.3.0'  
  nodeEnv = "-e NODE_PATH=./app:./config"  
  nodeTestEnv = '-e NODE_ENV=test -e NODE_PATH=./app:./config'  
  namespace = [development: "${projectName}-development", stage: 'stage', master: 'production']  
}  

That’s a really great way to not repeat yourself. But the codebase of each project served by this pipeline keeps evolving, and it evolves unevenly, so at a certain point you’ll be implementing new features which must not endanger processes in the less progressive apps. This triggers a never-ending if/else purgatory for the sake of backward compatibility.

Also, Jenkins does not natively support specifying pipelines for Merge Requests in the same file where you have your delivery / deployment recipe. We bridge this gap with Merge Request builders which handle Merge Requests in separate Jenkins jobs. It definitely does the job, but it feels somewhat broken.

Now, let’s check some stuff I really love about Gitlab CI!

Pipeline as code

The previous stuff might look like pipeline as code. Well, not really in my mind. When I think about pipeline as code, I think about something that shows the full logic at first sight. This ultimately means that the pipeline specification belongs in the repository, right next to the main application code!

When you have such a pipeline specification, you can fine-tune each project separately. Does the new project require some special testing methodology? Implement it straight in the repository. Is someone challenging the overall pipeline process? Review together only the stuff relevant to that particular project, not an ocean of “if this is a 10-year-old version of Node.js, then ...”

Gitlab CI can effectively handle multiple kinds of pipelines for different events. In one single file. For example, you can specify a set of jobs for push events which handles the delivery of your app, and you can also specify a different set of jobs for Merge Requests.

test:mr:  
  image: node:10.14.2  
  script:  
    - npm install  
    - npm run ci-test  
  only: ["merge_requests"]  
  
build:  
  image: docker:18.09.1  
  script:  
    - docker build -t my-image .  
  only:  
    variables:  
      - $CI_PIPELINE_SOURCE == "push"  
    refs: ["master", "stage", "development"]  

Again, everything happens in the same file: .gitlab-ci.yml. You can also extend your pipeline with external content so you can keep it nice and tidy and reuse some parts across several projects.
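Here is a minimal sketch of how such reuse could look with include and extends – the project path, file name and job names below are made up for the sake of the example:

include:
  # shared job templates kept in a separate repository (hypothetical path)
  - project: 'our-group/ci-templates'
    file: '/templates/deploy.yml'

deploy:production:
  # reuse a hidden job template defined in the included file
  extends: .deploy_template
  only:
    refs: ["master"]

The nice part is that the project-specific .gitlab-ci.yml stays tiny, while the shared templates can still be overridden per project wherever it makes sense.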

Support of container-native CI / CD

Before I start with the container-native stuff, I'd like to ask you a question. Do you want to execute all your applications (written in different programming languages!) from different eras in the same environment?

Of course not.

Let’s face it, such stuff tends to interfere, and that’s not good. You want to confine your application to a unique environment which perfectly fits your expectations. The exact Node.js version, the exact glibc version, whatever. Also, you want your environment to be as disposable as possible so that older builds won’t mess up future builds with weirdly cached modules and so on. And this is something which is nowadays perfectly doable with containerization.

And guess what, Gitlab CI handles container-native workflows out of the box, hassle free!
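To illustrate the idea (the job names and versions below are arbitrary), every job simply declares the image it runs in, so a legacy project and a brand new one can live side by side without sharing anything:

test:legacy-api:
  # the old project keeps its exact, pinned runtime
  image: node:8.3.0
  script:
    - npm install
    - npm test

test:new-service:
  # the new project gets its own fresh, disposable container
  image: node:10.14.2
  script:
    - npm install
    - npm test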

We can go even further: containerization opens endless possibilities when it comes to extending the main functionality with plugins. You can write whatever you want in any language, wrap it in a Docker image and that’s it. Well, it’s not the conventional view of plugins, but on the other hand this approach can’t break your CI / CD tool and you don’t have to study super-complicated APIs.

During my experiments I've made a few “plugins”. Helmer-gke handles GKE deployments with Helm, docker-gcr builds Docker images with the main application code and uploads them to the Google Container Registry, and aglio-uploader renders and uploads API documentation. I must say that the implementation of these bits was really fast and enjoyable! Also, the implemented functionality can be tested immediately, even without integrating it into the final pipeline.
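To give an idea of how such a “plugin” is consumed, here is a simplified sketch – the registry path and the command-line interface are invented for this example, not the real interface of helmer-gke:

deploy:production:
  # hypothetical image path and CLI of the deployment "plugin"
  image: eu.gcr.io/our-tools/helmer-gke:latest
  script:
    - helmer-gke deploy --release my-app --namespace production
  only:
    refs: ["master"]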

Integration

And last but not least, CI / CD tools must be user-friendly. Personally, I don't see this as an important criterion; however, I'm not the main consumer of CI / CD tools. These tools are mainly for developers and release managers. Also, believe it or not, an excellent developer does not have to be an excellent Linux/Unix superuser and automation guru. Hence CI / CD tools must look good and offer good observability for all mankind.

And this is a topic where no one can really beat Gitlab CI. When you work with Gitlab / Gitlab CI, it’s like having the control panel of the whole universe in front of you.

[Screenshot: Gitlab pipeline]
[Screenshot: Gitlab stages overview]
[Screenshot: Gitlab production environment]
[Screenshot: Gitlab merge requests]

All the points above represent the current state of Gitlab CI. It's a really enjoyable experience, even though Gitlab CI does not have all the functionality we use in good ol’ Jenkins. For example, it can’t draw nice coverage or unit test graphs. Also, it causes some cultural issues since… you know, Gitlab CI is not Jenkins, hence it does not solve the same things the same way (no offence).

Personally, I strongly believe that the tools you work with should always bring joy to your life (i.e. working with them should not resemble Stockholm syndrome). And that's Gitlab CI. I'm always like "damn, it's so easy here!"

Is Jenkins still the only choice?

If I had been asked the same question five or ten years ago, I would have answered: yes, it is; the other tools don't provide the features we need and they also lack the community.

But today it's a totally different story. There's a handful of tools with a gentle learning curve which can cover all your needs. And Gitlab CI really excels here since it supports all three major platforms (Linux, macOS and Windows), has detailed documentation and is dead simple.

I’m not saying that Gitlab CI will replace Jenkins here at Ackee, however it will definitely become a subject of deeper exploration! And I will be the main explorer. Gladly.


Štěpán Vraný
DevOps Engineer
