tl;dr: When you are using Gerrit and Jenkins on the same machine, know what you're doing!
In a recent project we decided to increase code quality by introducing Gerrit as a code review tool.
The configuration looks as follows:
In addition to a colleague reviewing the patchset, we created a dedicated Jenkins job which verifies the patchset by building the project with the usual Maven build configuration “mvn clean install” on the same machine. Only when both the reviewer and the CI server accept the patchset is it merged into our Git repository.
After a successful merge of the patchset, another Jenkins job is triggered for deployment purposes.
That job is, not surprisingly, configured with “mvn clean install -U”, meaning Jenkins cleans the working directory and builds the project using the newest snapshots and/or releases.
In the last few days we encountered a problem with our setup. Surprisingly, projects failed to build, the unexpected reason being incorrect usage of code in an artifact that had not been changed in the meantime. There were changesets in Gerrit, but since they had not been reviewed and merged yet, they should not have been in the artifact used by other projects.
So what's going on here?
Analysing the setup, we came across the usage of Maven's “-U” parameter. The manual says:
Forces a check for updated releases and snapshots on remote repositories
At first glance this seems to be exactly what we want our Jenkins job to do: check for the newest dependencies before building the project and deploying it into our repository. But in combination with the Gerrit Jenkins job running on the same server, which verifies every patchset pushed to Gerrit, we introduced an epic flaw.
Maven's install plugin puts every built artifact into the local repository, which by definition is then the newest artifact you can get. So every project using this dependency will take that artifact, even when configured with the “-U” parameter, which only checks whether the artifact in the remote repository is newer. The attentive reader already knows why it is not.
So what's the solution?
There are three possibilities to overcome the flaw:
Of course you may use dedicated servers for Jenkins and Gerrit. Not sharing the local repository avoids getting into trouble with artifacts that are temporary and not ready for public use.
However, not only the costs but also the higher administrative effort might be a reason to look for other solutions.
Maven ships the goal dependency:purge-local-repository within the maven-dependency-plugin, allowing you to remove all of a project's dependencies from the local Maven repository. Configured in the process-sources phase, it would solve the problem in our case. That solution more or less protects your project from using dirty artifacts.
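Bound to the process-sources phase in the POM, the configuration could look roughly like this (a sketch; the plugin version and any include/exclude fine-tuning are omitted and should be adjusted to your build):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>purge-local-dependencies</id>
      <!-- Run early so stale local copies are gone before compilation -->
      <phase>process-sources</phase>
      <goals>
        <goal>purge-local-repository</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Note that purging forces re-resolution of the dependencies on every build, which costs build time.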
However, this unfortunately removes the symptoms, not the cause.
There is another solution which is easier than you might think: just configure the Gerrit Jenkins job with “mvn clean package”. This is what we actually want that job to do anyway. It verifies the patchset by building the project without putting that temporary, half-baked version of the artifact into the local repository.
Don't forget to clean up the local repository once when you switch from 'install' to 'package', as there might still be an unwanted version of the artifact in it.
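That one-time cleanup can be as simple as deleting the artifact's directory from the local repository; the coordinates below are just an example, substitute the path matching your own groupId/artifactId:

```shell
# Remove the locally installed, unreviewed artifact once;
# "mvn clean package" will not re-install it afterwards.
# com/example/myproject is a placeholder for your groupId/artifactId path.
rm -rf ~/.m2/repository/com/example/myproject
```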
Let me point out the conclusion in three simple bullet points:
- Know your artifact lifecycle and its relevance as dependency
- Be careful with different tools running on the same machine sharing resources
- Use Gerrit! Despite our configuration fail, it definitely increased our code quality and spread knowledge of the codebase across our team
Did you have similar problems with this setup? Or other solutions? Don't hesitate to share your experience in the comments.