Category Archives: Jenkins

Jenkins Gitlab Hook Plugin reorganized

The Jenkins Gitlab Hook Plugin has received a major refactoring. The goal was to separate the concerns of the existing modules and to make the project testable. The Github repo now contains the Java binaries needed to run the rspec tests, but hopefully you’ll find the new organisation a bit more intent revealing and easier to follow.

I’ve used a use case approach and extracted the related services, so all the domain knowledge is now contained within the models subfolders:

the_new_structure

The remaining models in the root models folder are all directly Jenkins related and are left there so that Jenkins can load them first and register the plugin and the related web hook correctly.

The entire domain knowledge is now also testable. I chose rspec to run the tests and created the related spec helper that loads all the Java dependencies and the models from the root folder. To run the specs, you’ll need to set up JRuby so it runs in Ruby 1.9 compatibility mode. Just add the following switches to your JRUBY_OPTS environment variable: --1.9 -Xcext.enabled=true -X+O.

The v1.0.0 release has all the goodies, so feel free to upgrade your Jenkins environments.


Jenkins Gitlab Hook plugin updates

There have been a few changes to Jenkins and the related Ruby runtime as of late. These have caused a few issues with the Gitlab Jenkins Hook plugin, which have finally been resolved.

It is recommended that you upgrade the plugin to the latest version v0.12.2, and Jenkins to the latest available version if possible. Otherwise, I would stay away from Jenkins v1.519 to v1.521 and plugin versions v0.2.7 to v0.2.11. If you are not experiencing any issues at the moment, you are fine as you are; this applies only to those who want to upgrade some part of the system for whatever reason.

Also, the upgrade is recommended if you have any of these symptoms:

  • Failed to load HAML message – a problem with the Ruby Runtime on Windows, details in issue #9
  • Failed to install the plugin – a problem with the Ruby Runtime and Jenkins v1.519 and v1.520, details in issues #10, #11, #12 and #13
  • Undefined method ‘getDefaultParametersValues’ – the method went private in Jenkins, details in issue #14
  • Build no longer triggering – the plugin was not building non-parametrized Jenkins projects, details in issue #15
  • Case insensitive repo URL matching

Parametrized Jenkins releases from non master branches

Related to the Jenkins & Git branches post: if you want to make that setup play nicely with the M2 Release plugin, just configure the Jenkins project to check out / merge the code to a local branch that has the same name as the branch currently being built, like this:

checkout_to_branch

This will enable the release plugin to work even with non-master branches.

You can then start the release build as usual:

m2 release with parameters


NodeJS Jenkins Integration using Maven

Jenkins projects at my current workplace are really heterogeneous: we’re using TFS and Git for SCM, building on CentOS and Windows alike, with everything from Java to .Net projects. Lately, NodeJS was added to the mix. There is a variety of approaches for building NodeJS projects in Jenkins out there, but none of them really fit our case. Here is why:

  • Packaged application must be published to local Artifactory
    • This is part of our regular CI flow
    • All artifacts must reside here
  • No package managers allowed on production servers
    • When deploying, the entire application should be deployable as is, one package picked up from Artifactory
    • OS dependencies are already present during application deploy, all others should be contained in the application itself
  • Managing private NodeJS dependencies
    • Java / Maven projects already natively support this, so the idea was to keep the flow as close to this one as possible
    • When Artifactory starts supporting NPM repositories, this might change
  • Must use the Maven Release plugin to, well, perform releases
    • On Jenkins, the idea is to release new versions using this plugin

Some of these are imposed by our own workflow, which may not fit yours, but it seems to me that the final solution is good and applicable to other ecosystems as well.

Since Maven was needed in the flow, both for managing dependencies and for performing releases from Jenkins, we had to reconcile the Maven project definition in pom.xml with the NodeJS structure. The pom had to support the following actions (within the Maven life-cycle, and on Jenkins too):

  • Install test and run-time dependencies
    • They are needed on the developer or Jenkins build machines
    • And are installed into the project’s node_modules folder
  • Install private dependencies
    • From other in-house projects
  • Create packages for deployment
    • To be published to Artifactory from Jenkins
  • Run tests
    • In a Jenkins-readable format

Installing the test and run-time dependencies is performed within an Ant task:

<execution>
  <id>compile</id>
  <phase>compile</phase>
  <configuration>
    <tasks>
      <echo message="========== installing public dependencies ===================" />
      <exec executable="npm" dir="${project.basedir}" failonerror="true">
        <arg value="install" />
      </exec>
    </tasks>
  </configuration>
  <goals>
    <goal>run</goal>
  </goals>
</execution>

So essentially, “npm install” is executed. At this time, using npm is OK since developer machines or CI build machines have internet access and can pull all those dependencies easily.
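
For context, such an <execution> element does not live on its own in the pom: it is assumed to sit inside the Maven AntRun Plugin declaration (the plugin whose run goal executes the Ant tasks), roughly like this (the version is only an example, use whichever you normally do):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.7</version>
  <executions>
    <!-- the "compile" execution from above goes here;
         the "test" execution shown later in the post is added alongside it -->
  </executions>
</plugin>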

Private dependencies are a bit different. They reside in Artifactory and can’t be npm installed. Hence, we need to download them from Artifactory and unpack them into the same folder as the other dependencies. An Ant task could be used for that, but Maven already has an appropriate Maven Dependency Plugin, which we’ll use here:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <version>2.6</version>
  <executions>
    <execution>
      <id>unpack</id>
      <phase>compile</phase>
      <goals>
        <goal>unpack-dependencies</goal>
      </goals>
      <configuration>
        <outputDirectory>${project.basedir}/node_modules</outputDirectory>
        <overWriteReleases>false</overWriteReleases>
        <overWriteSnapshots>true</overWriteSnapshots>
        <useSubDirectoryPerArtifact>true</useSubDirectoryPerArtifact>
        <includeGroupIds>org.acme.test</includeGroupIds>
        <stripVersion>true</stripVersion>
      </configuration>
    </execution>
  </executions>
</plugin>
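
One thing worth noting: unpack-dependencies only processes artifacts that are declared as dependencies of the project, so each private module also needs a regular entry in the pom’s dependencies section. A hypothetical example, with made up coordinates and with the type matching whatever you publish to Artifactory (e.g. the zip produced in the next step):

<dependencies>
  <!-- a private NodeJS module published to Artifactory; coordinates are illustrative -->
  <dependency>
    <groupId>org.acme.test</groupId>
    <artifactId>acme-node-commons</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <type>zip</type>
  </dependency>
</dependencies>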

Packaging the project to be published on Artifactory is actually very easy for NodeJS. It consists of archiving all the code, and that’s it 🙂 Here, node_modules is already filled with the run-time and private dependencies (thanks to the previous steps) and is referenced in the application code, so we could just pack the entire workspace and be done with it. Still, shipping all those files seemed a bit odd, so in the end I filtered out the folders that are not needed. The Maven Assembly Plugin is used for that, and the task looks like this:

<!-- assembly execution in pom.xml -->
<execution>
  <id>make-zip-assembly</id>
  <phase>package</phase>
  <goals>
    <goal>single</goal>
  </goals>
  <configuration>
    <finalName>${project.name}-${project.version}</finalName>
    <appendAssemblyId>false</appendAssemblyId>
    <descriptors>
      <descriptor>assembly-zip.xml</descriptor>
    </descriptors>
  </configuration>
</execution>

<!-- assembly definition in assembly-zip.xml -->
<assembly>
  <id>zip</id>
  <formats>
    <format>zip</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
      <fileSet>
        <directory>.</directory>
        <outputDirectory>/</outputDirectory>
        <excludes>
          <exclude>test/**</exclude>
          <exclude>target/**</exclude>
          <exclude>assembly*.xml</exclude>
          <exclude>pom.xml</exclude>
        </excludes>
      </fileSet>
  </fileSets>
</assembly>
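
As with the Ant executions, the make-zip-assembly execution above is assumed to be declared inside the Maven Assembly Plugin in the pom, along these lines (again, the version is only indicative):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.4</version>
  <executions>
    <!-- the "make-zip-assembly" execution from above goes here -->
  </executions>
</plugin>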

The last step is to run the tests. The tests should also be JUnit compatible, so they can be published on Jenkins and so that Jenkins can fail the build if they don’t pass. For this example I’ve been using nodeunit, but this is easily applicable to any other testing framework you might be using. The only requirement is that the framework exports JUnit compatible results. Nodeunit supports this by passing in the “--reporter junit” parameter. Ant is used again:

<execution>
  <id>test</id>
  <phase>test</phase>
  <configuration>
    <tasks>
      <echo message="========== running tests with JUnit compatible results ===================" />
      <exec executable="nodeunit/bin/nodeunit" dir="${project.basedir}" failonerror="false">
        <arg value="--reporter" />
        <arg value="junit" />
        <arg value="test/" />
        <arg value="--output" />
        <arg value="target/failsafe-reports" />
      </exec>
    </tasks>
  </configuration>
  <goals>
      <goal>run</goal>
  </goals>
</execution>

One more trick was needed in this Ant task to make the tests visible to Jenkins: the output folder for the test results is “target/failsafe-reports”. If you don’t put them there, Jenkins will not pick the results up.

This completes the Maven pom.xml configuration, and the project is ready to be built on Jenkins. There, you just need to create a Maven build project and set it up as usual. The setup should include publishing the created artifacts to Artifactory (a plain-Maven alternative is sketched below the screenshots), Maven should perform “clean install” or a similar goal that includes tests, and of course the usual SCM (Git or whatever you are using) details. As you can see below, tests are executed and recognized, and releasing from Jenkins also works:

NodeJS Jenkins build history NodeJS Jenkins test history
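
On the publishing side, the Jenkins Artifactory integration can handle the deploy, but the plain Maven route works just as well: a distributionManagement section plus the deploy goal. The ids and URLs below are placeholders and should match the server entries in your settings.xml:

<distributionManagement>
  <!-- placeholder ids and URLs, adjust to your Artifactory instance -->
  <repository>
    <id>artifactory-releases</id>
    <url>http://artifactory.example.com/artifactory/libs-release-local</url>
  </repository>
  <snapshotRepository>
    <id>artifactory-snapshots</id>
    <url>http://artifactory.example.com/artifactory/libs-snapshot-local</url>
  </snapshotRepository>
</distributionManagement>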

I’ve described only the main / deployable project configuration, but common dependency projects can be configured in the same manner. All you need to take care of is the versioning of the artifacts, since standard NodeJS versioning has nothing to do with the way Maven handles it. I guess one could also tweak the NodeJS project version so it is read from pom.xml, so everything stays DRY. But that is a tale for some other time.

Complete pom.xml configurations for the deployable and dependency projects can be found here and here, so feel free to use them in your projects. That’s it folks 🙂


Jenkins & Git branches

Jenkins CI is a well known open source continuous integration server, and a damn good one in my opinion. I guess the biggest issue is getting to know all the available plugins, a fun time indeed 🙂 Anyway, since the switch to Git/Gitlab, I needed a setup that would enable the team to use the CI environment in full. The idea was to allow the CI environment to build all the branches, not just the master (release, develop, whichever is your flavor of the day) branches. Manually setting up Jenkins projects for all the Git branches was out of the question.

A little background first. The projects are mainly Java / Maven and there are a lot of dependencies. The rule most important for the Jenkins environment was that developers need to keep their tasks / features in separate Git branches. This prevents clashes between developers, but still allows them to work through the entire stack if needed. Jenkins was to be used as a continuous feature testing environment, so all those branches had to find their place in the CI stack too.

Ideally, the solution was to satisfy the following requirements:

  • a single branch that spans several projects should be built and referenced correctly (Maven dependencies)
  • only master (develop) and release branches are to be published to Artifactory; (short lived) feature branches should not fill it up
  • preferably a single Jenkins project per Git project, because:
    • it saves resources when building on the same machine (e.g. building several branches of the same base project)
    • there is no need to clutter the views
    • the build history shows which branch was built
  • it should be able to build the entire feature stack across different slave nodes

So, we introduced parameterized builds. Each Git project must have a single Jenkins project that is configured like this (a rough config.xml sketch for the parameter part follows the list):

  • This build is parameterized checked, with a single parameter “BRANCH_NAME_TO_BUILD” and the default value “master” (or “develop”, or whatever you use)
  • Block build when upstream project is building checked – this is not really related to this workflow but is a good practice nevertheless, as it prevents building projects while their dependencies are being built
  • Git repositories and branches set to track the Git repository and to build the branch given by the parameter above, all using the Git plugin
  • Deploy artifacts to Artifactory set to filter out snapshots
  • Deploy artifacts to Maven repository
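
For those who prefer looking at the job configuration directly, this is roughly how the parameter part ends up in the job’s config.xml (a sketch; the description text is made up). The Git plugin’s branch specifier is then simply set to $BRANCH_NAME_TO_BUILD:

<properties>
  <hudson.model.ParametersDefinitionProperty>
    <parameterDefinitions>
      <!-- the single string parameter used to pick the branch to build -->
      <hudson.model.StringParameterDefinition>
        <name>BRANCH_NAME_TO_BUILD</name>
        <description>Git branch to check out and build</description>
        <defaultValue>master</defaultValue>
      </hudson.model.StringParameterDefinition>
    </parameterDefinitions>
  </hudson.model.ParametersDefinitionProperty>
</properties>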

A specific thing about the setup, which might not work for you, is the Artifactory / Maven part. The policy is that only release artifacts are allowed in Artifactory. This reduces the noise and keeps Artifactory slick (it is getting rather big anyway). The problem is with building dependencies for feature branches, which are always snapshots. For this to work, and to be able to use different nodes, you still need something like Artifactory. If building on a single node, you can just put “install” as a Maven goal and you’d get the dependency on that computer. For multiple nodes, one idea was to tell all the nodes about each other’s Maven repositories, but that seemed like too much maintenance.

So, the workaround was to create a webdav Apache folder that Jenkins could put all the builds into. That same repository was then referenced in the Maven settings on each of the nodes, and on all of the developer machines too. This enabled Jenkins to “know” about all the feature branch artifacts, while not putting too much strain on Artifactory. And that Maven repository can be cleaned periodically without peril.
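
To give a rough idea of the consumer side, this is the kind of snippet that goes into settings.xml on the nodes and developer machines. The repository id and URL are made up for illustration, and only snapshots are enabled since releases still come from Artifactory:

<!-- settings.xml fragment on build nodes and developer machines (illustrative values) -->
<profiles>
  <profile>
    <id>internal-snapshots</id>
    <repositories>
      <repository>
        <id>jenkins-webdav</id>
        <url>http://ci.example.com/maven-repo</url>
        <releases>
          <enabled>false</enabled>
        </releases>
        <snapshots>
          <enabled>true</enabled>
        </snapshots>
      </repository>
    </repositories>
  </profile>
</profiles>
<activeProfiles>
  <activeProfile>internal-snapshots</activeProfile>
</activeProfiles>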

This setup is pretty much it. With it, you get to build a specified branch at any time. The feature branches build nicely throughout the entire stack, and you have a single project on Jenkins that prevents concurrent builds of the same code base (so the resource issue is no longer valid).

Still, nothing is perfect and there are a few gotchas:

  • if you are using some wall plugin, you can’t really tell the status of the project, since such plugins show only the last build status, which may or may not be a feature branch – this can be a good thing if you decide to treat broken feature builds as a bad thing 🙂
  • you get to build only one branch at a time; with many commits on the same project, you could wait a long time for Jenkins to build your commit
  • Jenkins can’t really decide correctly on Maven dependencies, since at one point in time a project might reference some feature snapshot (from pom.xml) whilst at another time it might reference the release version of the same project

The last point is the most problematic one. It prevents Maven from correctly computing the project dependencies, which directly influences the upstream/downstream build triggers in Jenkins. You’ll probably have to start builds manually in such situations, or just commit the changes again. If the developers in your team are a bit disciplined, this might not be an issue after all. A nice idea on how to avoid the problem is to create a separate Jenkins project for each branch, automatically, as noted here. I am in the process of adding this support to the Gitlab Jenkins plugin, so stay tuned 🙂
