The long way through Software Craftsmanship

Tool: Building a local pipeline

Oct 10, 2015 - 3 minute read - Comments - polish-your-tool, tool, building-pipeline, pipeline, automation, working-directory, trap, git, git-hook, pipes-and-filters, repository, signal-trapping, trigger, growl, notification

Motivation

At a client, one of the projects has a long building process and the tests are mostly slow, so I use a local building pipeline, an example of the Pipes and Filters pattern.

This allows me to run only the fast unit tests manually, then have the rest run automatically (no user intervention, no time spent waiting) before pushing. If the latter fails, I can git push -f to the pipeline without corrupting the central repository (origin) history or disturbing others.
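For example (with a hypothetical branch name, my-feature), after fixing the problem you can rewrite only the pipeline's history:

```shell
# fix the problem, then fold the fix into the last commit
git commit --amend

# force-push to the pipeline only; origin keeps its history intact
git push -f pipeline my-feature
```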

This strategy also allows parallel modification of the sources: you can keep working in your IDE while the compiler runs in the other working directory. Should you introduce a syntax or logical error in your working code, the compiler is not affected, as it has a working copy just for itself.

Implementation

This requires two git repositories:

  • local, or working copy. Configure it with a remote called pipeline in addition to origin, the repository you cloned from. This is a non-bare repository.
  • pipeline, used for building. This is also a non-bare repository.
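A possible setup, assuming hypothetical local paths (~/work/project, ~/work/project-pipeline) and a hypothetical origin URL:

```shell
# clone the working copy and the pipeline copy (both non-bare)
git clone git@example.com:team/project.git ~/work/project
git clone git@example.com:team/project.git ~/work/project-pipeline

# register the pipeline clone as a remote of the working copy
cd ~/work/project
git remote add pipeline ~/work/project-pipeline
git remote -v   # should now list both origin and pipeline
```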

In the local repository you do the development and the local commits.

When you’re done, instead of

git push origin $branch

do

git push pipeline $branch

After the git hook is installed, this will trigger the pipeline execution.

Git hook

In the pipeline repository, in the .git/hooks/post-receive file:

#!/bin/bash
# push hooks run with .git as the working directory,
# so pipeline.sh here refers to .git/pipeline.sh
chmod +x pipeline.sh
# stdin carries one line per pushed ref: <old-sha> <new-sha> <ref-name>
while read oldrev newrev refname; do
  branch=$(git rev-parse --symbolic --abbrev-ref "$refname")
  ./pipeline.sh "$branch"
done

In the script above, we tell git to execute pipeline.sh once per received ref, with the branch name as argument.
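One detail that is easy to miss: git only runs hooks that have the executable bit set, so after creating the file:

```shell
# without this, the push succeeds but the hook is silently skipped
chmod +x .git/hooks/post-receive
```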

Pipeline executor

In the pipeline repository, in the .git/pipeline.sh file (the hook above runs with .git as its working directory, which is why it can call ./pipeline.sh):

#!/bin/bash

set -e
set -o pipefail

function cleanup {
  git checkout develop
  git pull origin develop
}

# upon failure, tell the user
function err {
  cleanup
  growlnotify "pipeline fails"
}

# run function 'err' whenever a command fails (bash's ERR trap)
trap "err" ERR

branch=$1

if [[ -z "$branch" ]]; then
    echo "need to specify a branch"
    exit 1
fi

git checkout "$branch"
mvn clean install | tee output.log
git push --set-upstream origin "$branch"
cleanup

Pipeline executor explanation

  • We prepare the bash environment:

    • -e: abort the script as soon as any command fails
    • -o pipefail: make a pipeline fail when any command in it fails
    • err and trap: register a function to run whenever a command fails (bash's ERR trap)
  • We require a branch to execute this script.

  • Checkout to that branch

  • Clean, compile & execute tests

  • Push to origin

  • Clean up

  • If any step fails, the script stops, notifies the user with growl, and cleans up.

Clean up: check out develop (or any other branch that always exists), leaving the system ready for the next run.

Notifications: telling the user

The program growlnotify is a CLI notifier for Growl, the macOS notification system; there are Growl ports and equivalents for Windows and Linux.
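Since growlnotify is not available everywhere, the notification call can be wrapped in a small script. This is a sketch; notify.sh and its fallback chain are my assumption, not part of the original setup:

```shell
#!/bin/bash
# notify.sh - send a notification with whatever tool is available
msg="$1"
if command -v growlnotify > /dev/null 2>&1; then
  growlnotify -m "$msg"            # Growl (macOS)
elif command -v notify-send > /dev/null 2>&1; then
  notify-send "pipeline" "$msg"    # libnotify (Linux)
else
  echo "pipeline: $msg"            # last resort: print to the terminal
fi
```

The pipeline script would then call ./notify.sh "pipeline fails" instead of growlnotify directly.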

Conclusions

Ideally, the tests should be faster, and executing them locally should always be possible, maybe in the pre-commit hook. Whenever this is not possible, a local pipeline can reduce the time spent waiting for test execution and remove the lock on the working directory while the compiler is working.

This pipeline aims to be simple, without many customizations, and single-user. For more complex workflows and other constraints, it might be better to drop this approach and investigate continuous integration (CI) tools such as Jenkins, Travis, or Bamboo.

Further work

The jobs in the pipeline could be queued, so that it is possible to push to the pipeline before the previous job has finished.
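A minimal sketch of such queueing, assuming a system with flock(1) (Linux; this wrapper is not part of the original post): pipeline.sh calls are serialized through a lock file, so a second push simply waits for the first job to finish instead of running concurrently.

```shell
#!/bin/bash
# queued-pipeline.sh - run pipeline.sh under an exclusive lock
branch="$1"

(
  flock -x 9           # blocks until any previous job releases the lock
  ./pipeline.sh "$branch"
) 9> /tmp/pipeline.lock
```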

Appendix

This script pushes to the pipeline in the background:

#!/bin/bash

# run in the background, discarding all output
# (the redirections must come in this order: stdout first, then stderr)
git push pipeline > /dev/null 2>&1 &