In my previous post we learned how to build RPM packages of your software for multiple versions of your target distribution(s). Now I want to present a way of automating the build process and building packages on/for all target platforms. First have a look at the openSUSE build service to see whether it already fits your needs; if it does, you can stop reading here :-).
We needed better control over the platforms and the process, so we set up a build farm based on the Jenkins continuous integration (CI) server ourselves. The big picture consists of the following components:
- build slaves allowing a jenkins user to do unattended builds of the packages
- a Jenkins CI server using matrix builds with build slaves for each target platform
- a build script orchestrating the build of all our self-maintained packages
- a Jenkins job to deploy the packages to our RPM repository
Preparing the build slaves
Standard installations of openSUSE need some minor tweaks before they can be used as Jenkins build slaves for unattended RPM package builds. Here are the changes we needed to make it work properly:
- Add a user account for the builds, e.g. useradd -m -d /home/jenkins jenkins, and set up a password with passwd jenkins.
- Change the sshd configuration to allow password authentication and restart sshd.
- We will link the SPECS and SOURCES directories of /usr/src/packages to the working copy of our repository, so we need to delete the existing directories: rm -r /usr/src/packages/SPECS /usr/src/packages/SOURCES /usr/src/packages/RPMS /usr/src/packages/SRPMS.
- Allow non-privileged users to work with /usr/src/packages: chmod -R o+rwx /usr/src/packages.
- Copy the ssh public key for our git repository to the build account's ~/.ssh directory.
- Test ssh access on the slave as our build user with ssh -v git@repository. This step confirms the host authenticity once, so that future public-key ssh interactions work unattended!
- Configure the git identity on the slave with git config --global user.name "jenkins@build###-$$"; git config --global user.email "email@example.com".
- Add the privileges the build user needs for our build process to /etc/sudoers: jenkins ALL = (root) NOPASSWD:/usr/bin/zypper,/bin/rpm
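Condensed into one place, the preparation steps above might look like the following provisioning sketch, run as root on a fresh slave (user name, paths and the sudoers line mirror the examples above; adapt them to your setup, and change the sshd configuration separately as described):

```shell
#!/bin/bash
# Sketch: prepare a fresh openSUSE installation as a Jenkins build slave.
# Run as root; adjust the user name and sudoers entry to your environment.
set -e

# build account for unattended package builds
useradd -m -d /home/jenkins jenkins
passwd jenkins

# the SPECS/SOURCES directories get linked to our working copy later,
# so remove the stock RPM build tree and open it up for the build user
rm -rf /usr/src/packages/SPECS /usr/src/packages/SOURCES \
       /usr/src/packages/RPMS /usr/src/packages/SRPMS
chmod -R o+rwx /usr/src/packages

# allow the build user to install/remove packages during builds
echo 'jenkins ALL = (root) NOPASSWD:/usr/bin/zypper,/bin/rpm' >> /etc/sudoers
```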
Configuring the build slaves
Linux build slaves over ssh are quite easily configured using Jenkins' web interface. We add labels denoting the distribution release and architecture so we can easily set up our matrix builds. Then we set up our matrix build as a new job with the usual parameters for source code management (in our case git) and so on.
Our configuration matrix has the two axes Architecture and OpenSuseRelease and uses the labels of the build slaves. Our only build step is calling the script orchestrating the build of our RPM packages.
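Jenkins exposes each matrix axis as an environment variable named after the axis (here Architecture and OpenSuseRelease), so the build script can branch on the axis values instead of probing the host. A small sketch, assuming axis values like those in our matrix; the helper function is not part of the original script:

```shell
# Map the value of the Architecture axis (provided by Jenkins as
# $Architecture) to the architecture string rpmbuild expects.
rpm_arch_for_axis() {
    case "$1" in
        32bit) echo i586 ;;
        64bit) echo x86_64 ;;
        *)     return 1 ;;  # unknown axis value
    esac
}

# inside the build step Jenkins provides the axis variables, e.g.:
# ARCH=`rpm_arch_for_axis "$Architecture"`
```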
Putting together the build script
Our build script essentially sets up a clean environment and builds package after package, installing build prerequisites as needed. We use small utility functions (functions.sh) for building a package, installing packages from the repository, installing freshly built packages and removing installed RPMs. The script contains roughly the following phases:
- Figure out quirks of the environment, e.g. the openSUSE release number or the architecture to build for.
- Clean the environment by removing previously installed self-built packages.
- Set up the build environment, e.g. linking the folders of /usr/src/packages to our working copy and installing compilers, headers and the like.
- Build the packages, installing them locally if they are a dependency of packages yet to be built.
Here is a shortened example of our build script:
if [ "i686" = `uname -m` ]; then
  ARCH=i586   # i686 hosts build i586 packages
else
  ARCH=`uname -m`
fi
# extract the openSUSE release number without the dot, e.g. 121 for 12.1
SUSE_RELEASE=`cat /etc/SuSE-release | sed '/^[openSUSE|CODENAME]/d' | sed 's/VERSION =//g' | tr -d '[:blank:]' | sed 's/\.//g'`
source functions.sh
# setup build environment
...
# force a repository refresh without checking the signature
sudo zypper -n --no-gpg-checks refresh -f OUR_REPO
# remove previously built and installed packages
...
# install needed tools, some only required on older releases
if [ $SUSE_RELEASE -lt 121 ]; then
  ...
fi
buildAndInstallRPM omniNotify2 $ARCH
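The utility functions in functions.sh are essentially thin wrappers around zypper, rpm and rpmbuild. A minimal sketch of what they might look like; the function names appear in our build script, but the bodies here are assumptions (note they use exactly the zypper and rpm paths whitelisted in sudoers):

```shell
#!/bin/bash
# functions.sh - sketch of the build utilities (bodies are illustrative)

# install a package from the configured repositories, non-interactively
installRPM() {
    sudo /usr/bin/zypper -n install "$1"
}

# remove a previously installed package, ignoring it if not installed
removeRPM() {
    sudo /bin/rpm -e "$1" 2>/dev/null || true
}

# build the spec file of a package and install the resulting RPMs
buildAndInstallRPM() {
    local package=$1 arch=$2
    rpmbuild -bb --target "$arch" "/usr/src/packages/SPECS/$package.spec"
    sudo /bin/rpm -Uvh --force /usr/src/packages/RPMS/$arch/$package*.rpm
}
```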
Deploying our packages via Jenkins
We set up a second Jenkins job to deploy successfully built RPM packages to our internal repository. We use the Copy Artifacts plugin to fetch the RPMs from our build job and put them into a directory like all_rpms. Then we add a build step to execute a script like this:
for i in suse-12.1 suse-11.4 suse-11.3
do
  rm -rf $i
  mkdir -p $i
  # strip dashes and dots to get the matrix label, e.g. suse121
  versionlabel=`echo $i | sed 's/[-\.]//g'`
  cp -r "all_rpms/Architecture=32bit,OpenSuseRelease=$versionlabel/RPMS" $i
  cp -r "all_rpms/Architecture=64bit,OpenSuseRelease=$versionlabel/RPMS" $i
  cp -r "all_rpms/Architecture=64bit,OpenSuseRelease=$versionlabel/SRPMS" $i
  rsync -e "ssh" -avz $i/* firstname.lastname@example.org:/srv/www/htdocs/OUR_REPO/$i/
  ssh email@example.com "createrepo /srv/www/htdocs/OUR_REPO/$i/RPMS"
done
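Once createrepo has regenerated the metadata, client machines can subscribe to the repository, for instance with a zypper repo file like this sketch (the base URL is an assumption derived from the docroot path used above; repository name and gpgcheck policy are illustrative):

```ini
[our-internal-repo]
name=Internal RPM packages (openSUSE 12.1)
baseurl=http://example.org/OUR_REPO/suse-12.1/RPMS
enabled=1
autorefresh=1
gpgcheck=0
```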
With a setup like this we can perform an automatic build of all our RPM packages on several target platforms every time we update one of the packages. After a successful build we can deploy the new packages to our RPM repository, making them available to our whole organisation. There is an initial amount of work to be done, but the reward is easy, unattended package updates with deployment just one button click away.