I recently got a new job and one of the things I did on my very first day was set up a Continuous Integration server. Most CI servers support similar features, but I chose Jenkins because it seems to be the most popular open source choice. Jenkins was a breeze to work with. Just download it, install it, start it from the command line, and point your web browser at it. You can add new projects, configure them, download new plugins, and even install Jenkins as a Windows service right from the web page. Easy-peasy.
The first thing I did was automate the build process for the project I inherited. Just a batch file that told MsBuild to rebuild the entire solution.
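The batch file really is that small; something along these lines, with the solution name and framework path as placeholders rather than the real ones:

```
rem build.bat -- full rebuild in Release
"%WINDIR%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe" MySolution.sln /t:Rebuild /p:Configuration=Release
```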
After that I set up a very basic integration job for the project I inherited. Initially the Jenkins job just compiled the project, but after a couple of days I added a file with two solution-wide MsBuild targets: one for the developers that calls code inspection tools and prints the results to the console, and one for the integration server that does the same code inspection then dumps the results to an XML file. Once you download the Violations plugin and configure your Jenkins job, it will show a nice chart of how many code inspection warnings you have over time. Let me tell you this right now: it's nice to have a concrete measurement of how your changes affect the quality of the codebase. Real nice.

For our first code inspection tool I chose the standard in the .NET world: FxCop. FxCop can't tell if the overall architecture or design is good or bad, but it has an extensive library of rules covering more minor (although still important) things like following naming conventions, disposing IDisposable instances, avoiding unnecessary casts, and many other "in the small" best practices. There are a couple of rules I disagree with (like making a method static if it doesn't touch any instance state), but it's easy enough to disable those. FxCop won't make a bad product great, but if you set aside time to fix a few warnings each day, it can nudge you toward cleaner code.
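Those two targets aren't anything fancy. Roughly, and with the FxCop install path, assembly names, and report file name as placeholders rather than our real ones:

```
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <PropertyGroup>
    <FxCopCmd>C:\Program Files (x86)\Microsoft FxCop 10.0\FxCopCmd.exe</FxCopCmd>
  </PropertyGroup>

  <!-- Developers: dump violations straight to the console -->
  <Target Name="Inspect">
    <Exec Command="&quot;$(FxCopCmd)&quot; /file:MyProject\bin\Release\MyProject.dll /console" />
  </Target>

  <!-- Integration server: write an XML report for the Jenkins Violations plugin -->
  <Target Name="InspectCi">
    <Exec Command="&quot;$(FxCopCmd)&quot; /file:MyProject\bin\Release\MyProject.dll /out:fxcop-report.xml" />
  </Target>

</Project>
```

The Violations plugin then just gets pointed at the XML report in the Jenkins job configuration.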
After a couple weeks I was able to start writing some unit tests and add those to the integration build. The NUnit plugin for Jenkins will take the results and make a few nice charts that show your current test runs as well as test runs over time. Again, charting progress over time really gives these things more meaning.
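Adding the tests to the integration build is one more solution-wide target that shells out to the NUnit console runner; a sketch, with the runner path and test assembly name as placeholders and assuming the assemblies are already built:

```
<!-- Run the tests and write the XML report the Jenkins NUnit plugin reads -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="Test">
    <Exec Command="&quot;C:\Program Files (x86)\NUnit 2.6\bin\nunit-console.exe&quot; MyProject.Tests\bin\Release\MyProject.Tests.dll /xml:nunit-result.xml" />
  </Target>
</Project>
```

The NUnit plugin is then pointed at nunit-result.xml in the job configuration.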
The second thing I did with Jenkins involves the deployment process. We don't deploy directly to production, but we do have a nightly deployment process for the project I'm working on. We make a setup.msi file, zip it, send it to the client, and email a summary of the changes. It takes several minutes to do by hand, and it's easy to forget to update the product version, forget what changes are being deployed, or lose track of time and end up doing the whole thing right when it's time to go home. That's why the deployment (even if it's just deploying for testing) should be automated, and that's where the real fun comes in. It took a few days to get it worked out, but the entire process can now be done by an MsBuild target. Here's what it does (a rough sketch of the targets follows the list):
- Update the MSI settings. A custom inline task reads the vdproj file, increments the ProductVersion, sets a new GUID for the ProductCode, then saves over the original file and updates source control. I later found out that many other people have had to do similar things.
- Build the solution. Do a full clean, then rebuild the MSI and all its dependencies with devenv, since MsBuild doesn't like the vdproj project type.
- Zip the resulting setup files with today's date in the filename using the MsBuild Community Tasks.
- Copy the file to a shared folder.
- Update PivotalTracker. We've been trying out PivotalTracker and it's working so well for us that I can't recommend it enough. Not only is it incredibly easy to use, and not only does it provide just enough project management to get the job done without getting in the way, it's also dead-simple to integrate with external tools. We have another custom inline task that uses PivotalTracker's awesome web API to change every story with a status of FINISHED to DELIVERED. The response from PivotalTracker is a list of the stories that were delivered. Anyone looking at our project page in PivotalTracker will soon see what has been delivered and is ready for review.
- Parse the results. Using another Community Task, get the names of all the stories that were just delivered. As usual, someone on StackOverflow ran into the same problem I did and got a great answer.
- Send out a notification. Email me a link to the newly shared file with a list of the names of the stories that were delivered. I then tweak the email and resend it to the people involved in reviewing the changes.
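This isn't our exact build file, but here's a rough sketch of what the first few targets boil down to, assuming MSBuild 4.0 and the MsBuild Community Tasks, with placeholder paths and names throughout (the PivotalTracker and email steps are left out for brevity):

```
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="4.0" DefaultTargets="Deploy">

  <!-- MSBuild Community Tasks supply the Time and Zip tasks used below -->
  <Import Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />

  <PropertyGroup>
    <SolutionFile>MySolution.sln</SolutionFile>
    <VdprojFile>Setup\Setup.vdproj</VdprojFile>
    <DevEnv>C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe</DevEnv>
    <DropFolder>\\fileserver\builds</DropFolder>
  </PropertyGroup>

  <!-- Inline task: bump ProductVersion and stamp a fresh ProductCode GUID in the vdproj
       (real scripts often regenerate the PackageCode the same way) -->
  <UsingTask TaskName="UpdateSetupProject" TaskFactory="CodeTaskFactory"
             AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
    <ParameterGroup>
      <VdprojPath ParameterType="System.String" Required="true" />
    </ParameterGroup>
    <Task>
      <Using Namespace="System.IO" />
      <Using Namespace="System.Text.RegularExpressions" />
      <Code Type="Fragment" Language="cs"><![CDATA[
        string text = File.ReadAllText(VdprojPath);

        // Increment the last part of "ProductVersion" = "8:1.0.0"
        text = Regex.Replace(text, "(\"ProductVersion\" = \"8:\\d+\\.\\d+\\.)(\\d+)(\")",
            m => m.Groups[1].Value + (int.Parse(m.Groups[2].Value) + 1) + m.Groups[3].Value);

        // Every new build of the MSI gets a new ProductCode
        text = Regex.Replace(text, "(\"ProductCode\" = \"8:)\\{[^}]+\\}(\")",
            m => m.Groups[1].Value + "{" + Guid.NewGuid().ToString().ToUpper() + "}" + m.Groups[2].Value);

        File.WriteAllText(VdprojPath, text);
      ]]></Code>
    </Task>
  </UsingTask>

  <Target Name="UpdateVersion">
    <UpdateSetupProject VdprojPath="$(VdprojFile)" />
    <!-- check the modified vdproj back into source control here -->
  </Target>

  <!-- devenv, not MsBuild: vdproj setup projects only build from Visual Studio -->
  <Target Name="BuildSetup" DependsOnTargets="UpdateVersion">
    <Exec Command="&quot;$(DevEnv)&quot; $(SolutionFile) /Rebuild Release" />
  </Target>

  <!-- Zip the setup output with today's date in the filename and copy it to the share -->
  <Target Name="Deploy" DependsOnTargets="BuildSetup">
    <Time Format="yyyy-MM-dd">
      <Output TaskParameter="FormattedTime" PropertyName="BuildDate" />
    </Time>
    <ItemGroup>
      <SetupFiles Include="Setup\Release\*.msi;Setup\Release\setup.exe" />
    </ItemGroup>
    <Zip Files="@(SetupFiles)" ZipFileName="Setup-$(BuildDate).zip" />
    <Copy SourceFiles="Setup-$(BuildDate).zip" DestinationFolder="$(DropFolder)" />
  </Target>

</Project>
```

The fresh ProductCode is the same thing Visual Studio prompts you for whenever you change the ProductVersion in the IDE, so the inline task is just doing what the IDE would ask for anyway.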
It's not perfect and there's a lot more that could be done (like more unit tests, other code inspection tools, and making our build tasks shared across multiple projects), but even while getting familiar with a new codebase and researching the various parts of this process, I managed to improve the codebase and the process while still providing value to the customer. And that, in my experience, is the real double win of relying on a Continuous Integration server:
- Quick feedback that, when diligently acted on, provides value to the company by making the codebase better and better every day.
- Powerful tools that, when properly used, automate technical chores so we can spend more time providing value to the customer.
Greetings; we found your post as the only example of anyone hitting Deliver in PivotalTracker from Jenkins. We were just about to implement the same thing ourselves; if you'd be willing to share, in whatever state, we'd be delighted and of course happy to send back any improvements. My email is joshua at flywheel com; thanks!
Everything you do in PivotalTracker can be done by calling URLs. Our Jenkins job calls a build script that calls these URLs. It's actually pretty easy.
http://www.pivotaltracker.com/help/api?version=v3#deliver_all_finished
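The MsBuild side of it boils down to a small inline task; here's a rough sketch assuming the v3 endpoint above, with the task name, property names, token, and project id all placeholders rather than our production script:

```
<UsingTask TaskName="DeliverFinishedStories" TaskFactory="CodeTaskFactory"
           AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
  <ParameterGroup>
    <Token ParameterType="System.String" Required="true" />
    <ProjectId ParameterType="System.String" Required="true" />
    <DeliveredStoriesXml ParameterType="System.String" Output="true" />
  </ParameterGroup>
  <Task>
    <Using Namespace="System.Net" />
    <Code Type="Fragment" Language="cs"><![CDATA[
      // PUT to the v3 "deliver_all_finished" endpoint; the response lists the delivered stories
      using (var client = new WebClient())
      {
          client.Headers.Add("X-TrackerToken", Token);
          DeliveredStoriesXml = client.UploadString(
              "https://www.pivotaltracker.com/services/v3/projects/" + ProjectId + "/stories/deliver_all_finished",
              "PUT", string.Empty);
      }
    ]]></Code>
  </Task>
</UsingTask>

<Target Name="Deliver">
  <DeliverFinishedStories Token="$(TrackerToken)" ProjectId="$(TrackerProjectId)">
    <Output TaskParameter="DeliveredStoriesXml" PropertyName="DeliveredStories" />
  </DeliverFinishedStories>
  <Message Text="$(DeliveredStories)" />
</Target>
```

The XML that comes back is the list of delivered stories, which is what we parse for the notification email.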