Software Builds and the Virtual Time Machine

By John Graham-Cumming, January 23, 2008

John is founder of Electric Cloud, which focuses on improving software production processes. He can be contacted at jgc@electric-cloud.com.


If you are not yet using virtualization in your build environment, then it's time to get moving. Applying virtual machine technology to a build system, just as many have done in the test environment, means being able to give instant answers to questions like "Can you give me the log files for build XYZ?" or "Which system headers were used for project ABC?", and always being able to say "Yes" when someone asks for a rebuild of an ancient software version.

Fundamentally, software build management is the process of getting all the right software components onto a machine with the right operating system and running the build script. Those components include a particular version of the source, the specific tools (such as compilers and linkers), and the right third-party code (such as system libraries).

Once the build script has run, the object code generated has to be extracted, source code tagged, and precious log files and other output saved for later use. Frequently, a complete record of the configuration of the build machine is needed (including tool, library, and OS versions) as well as a bill-of-materials listing all the source code that went into the build.
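
As a concrete illustration, such a record can be captured with a few lines of scripting. The Python sketch below gathers the OS description, a few tool versions, and a crude bill-of-materials; the build identifier, tool list, source layout, and JSON format are illustrative assumptions, not a prescribed scheme.

    # Hypothetical sketch: capture a build record (tool versions, OS details,
    # and a rough bill-of-materials) alongside the build output.
    import json
    import platform
    import subprocess

    def tool_version(tool):
        """First line of a tool's --version output, or None if the tool is missing."""
        try:
            out = subprocess.run([tool, "--version"], capture_output=True, text=True)
            return out.stdout.splitlines()[0] if out.stdout else None
        except OSError:
            return None

    record = {
        "build_id": "XYZ-1234",                      # assumed build identifier
        "os": platform.platform(),
        "tools": {t: tool_version(t) for t in ("gcc", "ld", "make")},
        # Bill-of-materials: in practice, every source file and its SCM revision;
        # here, simply the C sources found under src/.
        "sources": sorted(subprocess.run(["find", "src", "-name", "*.c"],
                                         capture_output=True, text=True).stdout.split()),
    }

    with open("build-record-XYZ-1234.json", "w") as f:
        json.dump(record, f, indent=2)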

Complete, detailed records are necessary because the build machine's configuration will inevitably change as time goes by and other builds are performed on it.

But what if that were not the case? What if every time the build finished, the entire machine was placed in a vault in case the build had to be reproduced? Each time the build manager performed a build, they'd have to go out and buy a new machine.

Sounds ridiculous? Yes. But wouldn't it be great?

There would be no problem reproducing an old build when an important customer demands a bug fix, or a security headache means rereleasing old code. No problem grepping an old log file. No problem asking questions about what exactly went into a build. No need to guess what the configuration was, or sweat trying to reproduce an old build.

Happily, virtual machines turn physical hardware into files. And, like any other files, they can be backed up, versioned, and reloaded when needed.

If builds are performed on a virtual machine, that machine can be saved and tagged with each build. Or a virtual machine "snapshot" can be taken and tagged with the build number. You could even check the snapshot into version control when the build is tagged.
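
With a VirtualBox-hosted build machine, for example, tagging a snapshot with the build number can be as simple as the sketch below; the VM name, build number, and reliance on the VBoxManage command line are assumptions for illustration.

    # Hypothetical sketch: snapshot the build VM the moment a build finishes,
    # naming the snapshot after the build so the machine state can be found later.
    import subprocess

    def snapshot_build_vm(vm_name, build_number):
        snapshot_name = f"build-{build_number}"
        subprocess.run(
            ["VBoxManage", "snapshot", vm_name, "take", snapshot_name,
             "--description", f"State of {vm_name} after build {build_number}"],
            check=True)
        return snapshot_name

    # e.g. snapshot_build_vm("build-linux-01", "1234")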

Now the configuration of an old build is on hand at any time. Just fire up the right VM to go back in time. You'll be taken right back to the moment the build finished, with the complete machine state available, and a cursor flashing at a shell prompt ready for the next command.
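
Going back in time then amounts to restoring that snapshot and booting the VM, along these lines (again assuming VirtualBox and the naming used in the sketch above):

    # Hypothetical sketch: revert the build VM to the state it was in when a
    # given build finished, then boot it.
    import subprocess

    def restore_build_vm(vm_name, build_number):
        subprocess.run(["VBoxManage", "snapshot", vm_name, "restore",
                        f"build-{build_number}"], check=True)
        subprocess.run(["VBoxManage", "startvm", vm_name], check=True)

    # e.g. restore_build_vm("build-linux-01", "1234")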

But keeping the complete state is not the only advantage of virtualization. Old operating systems can be made to run on new hardware by virtualizing the interfaces that the OS sees. It's no longer necessary to keep ancient hardware around—and working—if that Windows 98 VM can run on the latest virtualized Intel box.

Virtual machines also mean that pristine configurations are always available. The ultimate "make clean" is booting from a newly initialized virtual machine. A repository of configurations can be built and run on any machine available to the build team. The problem of maintaining specific machines for specific builds goes from being a hardware issue to a software one.
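
One way to get that pristine state is to clone a golden base image into a throwaway VM for each build, as in the sketch below; the image name and the VBoxManage options shown are illustrative assumptions.

    # Hypothetical sketch: the "ultimate make clean" -- clone a pristine base
    # image into a fresh, disposable VM for a single build.
    import subprocess

    def fresh_build_vm(build_number, base_image="golden-build-env"):
        vm_name = f"build-env-{build_number}"
        subprocess.run(["VBoxManage", "clonevm", base_image,
                        "--name", vm_name, "--register"], check=True)
        subprocess.run(["VBoxManage", "startvm", vm_name], check=True)
        return vm_name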

And the same hardware can be used to support different OS configurations, making more efficient use of build infrastructure.

All these changes mean that software production management products need to understand virtualization and help the build team manage virtual assets. If the management system is virtualization-aware, then it can control the entire process of starting the right virtual machines, getting sources, running builds, archiving output, and saving virtual machine snapshots.
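
Pulling those pieces together, a virtualization-aware pipeline might look something like the sketch below. The VM naming, guest account, and commands are assumptions for illustration, not a description of any particular product's behaviour.

    # Hypothetical sketch: an end-to-end virtualized build -- boot a pristine VM,
    # fetch sources, run the build, archive the log, then snapshot the machine.
    import subprocess

    def sh(*cmd):
        subprocess.run(list(cmd), check=True)

    def run_virtualized_build(build_number, scm_url, base_image="golden-build-env"):
        vm = f"build-env-{build_number}"
        sh("VBoxManage", "clonevm", base_image, "--name", vm, "--register")
        sh("VBoxManage", "startvm", vm)

        guest = f"builder@{vm}"          # assumes the guest is reachable by its VM name
        sh("ssh", guest, f"git clone {scm_url} src && cd src && make")
        sh("scp", f"{guest}:src/build.log", f"archive/build-{build_number}.log")

        sh("VBoxManage", "snapshot", vm, "take", f"build-{build_number}")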

The latest software production management products (including my company's ElectricCommander) integrate with the classic suite of build tools (such as SCM, scripting languages, make or ant tools, and test systems) and have been enhanced to be virtualization-aware.

In short, virtualization makes more efficient use of existing build hardware, simplifies the management of different build configurations, and gives the build manager a virtual time machine capable of taking them back to the moment when any given build was completed.

