Aug 4, 2009

creating a codeswarm movie

code swarm frame

Download video (3Mb)

A codeswarm is a visualization of the activity within a source code repository. The image and linked video above show the lifetime of one of Verilab's source repositories. You can see code being created, the check-ins as they happen, and an indication of which users are doing the work at any given time. It is an example of an 'organic information visualization' and is created using the Processing toolkit. The original visualization tools were developed by Michael Ogawa and the source code is available on Google Code.

In this particular case I created the animation on OS X 10.5, using a combination of codeswarm, ffmpeg and LAME. If you are interested in doing something like this yourself:

First you'll need to make sure you have a recent version of the Java Development Kit installed (JDK 1.5 or later). You'll also need a recent version of Ant installed (I have version 1.7.0, which ships with OS X by default). Download the code_swarm source and install it, then execute 'ant run'. If all is well, you should get a dialog box prompting you for the source repository, user name and password.
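For reference, the sequence looks roughly like this (the Google Code checkout URL is from memory, so check the project page for the current location):

svn checkout http://codeswarm.googlecode.com/svn/trunk/ codeswarm
cd codeswarm
ant run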

At this point, I put in the svn+ssh URL for the Verilab repository that I wanted to visualize. Everything fell over with a Java error (NoClassDefFoundError within com/trilead/ssh2). From this I realised I needed to install the SSH libraries for Java, from Trilead. I downloaded those, unpacked them and added the jar file to my CLASSPATH. Along the way I found out that the default OS X CLASSPATH definition is in /System/Library/Java/JavaConfig.plist, which may be useful as a starting point.
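If you hit the same error, something along these lines should sort it out (the path and jar name will depend on the Trilead release you downloaded):

export CLASSPATH=$CLASSPATH:/path/to/trilead-ssh2.jar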

With that fixed, I again ran 'ant run' and put in the relevant information. A bit of time passes while the checkin information is extracted from the repository, and then the visualisation runs. The repository information that was extracted is saved under the ./data directory (look for the latest realtime_sample.*.xml file). This is useful for the later stages, as you don't have to fetch the information again. If you want to create a video of the visualisation, there are a few more hoops to jump through.
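To see what was extracted, you can list the saved files directly (the exact filename will vary for your repository):

ls data/realtime_sample*.xml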

You will need to configure codeswarm to save the frames for each stage of the visualisation. You do this by editing the ./data/sample.config file. First off, copy it to a new version for your particular project. Then edit these values:

  • InputFile= [point this at the new realtime_sample<number>.xml file in the data directory, which contains the checkin information for your project]
  • TakeSnapshots=true

That's all you really need to change, although you can also alter the other values to tweak the visualisation. The ColorAssignX= statements use regexp values to differentiate different types of checkin and colour-code them accordingly. Play around with the other values, with TakeSnapshots set to false, and re-run the visualisation until you get something you are satisfied with. Then run one more time with TakeSnapshots=true to save off the frame images. You can run with the new configuration by running 'ant run data/your_project.config'.
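For illustration, the relevant lines in a project config end up looking something like the following; the InputFile name is just a placeholder, and the SnapshotLocation pattern is the frames/ naming that the ffmpeg command later on expects:

# which checkin data to visualise (use the file saved under ./data)
InputFile=data/realtime_sample<number>.xml
# save a PNG of every frame so a movie can be assembled later
TakeSnapshots=true
SnapshotLocation=frames/code_swarm-#####.png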

After running with TakeSnapshots enabled, you'll have a set of images in the ./frames directory (controlled by the SnapshotLocation option in the config file). The final step is to assemble those into a movie. The easiest way I found to do this is to use the command-line utility ffmpeg. There are a variety of ways to install ffmpeg, but the simplest seems to be to install ffmpegX and then extract the binary from the application bundle. You can also get it using Fink or MacPorts. If you want to add an audio track to your visualisation, you will probably also need LAME. With ffmpeg working, it is simple to point it at the image files from codeswarm and produce the final movie. The finishing touch was adding some music from an mp3 file, and limiting the duration via the -t switch so the movie ends when the video frames run out, rather than playing all of the music.

ffmpeg -f image2 -r 24 -i frames/code_swarm-%05d.png -i 6_sym.mp3 -qmax 15 -t 100 <output_filename>.mpg

You can run 'ffmpeg' without any switches to get help on the options. If all goes well, you should end up with an MPEG format video in the file <output_filename>.mpg.


iPhone development

iPhone Simulator

Interested in learning about iPhone development? Want to study at Stanford? Don't want to pay the tuition fees? On iTunesU (the lecture streaming part of iTunes) you can follow along with class cs193p from Stanford, on iPhone Application Development. In addition to good quality video of the lectures, all of the class slides, handouts and assignments are available for free. If you have an Intel Mac, you can also download the development tools, iPhone SDK and a simulator, again all free. If you do want to develop and test applications on an actual iPhone or iPod Touch, you'll need to pay the $99 developer fee to get the signing keys that let you run applications on a phone and allow you to submit apps to the App Store. At least for the basics, the simulator is useful as a target platform for testing, although there are differences between it and the final platform (features such as multi-touch and the accelerometer are hard to test, for example, unless you want to start shaking your computer).

Lifehacker recently had an article on all of the educational resources that are becoming available on the web, for free. iTunesU is a good example of the sort of teaching resources that are out there, if you look. The quality is variable, but there are some excellent resources if you are prepared to dig.
