Tools for Agile Development presentation materials

by Mike Linnen 17. May 2009 18:19

In this post you will find the PowerPoint deck and source code I used for my presentation at the Charlotte Alt.Net May meeting.  I had a good time presenting this to the group even though it was very broad and shallow.  I covered the basics of why you want to leverage tools and practices in a Lean-Agile environment.  I got into topics like Source Control, Unit Testing, Mocking, Continuous Integration, and Automated UI Testing.  Each of these topics could have been an entire 1 hour presentation on its own.

Here are the links to the tools that I talked about in the presentation:

PowerPoint “Tools for Agile Development: A Developer’s Perspective”

NerdDinner solution with MSTests re-written as NUnit Tests and WatiN Automated UI Tests

CI Factory modified solution that I used for creating a CI Build from scratch

Presenting at Charlotte Alt.Net User’s Group

by Mike Linnen 4. May 2009 20:41

I will be presenting “Tools for Agile Development: A Developer’s Perspective” at the Charlotte Alt.Net User’s Group May 7th.  Get the details here.  Also, there will be a second presentation after mine called “PowerShell as a Tools Platform”.

I better get moving on cleaning up my presentation :)

Using PowerShell in the build process

by Mike Linnen 8. April 2009 15:37

I have used NAnt and MSBuild for years on many projects and I always thought there had to be a better way to script build processes.  Well, I took a look at PowerShell and psake and so far I like it.  psake is a PowerShell script that makes it very easy to break your build script up into target tasks.  These tasks can also have dependencies on other tasks.  This allows you to call into the build script requesting a specific task to be built and have the dependent tasks get executed first.  This concept is not anything new to build frameworks, but it is a great starting point for using the goodness of PowerShell in a build environment.

You can get psake from the Google Code repository.  I first tried the download link for v0.21, but I had some problems getting it to run my tasks, so I went directly to the source and grabbed the tip version (r27 at the time) of psake.ps1 and my problems went away.

You can start off by using the default.ps1 script as a basis for your build process.  For the simple build process I had in mind for some of my small projects, I wanted to be able to do the following:

  • “Clean” the local directory
  • “Compile” the VS 2008 solution
  • “Test” the NUnit tests
  • “Package” the results into a zip for easy xcopy deployment

Here is what I ended up with as a starting point for my default.ps1.

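Something along these lines (a minimal sketch; the solution name and paths are just placeholders for your own):

    properties {
      $base_dir    = Resolve-Path .
      $build_dir   = "$base_dir\build"
      $solution    = "$base_dir\MySolution.sln"   # placeholder solution name
      $package_dir = "$base_dir\package"
    }

    # running psake with no target executes the default task, which I point at Test
    task default -depends Test

    task Clean {
      # "Clean" the local directory
    }

    task Compile -depends Clean {
      # "Compile" the VS 2008 solution
    }

    task Test -depends Compile {
      # "Test" - run the NUnit tests
    }

    task Package -depends Test {
      # "Package" the results into a zip for easy xcopy deployment
    }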

This really doesn't do anything so far except set up some properties for paths and define the target tasks I want to support.  The psake.ps1 script assumes your build script is named default.ps1 unless you pass in another script name as an argument.  Also, since a task named default is defined in my build script, if I don't pass in a target task then the default task is executed, which I have pointed at Test.

Build invoked without any target task:

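That is simply:

    .\psake.ps1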

Build invoked with the target task Package specified:

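With the task list passed after the build script name (the exact parameters varied a bit between psake revisions, so treat this as a sketch), that looks like:

    .\psake.ps1 default.ps1 Package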

So now I have the shell of my build, so let's get it to compile my Visual Studio 2008 solution.  All I have to do is add code to the Compile task to launch VS 2008, passing in some command line options.

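Something like this, assuming a default VS 2008 install path (adjust for your machine):

    task Compile -depends Clean {
      # launch VS 2008 from the command line; /Build compiles the given configuration.
      # Piping to out-null makes the script wait for the compile to finish (see below).
      & "C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe" $solution /Build "Release" | out-null
    }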

And here it is in action:


Notice I had to pipe the result of the command line call to “out-null”.  If I didn't do this, the call to VS 2008 would run in the background and control would be passed back to my PowerShell script immediately.  I wanted to be sure that my build script would wait for the compile to complete before it continued on.

What if the compile fails?  As it stands right now, the build script does not detect whether the compile completed successfully.  VS 2008 (and previous versions of VS) returns an exit code that indicates whether the compile was successful.  If the exit code is 0 then you can assume it was successful.  So all we need to do is test the exit code after the call to VS 2008.  PowerShell makes this easy with the $LastExitCode variable.  Throwing an exception in a task is detected by the psake.ps1 script, which stops the build for you.

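The Compile task with the exit code check added looks something like:

    task Compile -depends Clean {
      & "C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe" $solution /Build "Release" | out-null
      # $LastExitCode holds the exit code of the last native command that ran;
      # anything other than 0 from devenv means the compile failed
      if ($LastExitCode -ne 0) {
        throw "Compile failed with exit code $LastExitCode"
      }
    }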

I placed a syntax error in a source file, and when I call the Test target the Compile target fails and the Test target never executes:


Now I want to add in the ability to get the latest code from my source control repository.  Here is where I wanted the ability to support multiple solutions for different source control repositories, like Subversion or SourceGear Vault.  But let's get it to work first with Subversion and then later refactor it to support other repositories.  For starters, let's simply add the support for getting the latest code to the Compile task.

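For now the update goes right at the top of the Compile task (svn.exe is assumed to be on the path, and $source_dir is a placeholder property pointing at the working copy):

    task Compile -depends Clean {
      # very procedural for now: update the working copy, then compile
      & svn update $source_dir | out-null
      if ($LastExitCode -ne 0) {
        throw "svn update failed with exit code $LastExitCode"
      }

      & "C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe" $solution /Build "Release" | out-null
      if ($LastExitCode -ne 0) {
        throw "Compile failed with exit code $LastExitCode"
      }
    }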

As you can see, right now this is very procedural and could certainly use some refactoring, but let's get it to work first and worry about refactoring later.  Here it is in action:


As I mentioned before, I want to be able to support multiple source control solutions.  The idea here is something similar to what CI Factory uses.  In CI Factory you have what is known as a Package.  A Package is nothing more than a build script implementation of one build concern for a given product.  For example, you might have a source control package that uses Subversion and another source control package that uses SourceGear Vault.  You simply include the package for the source control product that you are using.  psake also allows you to include external scripts in your build process.  Here is how we would change what we have right now to support multiple source control solutions.

So I created a Packages folder under the folder where my psake.ps1 script resides.  I then created a file called SourceControl.SVN.ps1 in the Packages folder that looked like the following:

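A sketch of that file (again, $source_dir is a placeholder):

    # Packages\SourceControl.SVN.ps1
    # Subversion implementation of the source control package
    function SourceControl.GetLatest {
      & svn update $source_dir | out-null
      if ($LastExitCode -ne 0) {
        throw "svn update failed with exit code $LastExitCode"
      }
    }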

In the default.ps1 script's Compile task I replaced the source control get-latest code (told you I was going to refactor it) with a call to the SourceControl.GetLatest function.  I also added a call to the psake include function, passing in “Packages\SourceControl.SVN.ps1”.  Here is what the default.ps1 looks like now:

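Roughly this (same placeholders as before):

    properties {
      $base_dir    = Resolve-Path .
      $build_dir   = "$base_dir\build"
      $source_dir  = "$base_dir\src"
      $solution    = "$base_dir\MySolution.sln"
      $package_dir = "$base_dir\package"
    }

    # pull in the source control package for the repository we are using
    include "Packages\SourceControl.SVN.ps1"

    task default -depends Test

    task Clean {
      # "Clean" the local directory
    }

    task Compile -depends Clean {
      SourceControl.GetLatest
      & "C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe" $solution /Build "Release" | out-null
      if ($LastExitCode -ne 0) {
        throw "Compile failed with exit code $LastExitCode"
      }
    }

    task Test -depends Compile {
      # "Test" - run the NUnit tests
    }

    task Package -depends Test {
      # "Package" the results into a zip for easy xcopy deployment
    }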

So if I wanted to support SourceGear Vault, I would simply create another package file called SourceControl.Vault.ps1, place the implementation inside the GetLatest function, and change the include statement in the default.ps1 script to reference the Vault version of source control.  I plan on adding support for my unit tests the same way I did the source control; that way I can easily support multiple unit testing frameworks.


As you can see, it is pretty easy to use PowerShell for your build process.  This post was just a short introduction on how you might get started and leave behind that crazy XML syntax that has been used for so long in build scripting.  I have a lot more to do on this to make it actually usable for some of my small projects, but hopefully I can evolve it into something that will be easy to maintain and reliable.  All in all, I think PowerShell has some pretty cool ways of scripting some nice build solutions.

Charlotte Lean-Agile Open - Tools for Agile development

by Mike Linnen 24. March 2009 09:05

I had a lot of fun presenting at the Lean-Agile Open.  It was a good turnout too; I think there were at least 15 in my track.  I want to give a shout out and thank Guy Beaver from NetObjectives and the Ettain Group for inviting me to present and making it all happen.  I wish I would have been able to attend Guy's presentation, but I was a little bit busy.  Also, Steve Collins of Richland County Information Technology gave a powerful presentation on his experiences with Agile.  You could tell he was very passionate about Lean-Agile approaches.


If any of you reading this attended my session, I welcome feedback both good and bad; just send it along to me in an email.  Attached to the end of this post is the slide deck I used in the presentation.

Here are some links to the tools that I have used and that I spoke about in the presentation:

SourceGear Vault - Source control used by the developers and the build

NUnit - Unit Testing Framework used by the developers for TDD and Unit Tests. Also used in the build.

Test Driven .Net - Visual Studio add-in used to make doing TDD easier and to launch tests for debugging or code coverage purposes.

NCover - Code coverage of unit tests used by the developers and the build.

Rhino Mocks - Mocking our dependencies to make unit testing easier.

CI Factory - This was used to get our build up and running fast.  It includes build script solutions for many different build problems that you might want to solve.  It uses CruiseControl.Net (CCNet) under the hood. 

NAnt - Used to script the build.  If you are using CI Factory or CCNet, NAnt is already packaged with these products, so there is no need to download it.  The web site is a great resource when it comes time to alter the build.

WatiN - Used for UI Automated testing of web pages.  WatiN was used both by the developers and the build.

Sandcastle - Used by the build to create documentation of our code.

NDepend - Used by the build for static analysis of the code base and dependency graphs.   

IE Developer Toolbar - Internet Explorer add-in to analyze web pages.

Firebug - Firefox add-on to analyze web pages.

ScrewTurn Wiki - For documenting release notes and providing a place for customer feedback.

Google Docs - Sprint backlog and burn down charts.


As I stated in my session, the above tools are what I used and had good success with, but my needs may not be the same as yours, so you owe it to your team to evaluate your own tools based on your own needs.

ToolsForAgile.ppt (1.88 mb)



Adding Twitter Notifications to your build

by Mike Linnen 10. March 2009 21:01

I added Twitter posts to the FIRST FRC Field Management System so that interested parties could get the results of a match in near real time.  Since Twitter is focused around sending small messages, I thought it would be a great mechanism to notify team members when the status of the build changes.  Most build solutions already have a way to do this, but they come in the form of an email or a custom program that sits in your tray waiting to notify you.  Twitter messages, on the other hand, can be consumed many different ways (web, Twitter client, cell phone, etc.).  This gives each team member great flexibility in deciding how he/she wants to monitor the build process.  In this blog post I will show you how you can add Twitter build notifications to a build process.

First you should get a Twitter account so you can tell your team members what account to follow to get the notifications.  You might want to set up your Twitter account as private so you can manage who is allowed to follow it.  This brings up a good point: you should not send any sensitive data in your build message tweets, because the messages are sent across the wire and anyone can intercept them.

Next, go get the Yedda Twitter C# Library.  This is a C# wrapper around the Twitter API and it is very easy to use.  You can use the binary from the project or the Twitter class that is part of the project.

All build processes that I have used (TFS, NAnt, CCNet, and MSBuild) allow for command line applications to be called from the build script.  So we will use the Twitter.cs class found in the Yedda C# Library in a console application to expose its capability of sending Twitter updates.  Go ahead and create the console application and add the Twitter.cs class to it.  Then in the Program.cs Main method write some code to parse a few command line options to pass along to the Twitter.Update method.


Example command line call to the executable:

tc -user twitterUserName -password twitterPassword -message "Build Failed to compile"

Example tweet generated from the above command line:


Now all you have to do is put the new console executable in a place on your build box that is accessible by the automated build and change your build script to call it with the right message.  You can make the tweet a little more informative as to why the build failed, or you can have the build tweet at certain key points of the process so you know exactly what step the build is on.  Be creative, but don't send too many messages or the team members will soon ignore all build tweets as they end up being annoying.
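
For example, in a psake-style PowerShell build script like the one from my earlier post, you could wrap a task and tweet when it fails (the paths, properties, and credentials here are all placeholders):

    task Test -depends Compile {
      try {
        & "$build_dir\nunit-console.exe" "$build_dir\MyProject.Tests.dll"
        if ($LastExitCode -ne 0) { throw "Unit tests failed" }
      }
      catch {
        # tweet the failure, then re-throw so the build still fails
        & "$tools_dir\tc.exe" -user $twitter_user -password $twitter_password -message "Build failed: $_"
        throw
      }
    }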

Possibilities for improvement

  • You could make a Twitter client that monitors the build tweets, parses the message from the build, and reacts differently based on whether the build failed on a compiler error or a unit test.  Maybe some static analysis failed but it isn't severe enough to warrant immediate attention.  The client might try harder to grab the team member's attention if the severity of the message is high enough.
  • What about a Twitter client that parses the messages and controls a traffic light?  Green is build passed.  Red is build failed.  Yellow is unit tests failed.

New Twitter feed for FIRST FRC 2009 Field Management System

by Mike Linnen 24. February 2009 21:09

I have blogged several times before about my involvement in building the Field Management System that runs the FIRST FRC events.  Each year I have worked very hard with 2 other engineers on trying to build the best possible experience for the volunteers that run the event, the teams that participate, and the audience that attends.  This year we wanted to extend the experience beyond those that actually attend the event.  We wanted a way to announce the results of the matches as they happen on the field.  This has been done in the past by updating an HTML web page that gets posted on the FIRST web site, but we wanted something more that could be used by the teams in their quest for knowledge on what is happening during each event, on their device of choice.

So I am very pleased to say that this year's events will have Twitter updates for each match as they are completed on the field.  All you have to do is follow the FRCFMS Twitter account in order to get match updates from all events.  The tweets that are posted follow a specific format that should allow the teams to build really cool applications on top of the Twitter data.  Here is an example of the kind of tweet generated at our test event (the values here are illustrative):

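    #FRCTEST TYP Q MCH 12 ST C TIM 0 RFIN 52 BFIN 48 RED 111 222 333 BLUE 444 555 666 RCEL 2 BCEL 1 RROC 3 BROC 2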

As you can see, the tweet is a little hard to read since we are jamming a bunch of information into the 140 character limit, but it should be very easy to parse the information with a bot of some sort (see the sketch after the format definitions below).

The format is defined as follows:

#FRCABC - where ABC is the Event Code.  Each event has a unique code.

TYP X - where X is P for Practice, Q for Qualification, E for Elimination

MCH X - where X is the match number

ST X - where X is A for Autonomous, T for Teleoperated, C for Complete

TIM XXX - where XXX is the time left

RFIN XXX - where XXX is the Red Final Score

BFIN XXX - where XXX is the Blue Final Score

RED XXX YYY ZZZ - where XXX is red team 1 number, YYY is red team 2 number, ZZZ is red team 3 number

BLUE XXX YYY ZZZ - where XXX is blue team 1 number, YYY is blue team 2 number, ZZZ is blue team 3 number

RCEL X - where X is the red Super cell count

BCEL X - where X is the blue Super cell count

RROC X - where X is the red rock and red Empty Cell count

BROC X - where X is the blue rock and blue Empty Cell count

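To give you an idea of how simple the parsing can be, here is a quick PowerShell sketch (the sample tweet and variable names are illustrative, not part of the FMS):

    # split a match tweet into tag/value pairs using the known tags above
    $tweet  = '#FRCTEST TYP Q MCH 12 ST C TIM 0 RFIN 52 BFIN 48 RED 111 222 333 BLUE 444 555 666 RCEL 2 BCEL 1 RROC 3 BROC 2'
    $tags   = 'TYP','MCH','ST','TIM','RFIN','BFIN','RED','BLUE','RCEL','BCEL','RROC','BROC'
    $tokens = $tweet -split '\s+'

    $match = @{ Event = $tokens[0].TrimStart('#') }
    for ($i = 1; $i -lt $tokens.Length; ) {
      $tag = $tokens[$i++]
      $values = @()
      # collect everything up to the next known tag (RED and BLUE carry three team numbers)
      while ($i -lt $tokens.Length -and $tags -notcontains $tokens[$i]) { $values += $tokens[$i++] }
      $match[$tag] = $values
    }

    $match['RFIN']   # 52, the red final score
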
There are some cool ways you can use Twitter to get the information you want for a specific event.  Hop on over to Twitter Search and enter in the following: #FRCTEST TYP Q, and you will get a list of all qualifying matches for the TEST event.  When the events start this weekend you can substitute the TEST code with the event code of your choice.  The FIRST FRC Team Update has a list of all the valid event codes.

You can also use the search's RSS feed with your favorite RSS reader to get updates in RSS format.

If other tweeple are tweeting about the event using the same hashtag that the Field Management System uses, then you can hop on over to #hashtags and enter in the hashtag for the event to see all tweets for that event.  For example, try searching for #frctest and you will see all the tweets for the #frctest event that we have been running to test the Field Management System.

Although for week one the match tweets will only come at the end of each match, for week 2 we are thinking about upping the frequency of these tweets so that you get more of them while the match is in play.  This will make it very difficult for a human to read the tweets on a small device because there will be too many of them coming.  I would like to hear anyone's thoughts on what the frequency of tweets should be and whether they expect to be reading the tweets rather than parsing them with another tool.  Of course, if you intend to read the tweets and you are only interested in the final match result, you could use the advanced search capabilities to only view tweets that have the status of complete.  That search would look something like this:

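    #FRCTEST ST C

(That uses the ST token from the format above, so only tweets with a status of Complete come back; substitute the event code of your choice for TEST.)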

It will be really cool to see how the information we are posting is going to be used!



TFS Build reports partial fail even when all tests passed

by Mike Linnen 8. January 2009 09:33

I ran into a strange issue today while trying to figure out why my TFS build was reporting that the build was partially successful even though every test was passing.  The normal build report really did not give any good reason why it was partially successful, other than the fact that it was something related to the unit tests (I am using MS Test in this case).  So I cracked open the build log and peeled through the entries.  I noticed that when the code coverage was attempting to instrument the assemblies, it reported that several of the assemblies could not be located.  Then I remembered I had done some refactoring and had renamed and consolidated some assemblies.

Well, that should be an easy fix: all I had to do was remove these assemblies from the test run config in the Code Coverage section.  So I opened up the LocalTestRun.testrunconfig file in Visual Studio 2008 and selected the Code Coverage section to make my changes.  As soon as I did this, the config editor closed down (crashed).  Wow, that was weird; I had never seen that before.  Hmmm, I wonder what it could be.  Well, here is what I did to try and locate the issue.

  1. Perhaps the Test Run Config file needs to be checked out of source control for write access.  Nope, that wasn't it.
  2. Well, if I can't edit it in VS 2008 then I might as well try Notepad.  I removed the offending assemblies from LocalTestRun.testrunconfig using Notepad.  However, once I opened up the Test Run Config editor and selected the Code Coverage section, the editor still crashed.
  3. Perhaps I malformed the Test Run Config XML file.  So I opened it up again in Notepad and the XML looked fine.  Besides, if the XML were malformed, I don't think the Test Run Config editor would open at all.
  4. Consult the almighty search engine.  Wow, look what I found, and it was only reported 2 days ago.

So to be sure that I got the Test Run Config file right, I removed my Database project from my solution, made my edits for Code Coverage in the Test Run Config editor, and then added the Database project back into the solution.

After fixing the Test Run Config file my build ran successfully.

Vista Media Center

by Mike Linnen 28. December 2008 02:34

I have wanted to set up a Media Center PC for a long time now and I finally got a chance to do just that this weekend.  I have to say that Vista Media Center has really impressed me.  The flexibility of having a PC that manages many media elements such as pictures, music, and movies, and being able to stream that content to multiple devices in the house, is awesome.  At this point I don't have my Media Center PC hooked up to my current broadcasting provider, as I do not have a tuner capable of receiving the signal.  Not having a tuner is probably the one thing that kept me from setting this up for so long.  However, at the moment I am not disappointed about missing the tuner because there is so much that can be done with Media Center.

In my house we currently have 4 PCs and an XBOX 360.  I have all of them networked together for access to the Internet.  Beyond a small amount of file and printer sharing, the network just served as an Internet connection.  Well, now all that has changed.  With a central machine in place acting as the media center, all other Vista PCs have direct access to the same content.  This content is currently music, pictures, and recorded video.  It is such a nice thing to be able to stream this content to the XBOX 360.

I used to think that having a large coax video distribution network throughout the house was a requirement for getting video from one room to the next.  However, video cable routing to each room can be pretty expensive, and then you have to have some means of controlling the video source when you are in another room.  Making this solution all Ethernet based is a real nice alternative, especially when you have PCs all over the house anyway.  And the benefit of not limiting the content to video only is really cool too.

We have a pretty large DVD collection as well.  With kids in the house the DVDs always get some abuse; they soon start skipping or won't work at all.  So I have started saving our collection to the PC in order to preserve the original DVDs.  This works really well with a couple of added applications.  First of all, you need My Movies 2 in order to manage the collection and extend VMC to make it easy to view the movies on any PC in the house.  My Movies 2 comes as a server and a client component.  You only need to install the server part on the master media center; all other PCs get the client part of My Movies 2.  My Movies 2 has a really slick install that walks you through modifying the VMC menu options.  Next you need DVDfab in order to rip your collection.  I installed DVDfab on the master Media Center as that is where I will be managing my collection anyway.  Lastly, if you don't convert the ripped video files to a well known format for the XBOX 360, you will need another application called Transcode 360.  This product will take the video files and convert them on the fly to a compatible format for the XBOX 360.

I think my next step in getting a great audio visual experience in my house is to get one of the Media Center Extenders and place it in our living room.  I think the extender would provide me with all I need for living room entertainment. 

Beyond that the only thing I would be missing is a way to get my DirecTV recorded programs accessible from VMC.  As I see it there are 2 options for this.  Option 1 would be to get a DirecTV tuner that would go into my PC.  However this currently is not available.  Option 2 is to get my existing DirecTV DVRs on the network so that I can expose the programs to VMC. 

BlogEngine on IIS7

by Mike Linnen 20. December 2008 11:47
I migrated my hosting over to an IIS7 provider and I ran into a couple of problems that I thought I should blog about.  Of course, I was not the first person to move to IIS7 and experience similar issues, and fortunately someone blogged about their issues as well; that's what I used to resolve my problems.  So take a look at the CodePlex BlogEngine discussions for the original post.  I was having the exact same symptoms, where my CSS did not work and all links came up with a 404 error.  Once I applied the configuration changes that were mentioned in the discussion, I was up and running.

Visual Studio 2008 Database Project problems on Vista/Server 2008 64 bit

by Mike Linnen 19. September 2008 20:31

I recently bought a new HP laptop with 64 bit Vista on it.  This new laptop is going to be my desktop development machine replacement.  I experienced several problems trying to use an existing database project (from another machine) or creating a new one.  Every time the project tried to connect to the DB, an error would come up saying a connection could not be made because the instance could not be found.  I did several searches on the Internet and could not find any solution.  The only guidance I got was to be sure SQL Server 2005 Developer SP2 is installed before trying to use the database project.

First, a little background.  I wanted a dev machine that had SQL Server 2005 Developer, not SQL Server 2005 Express.  So I first installed SQL Server 2005 Developer and applied SP2.  I then proceeded to install Visual Studio 2008 without selecting the option to install SQL Express.  I also chose to install VS 2008 SP1.  After all this I tried to create a new SQL Server 2005 database project and walked through the wizard.  Once I got to the part where the wizard attempted to create the DB, I got an error.  The only thing I found to work was to go into Visual Studio, select Tools -> Options -> Database Tools -> Database Connections, where I noticed the SQL Server Instance name was set to SQLEXPRESS.  So I cleared this out as shown below:


There was also an instance name of SQLEXPRESS on the Design Time Validation Database, so I cleared that as well.


After that my database projects worked fine. 

About the author

Mike Linnen

Software Engineer specializing in Microsoft Technologies
