Using PowerShell in the build process

by Mike Linnen 8. April 2009 15:37

I have used NAnt and MSBuild for years on many projects, and I always thought there had to be a better way to script build processes.  Well, I took a look at PowerShell and psake, and so far I like it.  psake is a PowerShell script that makes it very easy to break your build script up into target tasks.  These tasks can also have dependencies on other tasks, which lets you call into the build script requesting a specific task and have the dependent tasks executed first.  This concept is nothing new to build frameworks, but it is a great starting point for using the goodness of PowerShell in a build environment.

You can get psake from the Google Code repository.  I first tried the download link for v0.21, but I had some problems getting it to run my tasks, so I went directly to the source and grabbed the tip version of psake.ps1 (r27 at the time) and my problems went away.

You can start off by using the sample default.ps1 script as a basis for your build process.  For some of my small projects I wanted a simple build that could do the following:

  • “Clean” the local directory
  • “Compile” the VS 2008 solution
  • “Test” the NUnit tests
  • “Package” the results into a zip for easy xcopy deployment

Here is what I ended up with as a starting point for my default.ps1.

[Screenshot: the initial default.ps1 build script]
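The screenshot of that script did not survive in this copy. Based on the tasks listed above, a sketch of what the initial default.ps1 probably looked like (the property values and solution name are placeholders, not the originals):

```powershell
# default.ps1 - hedged sketch of the starting build script
# (paths and MyProject.sln are illustrative placeholders)
properties {
  $base_dir  = Resolve-Path .
  $build_dir = "$base_dir\build"
  $sln_file  = "$base_dir\MyProject.sln"   # placeholder solution name
}

# With no target task specified, psake runs the task named "default"
task default -depends Test

task Clean {
  # wipe the previous build output
}

task Compile -depends Clean {
  # compile the VS 2008 solution
}

task Test -depends Compile {
  # run the NUnit tests
}

task Package -depends Test {
  # zip the results for xcopy deployment
}
```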

This doesn't really do anything yet except set up some properties for paths and define the target tasks I want to support.  The psake.ps1 script assumes your build script is named default.ps1 unless you pass in another script name as an argument.  Also, since a task named default is defined in my build script, if I don't pass in a target task the default task is executed, which I have set to run Test.

Build invoked without any target task:

[Screenshot: psake output with no target task; the default task runs Test]

Build invoked with the target task Package specified:

[Screenshot: psake output with the Package target specified]

So now I have the shell of my build, so let's get it to compile my Visual Studio 2008 solution.  All I have to do is add code to the Compile task to launch VS 2008, passing in some command line options.

[Screenshot: the Compile task launching VS 2008 from the command line]

And here it is in action:

[Screenshot: output of the build running the Compile task]

Notice I had to pipe the result of the command line call to “out-null”.  If I didn't do this, the call to VS 2008 would run in the background and control would be passed back to my PowerShell script immediately.  I want to be sure that my build script waits for the compile to complete before it continues on.
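A sketch of how that Compile task might look, assuming devenv.exe in its default VS 2008 location and the placeholder solution name from before:

```powershell
task Compile -depends Clean {
  # Assumed install path; adjust for your Visual Studio 2008 location
  $devenv = "C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe"
  # Piping to out-null makes the script block until devenv exits
  & $devenv $sln_file /Build Release | out-null
}
```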

What if the compile fails?  As it stands, the build script does not detect whether the compile completed successfully.  VS 2008 (and previous versions of VS) returns an exit code that indicates whether the compile was successful; if the exit code is 0, you can assume it was.  So all we need to do is test the exit code after the call to VS 2008, and PowerShell makes this easy with the $LastExitCode variable.  Throwing an exception in a task is detected by the psake.ps1 script, which stops the build for you.
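A sketch of the Compile task with the exit code check added (same placeholder paths as before):

```powershell
task Compile -depends Clean {
  & $devenv $sln_file /Build Release | out-null
  # devenv returns 0 on success; anything else means the build failed
  if ($LastExitCode -ne 0) {
    throw "Compile failed with exit code $LastExitCode"
  }
}
```

psake catches the exception and stops the build, so dependent tasks never run.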

[Screenshot: the Compile task checking $LastExitCode and throwing on failure]

I placed a syntax error in a source file, and when I call the Test target, the Compile target fails and the Test target never executes:

[Screenshot: the failed build output; the Test target never runs]

Now I want to add the ability to get the latest code from my source control repository.  Here is where I wanted the ability to support multiple source control repositories, like Subversion or SourceGear Vault.  But let's get it working first with Subversion and refactor it later to support other repositories.  For starters, let's simply add the get-latest support to the Compile task.

[Screenshot: the Compile task with the Subversion get-latest added inline]
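A sketch of that procedural version, assuming the svn.exe command-line client is on the path and $source_dir is a property pointing at the working copy (both are assumptions, not from the original screenshot):

```powershell
task Compile -depends Clean {
  # Get the latest code before compiling (inline for now)
  svn update $source_dir | out-null
  if ($LastExitCode -ne 0) { throw "svn update failed" }

  & $devenv $sln_file /Build Release | out-null
  if ($LastExitCode -ne 0) { throw "Compile failed with exit code $LastExitCode" }
}
```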

As you can see, right now this is very procedural and could certainly use some refactoring, but let's get it to work first and worry about refactoring later.  Here it is in action:

[Screenshot: output of the build including the Subversion update]

As I mentioned before, I want to be able to support multiple source control solutions.  The idea here is similar to what CI Factory uses.  In CI Factory you have what is known as a Package: a Package is nothing more than an implementation of one piece of the build process for a given product.  For example, you might have a source control package that uses Subversion and another source control package that uses SourceGear Vault.  You simply include the package for the source control product you are using.  psake also allows you to include external scripts in your build process.  Here is how we would change what we have to support multiple source control solutions.

So I created a Packages folder under the folder where my psake.ps1 script resides.  I then created a file called SourceControl.SVN.ps1 in the Packages folder that looked like the following:

[Screenshot: the SourceControl.SVN.ps1 package script]
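A sketch of what that package script might contain (again assuming svn.exe on the path and a $source_dir property; PowerShell allows dots in function names, which matches the SourceControl.GetLatest call described below):

```powershell
# Packages\SourceControl.SVN.ps1 - Subversion implementation of the
# source control package (hedged reconstruction, not the original)
function SourceControl.GetLatest
{
  svn update $source_dir | out-null
  if ($LastExitCode -ne 0) { throw "svn update failed" }
}
```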

In the default.ps1 script's Compile task I replaced the inline get-latest code (told you I was going to refactor it) with a call to the SourceControl.GetLatest function (line #20).  I also added a call to the psake include function (line #10), passing in “Packages\SourceControl.SVN.ps1”.  Here is what the default.ps1 looks like now:

[Screenshot: the refactored default.ps1 with the include call and SourceControl.GetLatest]
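A sketch of the refactored script; the line numbers mentioned above refer to the original screenshot, and the property values here are placeholders:

```powershell
properties {
  $base_dir   = Resolve-Path .
  $source_dir = "$base_dir\src"            # placeholder
  $sln_file   = "$base_dir\MyProject.sln"  # placeholder
  $devenv     = "C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe"
}

# Pull in the Subversion source control package
include .\Packages\SourceControl.SVN.ps1

task default -depends Test

task Clean { }

task Compile -depends Clean {
  # Delegated to the included package instead of inline svn calls
  SourceControl.GetLatest
  & $devenv $sln_file /Build Release | out-null
  if ($LastExitCode -ne 0) { throw "Compile failed with exit code $LastExitCode" }
}

task Test -depends Compile { }

task Package -depends Test { }
```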

So if I wanted to support SourceGear Vault, I would simply create another package file called SourceControl.Vault.ps1, place the implementation inside its GetLatest function, and change the include statement in the default.ps1 script to reference the Vault version of source control.  I plan on adding support for my unit tests the same way I did source control, so that I can easily support multiple unit testing frameworks.
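A hypothetical Vault package might look like this; the vault.exe client invocation and its arguments are assumptions for illustration, not tested:

```powershell
# Packages\SourceControl.Vault.ps1 - hypothetical SourceGear Vault package
function SourceControl.GetLatest
{
  # vault.exe is Vault's command-line client; these arguments are illustrative
  & vault.exe GET $repository_folder | out-null
  if ($LastExitCode -ne 0) { throw "Vault get latest failed" }
}
```

The default.ps1 would then include Packages\SourceControl.Vault.ps1 instead of the SVN version; nothing else changes.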

Conclusion

As you can see, it is pretty easy to use PowerShell for your build process.  This post was just a short introduction to how you might get started and leave behind the crazy XML syntax that has been used for so long in build scripting.  I have a lot more to do to make this actually usable for some of my small projects, but hopefully I can evolve it into something easy to maintain and reliable.  All in all, I think PowerShell has some pretty cool ways of scripting nice build solutions.

BX24 and PowerShell for managing a build process

by Mike Linnen 18. July 2006 21:01

I have been doing some BX24 development again lately.  I have also been reading a lot about the new shell Microsoft has pre-released, called PowerShell (formerly known as Monad).  Since I have been using the same batch files and VBScript files to manage my build process for BasicX source since 2001, I thought it might be time to look at an alternative.

I need to be able to do the following:
  • Perform command line compiles of the BX24 project
  • Allow for the source to reside anywhere on the hard drive and still be able to compile.
  • Initiate a compile of all BX24 projects so I do not have to do them one at a time
  • Parse the BasicX.err file to determine if the compiler found errors
  • Launch an editor that shows the BasicX.err file only when an error exists
  • Be able to manage some registry entries specific to the BasicX IDE
  • Have a limited set of scripts that do not require any changes to support the build process
  • Allow for multiple project files to co-exist in the same folder. This means I need to save the BasicX.err file off to another file if I want to preserve the results of each compile.

After reading a bit about PowerShell, it was very apparent that it would support anything I needed to do.  The main hurdle I needed to overcome was learning the PowerShell syntax.  Fortunately it is based on the .NET Framework, so the majority of it was fairly easy to adjust to.

Since I already had a VBScript file that did most of the above tasks, I started by dissecting what it did.  The last time I touched this script was in 2001.  It handled changing the registry entries and launching the compiler, but it had no support for parsing the error file or managing many project files.  Here is the script that I ended up with:

param ([string]$WorkingDirectory)
# Define some script variables
$chip_type = "BX24"
# Save the current directory so we can return to it
Push-Location
# If a working directory was passed in, change to it
If ($WorkingDirectory) {Set-Location $WorkingDirectory}
# Get the project files to process
$projectFiles = Get-ChildItem *.bxp
foreach ($project in $projectFiles)
{
    $project_file = $project.name.split(".")[0]
    # Use the current directory as the working directory
    $work_dir = $project.DirectoryName
    # Set some registry entries for the BasicX IDE
    $configEntry = "hkcu:\software\vb and vba Program Settings\basicx\config"
    Set-ItemProperty ($configEntry) -Name Chip_Type -value $chip_type
    Set-ItemProperty ($configEntry) -Name Work_Dir -value $work_dir
    # Determine from the registry where the BasicX executable is installed
    $program_dir = Get-ItemProperty ($configEntry) -Name Install_Directory
    # Map the P drive to the BasicX install directory for convenience
    if (Test-Path p:) {} else {subst P: $program_dir.Install_Directory}
    # Remove the error file if it exists
    if (Test-Path basicx.err) {del basicx.err}
    if (Test-Path ($project_file + ".err")) {del ($project_file + ".err")}
    # Launch the compiler
    P:\basicx.exe $project_file /c
    # Wait for the compiler to finish
    $processToWatch = Get-Process basicx
    $processToWatch.WaitForExit()
    # Unmap the P: drive
    if (Test-Path p:) {subst P: /d}
    # Check for errors and launch the error file if any exist
    $CompileResult = get-content basicx.err
    If (($CompileResult -match "Error in module").Length -gt 0) {notepad basicx.err}
    # Copy the error file off so it does not get overwritten when multiple
    # projects are being compiled in a single directory
    copy-item basicx.err -destination ($project_file + ".err")
}
# Restore the original location
Pop-Location

Well, that was pretty painless.  I basically had a script that processed all the BasicX project files in a given folder.  Next I needed another script that found all the project folders under a given folder, including projects in sub folders, and launched the script above to do each compile.  I ended up with the following script:

# Save the current directory so we can return to it
Push-Location
Set-Location ..\
# Get a list of all projects, sorted by folder
$project_Files = Get-ChildItem -recurse -include *.bxp | sort DirectoryName
$lastDir = ""
foreach ($project in $project_Files)
{
    # Since we can have multiple projects in a folder and we send the
    # working folder to the build script, we want to skip folders we have
    # already processed
    if ($lastdir -ne $project.DirectoryName)
    {
        ./tools/build $project.DirectoryName
        $lastDir = $project.DirectoryName
    }
}
# Restore the original location
Pop-Location

Well, that too was pretty easy.  I am beginning to really respect the power of PowerShell.  I can do so much more than I could with VBScript, and do it more easily.  Later I will put together a sample BX24 project showing how I use these scripts and the folder structure I place them in.

About the author

Mike Linnen

Software Engineer specializing in Microsoft Technologies
