Archive

Posts Tagged ‘CI’

Having Jasmine test results in TeamCity via node.js (on Windows) invoked from PowerShell

September 26th, 2011 2 comments

I test my JavaScript code via Jasmine on a Windows machine. I primarily write jQuery-plugin-style code. Now I need to get this onto CI. A colleague I worked with took my test strategy in Jasmine and wrote a library to run it very quickly on node.js. There are other places showing how to integrate TeamCity with Jasmine via JsTestDriver or QUnit, and there is also documentation on how to easily integrate service messages with TeamCity.

One caveat: I am not trying to test cross-browser functionality. Therefore, I don’t need or want a browser, or the associated slowness and brittleness of cross-process orchestration. (Note: I have tried to stabilise these types of tests using NUnit and MSTest runners invoking Selenium and/or WatiN – it gets unstable quickly and there is too much wiring up.)

So this approach is simple and blindingly fast thanks to Andrew McKenzie’s jasmine-node-dom, which is an extension of jasmine-dom. He wrote his for Linux and his example is with Jenkins, so I have forked a version for Windows which includes a node.exe binary, which is available from node.js.

Anyway, this blog post covers the PowerShell script to invoke jasmine-node-dom and publish the results to TeamCity.

Here’s the script:

build.ps1 (or directly in TeamCity)

	
	$node_dir = "node-jasmine-dom\bin"	

	& "$node_dir\node.exe" "$node_dir\jasmine-dom" `   
				--config tests.yaml `
				--format junit `
				--output javascript-results.xml 

	write-host "##teamcity[importData type='junit' path='javascript-results.xml']"    
	

An explanation if it isn’t obvious. First, the files that are needed: I have the Windows node-jasmine-dom installed in its own directory. I then call node.exe with jasmine-dom. That should all just work out of the box. I then tell it where the manifest is that knows about the tests (tests.yaml – see below for an example) and where to write the results file. jasmine-node-dom is great because it reads the SpecRunner.html and reconstructs the DOM enough that the tests are valid.

Finally, I tell TeamCity to read the results in as JUnit data via a service message. This is very easy, and I recommend that you look at what else service messages can do for you.

tests.yaml

	---
	  test_one:
	    name: Example test one
	    runner: ./tests/SpecRunner.html

This YAML file points to the Jasmine runner.

Other points:

* All my Jasmine tests are invoked from their own SpecRunner.html file by convention
* I will write a script that automatically generates the yaml file (a rough sketch follows this list)
* I always put all my PowerShell scripts into psake scripts (then they can be run by the dev or the build machine)
* my code isn’t laid out quite as above
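
The generation script doesn’t exist yet; the sketch below is roughly what I have in mind – walk the tests/ folder for SpecRunner.html files and emit one tests.yaml entry per runner. The folder layout and naming here are assumptions based on my conventions, not a published script.

# Sketch: build tests.yaml from every SpecRunner.html under tests/.
# Folder layout and naming convention are assumptions, not a published script.
$runners = Get-ChildItem -Path .\tests -Recurse -Filter SpecRunner.html

$yaml = "---`r`n"
$i = 1
foreach ($runner in $runners) {
    $name = $runner.Directory.Name                          # folder name becomes the test name
    $path = (Resolve-Path $runner.FullName -Relative) -replace '\\', '/'
    $yaml += "  test_$($i):`r`n    name: $name`r`n    runner: $path`r`n"
    $i++
}

Set-Content -Path tests.yaml -Value $yaml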

Summary Instructions:

  1. Download jasmine-node-dom and install it in the tools\ directory
  2. Add a new task to your build scripts (I use psake – see the sketch after this list)
  3. Add a new tests.yaml manifest (or generate one each time)
  4. Add new Jasmine tests via SpecRunner.html with your JavaScript
  5. Ensure that the build script is run via a build step in a TeamCity configuration
  6. Now you can inspect the tests in TeamCity
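
For what it’s worth, the psake task in step 2 might look something like this sketch once it is Include’d from default.ps1 (the task name, the tools\ location and the file names are my assumptions rather than an actual build script):

# Sketch of a psake task that runs the Jasmine specs and reports to TeamCity.
# Assumes the forked node-jasmine-dom sits under tools\ and tests.yaml is at the root.
Properties {
  $node_dir = "tools\node-jasmine-dom\bin"
  $results = "javascript-results.xml"
}

Task Test-Javascript {
  & "$node_dir\node.exe" "$node_dir\jasmine-dom" `
      --config tests.yaml `
      --format junit `
      --output $results

  # Only meaningful when run under TeamCity; harmless on a dev box
  Write-Host "##teamcity[importData type='junit' path='$results']"
}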

Categories: Uncategorized

Sharepoint & TDD: Getting started advice

July 1st, 2011 4 comments

I have had a couple of people asking lately about starting on SharePoint. They’ve asked about how to move forward with unit and integration testing and stability. No one wants to go down the mocking route (TypeMock, Pex and Moles), and quite rightly. So here’s my road map:

The foundations: Hello World scripted and deployable without the GUI

  1. Get a Hello World SharePoint “app” – something that is packageable and deployable as a WSP
  2. Restructure the folders of the code away from the Microsoft project structure so that the root folder has src/, tools/, lib/ and scripts/ folders. All source and tests live in the src/ folder. This lays the foundation for a layered code base. The layout looks like this sample application.
  3. Make the compilation, packaging, installation (and configuration) all scripted. Learn to use psake for your build scripts and PowerShell more generally (particularly against the SharePoint 2010 API). The goal here is that devs can build and deploy through the command line; as such, so too can the build server. I have a suggestion here that still stands, but I need to blog on improvements. Most notably, don’t split out tasks but rather keep them in the same default.ps1 (because -docs works best). Rather than getting reuse at the task level, do it as functions (or cmdlets). Also, I am now moving away from the mix with msbuild that I blogged here and am moving those tasks into PowerShell. There is no real advantage other than fewer files and a reduced mix of techniques (and lib inclusions). A minimal sketch of the sort of default.ps1 I mean follows this list.
  4. Create a build server and link this build and deployment to it. I have been using TFS and TeamCity. I recommend TeamCity but TFS will suffice. If you haven’t created Build Definitions in TFS Workflow, allow days-to-weeks to learn it. In the end, but only in the end, it is simple. Be careful with TFS: the paradigm there is that the build server does tasks that devs don’t. It looks like a nice approach; I don’t recommend it, and there is nothing here by design that makes it inevitable. In TFS, you are going to need to build two build definitions: SharePointBuild.xaml and SharePointDeploy.xaml. The build is a compile, package and test. The deploy simply deploys to an environment – Dev, Test, Pre-prod and Prod. The challenge here is to work out a method for deploying into environments. In the end, I wrote a simple self-hosted windows workflow (xamlx) that did the deploying. Again, I haven’t had time to blog the sample. Alternatively, you can use psexec. The key is that for a SharePoint deployment you must be running on the local box, and most configurations have a specific service account for permissions. So I run a service for deployment that runs under that service account.
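
As a rough illustration of these foundations, a minimal default.ps1 might look like the sketch below; the solution and project names, the paths and the assumption that msbuild.exe and the SharePoint snap-in are available are all placeholders, not my actual scripts (which I still need to blog).

# Minimal psake sketch of the foundations: compile, package and deploy.
# Solution/project names and paths are placeholders; assumes msbuild.exe
# and the SharePoint 2010 snap-in are available on the box.
Properties {
  $base_dir = resolve-path .
  $solution = "$base_dir\src\HelloWorld.sln"
  $project  = "$base_dir\src\UI-HelloWorld\UI-HelloWorld.csproj"
  $wsp      = "$base_dir\src\UI-HelloWorld\bin\Release\UI-HelloWorld.wsp"
}

Task default -Depends Package

Task Compile {
  & msbuild $solution /t:Rebuild /p:Configuration=Release
}

Task Package -Depends Compile {
  # The SharePoint project type exposes a Package target that produces the wsp
  & msbuild $project /t:Package /p:Configuration=Release
}

Task Deploy -Depends Package {
  Add-PsSnapin Microsoft.SharePoint.PowerShell
  Add-SPSolution -LiteralPath $wsp
  Install-SPSolution -Identity (Split-Path $wsp -Leaf) -GACDeployment -Force
}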

Now that you can reliably and repeatably test and deploy, you are ready to write code!

Walking Skeleton

Next is to start writing code based on a layered strategy. What we have found is that we need to do two important things: (1) always keep our tests running on the build server and (2) attend to keeping the tests running quickly. This is difficult in SharePoint because a lot of code relates to integration and system tests (as defined by the test automation pyramid). We find that integration tests that require setup/teardown of a site/features get brittle and slow very quickly. In this case, reduce setup and teardown in the system tests. However, I have also had a case where an integration test showed that a redesigned object (that facaded SharePoint) would give better testability for little extra work.

  1. Create 6 more projects based on a DDD structure (Domain, Infrastructure, Application, Tests.Unit, Tests.Integration & Tests.System). Also rename your SharePoint project to UI-[Your App]; this avoids naming conflicts on a SharePoint installation. We want to create a ports-and-adapters application around SharePoint. For example, we can wrap property bags with the repository pattern. This means that we create domain models (in Domain), return them from repositories (in Infrastructure), and can test with integration tests.
  2. System tests: I have used StoryQ with the team to write tests because it allows for a setup/teardown and then multiple test scenarios. I could use SpecFlow or NBehave just as easily.
  3. Integration tests: these are written in classical TDD style.
  4. Unit tests: these are also written in classical TDD/BDD style.
  5. JavaScript tests: we write all JavaScript code using a jQuery plugin style (aka Object Literal) – in this case, we use JSSpec (but I would now use Jasmine) – we put all tests in Tests.Unit but the actual JavaScript is still in the UI-SharePoint project. You will need two sorts of tests: Example for exploratory testing and Specs for the Jasmine specs. I haven’t blogged about this and need to, but it is based on my work for writing jQuery plugins with tests.
  6. Deployment tests: these are tests that run once the application is deployed. You can go to an ATOM feed which returns the results of a series of tests that run against the current system. For example, we have a standard set which tells us the binary versions and which migrations (see below) have been applied. Others check whether a certain wsp has been deployed, whether different endpoints are listening, etc. I haven’t blogged this code and mean to – it has been great for testers to see if the current system is running as expected. We also get the build server to pass/fail a build based on these results (a rough sketch of that check follows this list).
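
To give a flavour of how the build server consumes those deployment tests, the sketch below pulls the results feed and fails the build if anything failed; the URL and the shape of the feed entries are made up for illustration – ours differs.

# Sketch: read the deployment-test feed and fail the build on any failure.
# The URL and the entry/summary shape are illustrative assumptions only.
$feedUrl = "http://intranet/sites/myapp/_layouts/myapp/deployment-tests"

$client = New-Object System.Net.WebClient
$client.UseDefaultCredentials = $true
[xml]$feed = $client.DownloadString($feedUrl)

# Assume each atom entry carries a test name (title) and a pass/fail summary
$failed = @($feed.feed.entry | Where-Object { $_.summary -match "fail" })

if ($failed.Count -gt 0) {
    $failed | ForEach-Object { Write-Host "Failed: $($_.title)" }
    throw "Deployment tests failed"   # non-zero exit fails the build step
}
Write-Host "All deployment tests passed"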

We don’t use Pex and Moles. We use exploratory testing to ensure that something actually works on the page.

Other bits you’ll need to sort out

  • Migrations: if you have manual configurations for each environment then you’ll want to script/automate this. Otherwise, you aren’t going to have one-click deployments. Furthermore, you’ll need to assume that each environment is in a different state/version. We use migratordotnet with a SharePoint adapter that I wrote – it is here for SharePoint 2010 – there is also a PowerShell runner in the source to adapt – you’ll need to download the source and compile it. Migrations as an approach works extremely well for feature activation and publishing.
  • Application Configuration: we use domain models for configuration and then instantiate via an infrastructure factory – certain configs require SharePoint knowledge
  • Logging: you’ll need to sort this out via a Service Locator because in tests you’ll swap it out for Console.Logger
  • WebParts: can’t be in a strongly typed binary (we found we needed another project!)
  • Extension Methods to Wrap SharePoint API: we also found that we wrapped a lot of SharePoint material with extension methods

Other advice: stay simple

For SharePoint developers not used to object-oriented programming, I would stay simple. In this case, I wouldn’t create code with abstractions that allowed you to unit test like this. I found in the end that the complexity required for testability outweighed the simplicity and maintainability.

Microsoft itself has recommended the Repository Pattern to facade the SharePoint API (sorry, I can’t for the life of me find the link). This has been effective. It is so effective we have found that we can facade most SharePoint calls in two ways: a repository that returns/works with a domain concept, or a Configurator (which has the single public method Process()). Any more than that and it was really working against the grain. All cool, very possible, but not very desirable for a team which rotates people.

SharePoint TDD Series: Maintainability over Ease

December 16th, 2010 No comments

This series is part of my wider initiative around the Test Automation Pyramid. Previously I have written about ASP.NET MVC. This series will outline a layered code and test strategy in SharePoint.

SharePoint is a large and powerful system. It can cause problems in the enterprise environment, incurring delays, cost and general frustration. Below is an overview of the main areas of innovation made in the source code to mitigate these problems. These problems arise because the fundamental design of SharePoint is to be an “easy” system to code with. Easy in this sense means a system that can be configured by a general pool of developers and non-developers alike. Such a design, however, does not necessarily make the system maintainable. Extension, testability and stability may all suffer. In enterprise environments these last qualities are equally if not more important to the long-term value of software.

This series of posts outlines both the code used and the reasons behind its usage. As such it is a work in progress that will need to be referred to and updated as the code base itself changes.

Deployment Changes

Layered Code with testing

Testing on Event Receiver via declarative attributes

Testing Delegate controls which deploy jQuery

  • Part 5 – Client side strategies for javascript
  • Part 6 – Unit testing the jQuery client-side code without deploying to SharePoint
  • Part 2 – Unit testing the delegate control that houses the jQuery
  • Part 4 – Exploratory testing without automation is probably good enough

Cross-cutting concerns abstractions

SharePoint deployment packaging

November 27th, 2010 No comments

In a previous blog post I discussed a strategy for adding migrations and scripting to the deployment of SharePoint solutions. What was implicit there was that we were packaging the SharePoint solutions (wsp files) into a release package that had scripting. I documented here the types of scripts we used in order to get automated deployments.

Taking a step back, the reasons we went down this path were that, in practice:

  1. in the first sprint, we lost 2.5 days times 3 developers’ worth of work on SharePoint API funkiness alone (eg poor error reporting, API bugs)
  2. developers are often waiting long periods of time for deployments
  3. when we did do “quick” deploys they were unreliable (ie pushing into the 14 hive), say with a JavaScript deployment (eg CKS.Dev.Server.vsix)

To do this, we had to take a step back from delivering features and instead deliver stability. This actually took us about a month to get under control – I wish I could send a particular corporation the bill for an under-finished product.

Therefore, beware. If you are reading this and this is new to you, you are likely to need to build in an initial theme over the first month or so: stability of the code base and its deployment. This is of course not a problem exclusive to SharePoint.

Approach:

  • map out the value stream for deployment through environments and identify pain points (result: scripting but not automation)
  • start scripted deployments
  • solve instability issues

Solutions:

  • versioned release packages – see below for the package.proj msbuild tasks
  • scripted deployments
  • migrations as part of deployment
  • general rule of no

Results

  1. a standard deployment now takes between 1-3 mins (each scripted installation reports time taken)
  2. the biggest lag is still the bootstrapping process that SharePoint undergoes with a new deployment

package.proj

This packaging project is an adaptation of the sample deployment project found here. In that sample, you’ll see the need for dependencies in the lib/ folder (such as 7zip for compression). Down in the script you might also notice that we include migratordotnet for managing releases, which actually requires the original binaries – I need to get back to this issue and try not to include binaries like this.

<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="HelpBuild"  ToolsVersion="3.5" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
	    <Major>0</Major>
	    <Minor>1</Minor>
	    <Build>0</Build>
	  </PropertyGroup>

  <PropertyGroup>
    <Revision Condition="'$(Revision)'==''">0</Revision>
    <Version Condition="'$(Version)'==''">$(Major).$(Minor).$(Build).$(Revision)</Version>
    <DropLocation Condition="'$(DropLocation)'==''">$(MSBuildProjectDirectory)\CodeToDeploy\Publish</DropLocation>
    <BuildCmd Condition="'$(BuildCmd)'==''">ReBuild</BuildCmd>
    <ReleaseEnvironment Condition="'$(ReleaseEnvironment)'==''">Test</ReleaseEnvironment>


    <ReleaseName>MySites-$(ReleaseEnvironment)</ReleaseName>
    <ReleasePath>$(DropLocation)\..\Releases</ReleasePath>
    <DropLocationWsp>$(DropLocation)\wsp</DropLocationWsp>
    <BinariesRoot>$(MSBuildProjectDirectory)\src\Site</BinariesRoot>
    <LibRoot>$(MSBuildProjectDirectory)\lib</LibRoot>
    <WspRoot Condition="'$(WspOutDir)'==''">$(BinariesRoot)</WspRoot>
    <ReleaseZipFile>$(ReleasePath)\Site-$(Version).zip</ReleaseZipFile>

    <ExtractPath>$(DropLocation)\..\Deploy</ExtractPath>

    <Zip>$(MSBuildProjectDirectory)\lib\7z\7z.exe</Zip> 
  </PropertyGroup>

  <ProjectExtensions>
    <Description>Build Releasable MySites</Description>
  </ProjectExtensions>

  <ItemGroup>
    <SolutionToBuild Include="src\Site.sln">
      <Properties>Configuration=Release;Platform=x64;OutDir=$(WspRoot)\bin\x64\Release\</Properties>
    </SolutionToBuild>
    <WspToBuild Include="$(WspRoot)\UI.csproj">
      <Properties>Configuration=Release;Platform=x64;OutDir=$(WspRoot)\bin\x64\Release\</Properties>
    </WspToBuild>
    <WspFiles Include="$(BinariesRoot)\**\*\*.wsp" />
  </ItemGroup>

  <ItemGroup>
	  
    <TasksFiles Include="scripts\deploy.ps1" />
    <TasksFiles Include="scripts\deploy\migrations.ps1;
                         scripts\deploy\deploy.ps1;
                         scripts\psake.psm1;
                         scripts\deploy\install.ps1" />
    
    <MigrationFiles Include="$(WspRoot)\bin\x64\Release\Infrastructure.dll;
                             $(WspRoot)\bin\x64\Release\Domain.dll" /> 
    <MigratorFiles Include="$(LibRoot)\migratordotnet\*" /> 
  </ItemGroup>

  <Target Name="Package" DependsOnTargets="Clean;Version-Writeable;Version;Compile;Version-Reset;Version-ReadOnly;Publish;Zip"/>
  <Target Name="Install" DependsOnTargets="Package;Extract;Deploy"/>

  <Target Name="Compile">
    <MSBuild Projects="@(SolutionToBuild)" Targets="Rebuild" />
    <CallTarget Targets="PackageWsp" />
  </Target>

  <Target Name="PackageWsp">
    <MSBuild Projects="@(WspToBuild)" Targets="ReBuild;Package" />
  </Target>

  <Target Name="Publish">
    <MakeDir Directories="$(DropLocationWsp)" Condition = "!Exists('$(DropLocationWsp)')" />
    
    <Copy SourceFiles="@(WspFiles)" DestinationFolder ="$(DropLocationWsp)" SkipUnchangedFiles="true"/>
    <Copy SourceFiles="@(TasksFiles)" DestinationFolder ="$(DropLocation)\scripts\%(TasksFiles.RecursiveDir)" SkipUnchangedFiles="true" />
    
    <Copy SourceFiles="@(MigrationFiles)" DestinationFolder ="$(DropLocation)\lib\%(MigrationFiles.RecursiveDir)" SkipUnchangedFiles="true" />
    <Copy SourceFiles="@(MigratorFiles)" DestinationFolder ="$(DropLocation)\lib\migratordotnet\%(MigratorFiles.RecursiveDir)" SkipUnchangedFiles="true" />
    
    <Exec Command="echo .> &quot;$(DropLocation)\$(Version)&quot;"/>
    
    <Exec Command="echo powershell -ExecutionPolicy Unrestricted -Command &quot; &amp;{import-module .\scripts\psake.psm1; Invoke-Psake Install}&quot; > &quot;$(DropLocation)\Install.bat&quot;"/>
    <Exec Command="echo powershell -ExecutionPolicy Unrestricted -NoExit -Command &quot; &amp;{import-module .\scripts\psake.psm1; Invoke-Psake -docs}&quot; > &quot;$(DropLocation)\here.cmd&quot;"/>  
    <Exec Command="echo Include .\scripts\migrations.ps1; Include .\scripts\deploy.ps1; Include .\scripts\install.ps1 > &quot;$(DropLocation)\default.ps1&quot;"/>  
  </Target>

  <Target Name="Clean" DependsOnTargets="CleanPublish;CleanReleases" >
    <RemoveDir Directories="$(DropLocation)\.." />
  </Target>

  <Target Name="CleanPublish">
    <RemoveDir Directories="$(DropLocation)" />
  </Target>

  <Target Name="CleanReleases">
    <RemoveDir Directories="$(ReleasePath)" />
    <RemoveDir Directories="$(ExtractPath)" />
  </Target>

  <Target Name="Zip">
    <MakeDir Directories="$(ReleasePath)" Condition = "!Exists('$(ReleasePath)')" />
    <Exec Command="$(Zip) a -tzip %22$(ReleaseZipFile)%22" WorkingDirectory="$(DropLocation)"/>
  </Target>

  <Target Name="Extract">
    <MakeDir Directories="$(ExtractPath)"  Condition = "!Exists('$(ExtractPath)')"/>
    <Exec Command="$(Zip) x %22$(ReleaseZipFile)%22 -o$(ExtractPath)" WorkingDirectory="$(DropLocation)"/>
  </Target>

  <Target Name="Deploy">
    <Exec Command="$(ExtractPath)\deploy.bat $(ExtractPath)"  WorkingDirectory="$(ExtractPath)" ContinueOnError="false" />
  </Target>


  <Target Name="HelpBuild">
    <Message Text="

    msbuild /t:Package

    Examples:
      msbuild /t:Install
      msbuild /t:Install /p:ReleaseEnvironment=Test
      msbuild @build.properties /t:Package /v:d 

    Variables that can be overridden:
      DropLocation=C:\Binaries
      ReleaseEnvironment=Dev|[Test]|Prod
      BuildCmd=Build|[Rebuild] 

    Targets:
     - Compile
     - Clean
     - CleanReleases
     - Publish
     - Zip
     - Extract
     - Deploy
     - PackageWsp

     - Package (Compile;Clean;Publish;Zip)
     - Install (Compile;Clean;Publish;Zip;Extract;Deploy)

    Log output: msbuild.log
             " />
  </Target>

</Project>	
Categories: Uncategorized

SharePoint deployment scripting with PowerShell

November 26th, 2010 No comments

PowerShell scripting approach for SharePoint (via psake)

This is a simple plea to SharePointers. Can we please have a decent build runner around PowerShell scripts in the deployment environment? Most developers in other frameworks have one. So I thought that I would outline some basic techniques. psake is as good as any other option (except rake, which I still think is vastly superior, but I don’t want the operational hassle of needing it in other environments). Here’s what I’m doing:

  • PowerShell wrapped in psake
  • modularised scripts
  • psake helper scripts for the real world (and documented in other places)
  • helper scripts for ease of access to command-line PowerShell

One caveat: psake has a couple of limitations in what is otherwise a good rake port:

  • it doesn’t namespace (AFAIK) tasks and that means it effectively works in the global namespace – this means we get conflicts and have to resort to poor naming practices
  • it doesn’t create good documentation for nested scripts – that seriously sucks – I want to be able to do -docs and read what I can do
  • it really doesn’t do logging well when moving across scripts (Start-Transcript seems to suck – but that might be my knowledge)

If you don’t go down this road, what you end up with is a load of PowerShell scripts that we double-click on. That’s not my idea of well structured or maintainable. Skip straight down to deploy.ps1 if you want to see what the scripts look like rather than all the plumbing. And, yes, there is way too much plumbing! Here is the script where I actually do the packaging up of these files.

File layout

default.ps1
here.cmd
deploy.cmd
scripts/
    psake.psm1
    psake1.ps1
    deploy.ps1
    install.ps1
wsp/
    MySolution.wsp

Quick explanation:

  • psake1.ps1, psake.psm1 – both come with psake. The psm1 is a module and the engine of the make-style DSL – go there to understand what it does. psake1.ps1 is a helper wrapper.
  • default.ps1 is a manifest that links all your psake tasks – psake looks for default.ps1 by, well, default.
  • deploy.cmd is a GUI-based wrapper that invokes a specific task (deploy in this case)
  • here.cmd brings up a command line interface so that I can use all the psake tasks (eg Invoke-psake Deploy)
  • deploy.ps1, install.ps1 are my two specific sets of tasks – deploy has SharePoint cmdlets linked in here (install is a further wrapper that does logging and in my own case also invokes other scripts I haven’t included like migrations)

How to use

Option One:

  1. Click deploy.cmd

Option Two:

  1. Click here.cmd – this will list out the available commands
  2. Type psake commands at the prompt, eg: Invoke-psake Deploy

Files

deploy.cmd

powershell -ExecutionPolicy Unrestricted -NoExit -Command "&{import-module .\scripts\psake.psm1; Invoke-Psake Install}"

default.ps1

Include .\scripts\deploy.ps1; 
Include .\scripts\install.ps1

here.cmd

powershell -ExecutionPolicy Unrestricted -NoExit -Command " &{import-module .\scripts\psake.psm1; Invoke-Psake -docs}"

psake1.ps1

import-module .\scripts\psake.psm1

Start-Transcript -Path .\install.log
invoke-psake @args
Stop-Transcript

remove-module psake

deploy.ps1

$framework = '3.5x64'
Properties {
  $base_dir = resolve-path .
  $solution = "MySolution.wsp"
  $application = $null
  $path = "$base_dir\wsp\$solution"
  $SolutionFileName = ""
}

Task Deploy -Depends Solution-Setup, Solution-UnInstall, Solution-Remove, Solution-Add, Solution-Install, Solution-TearDown

Task Solution-Remove -Depends Solution-Wait {
  Write-Host 'Remove solution'
  Remove-SPSolution -Identity $solution -Confirm:$false
}

Task Solution-Add {
  Write-Host 'Add solution'
  Add-SPSolution -LiteralPath $path
}

Task Solution-Install {
  Write-Host 'install solution'
  if($application -eq $null)
  {
      Install-SPSolution -identity $solution -GACDeployment -force
  }
  else
  {
      Install-SPSolution -identity $solution -application $application -GACDeployment -force
  }
}

Task Solution-UnInstall {
  Write-Host 'uninstall solution'
  if($application -eq $null)
  {
     Uninstall-SPSolution -identity $solution -confirm:$false
  }
  else
  {
     Uninstall-SPSolution -identity $solution -application $application -confirm:$false
  }  
}

Task Solution-Setup {
  Add-PsSnapin Microsoft.SharePoint.PowerShell
}

Task Solution-TearDown -Depends Solution-Wait {
  Remove-PsSnapin Microsoft.SharePoint.PowerShell
}

Task Solution-Wait {
    $JobName = "*solution-deployment*$SolutionFileName*"
    $job = Get-SPTimerJob | ?{ $_.Name -like $JobName }
    if ($job -eq $null) 
    {
        Write-Host 'Timer job not found'
    }
    else
    {
        $JobFullName = $job.Name
        Write-Host -NoNewLine "Waiting to finish job $JobFullName"

        while ((Get-SPTimerJob $JobFullName) -ne $null) 
        {
            Write-Host -NoNewLine .
            Start-Sleep -Seconds 2
        }
        Write-Host  "Finished waiting for job.."
    }
}

install.ps1

Task Install -Depends Logging-Start, Deploy, Logging-Stop

Task Logging-Start {
  Start-Transcript -Path .\install.log
}

Task Logging-Stop {
  Stop-Transcript
}
Categories: Uncategorized

SharePoint deployment in an enterprise environment

November 26th, 2010 2 comments

SharePoint Installations: Strategy to aid continuous integration

This document outlines an approach to managing SharePoint installations. Starting from the current deployment approach, which is primarily manual, an improved approach is outlined that manages the difference between deployments and migrations and aids the automation of scripting through all environments.

Overview

For continuous integration (CI), we generally want an approach that moves a single package for a SharePoint solution through all environments. Secondarily, we want it to be scripted and preferably automated. To put the foundations in place we have had to devise a clear strategy. Microsoft does not provide all the tools and techniques for SharePoint continuous integration. In fact, in most cases new features such as the customisation of a site page or the creation of a workflow must be performed manually in each environment.

Shortcomings of SharePoint deployments: no real versioning across environments

SharePoint has a well established set of techniques for deployment. Most easily, there are the through-the-GUI approaches that come with Visual Studio or can be added into Visual Studio. There are other tools that aid deployments through a GUI in a wizard style (CKS: Development Tools Edition on Codeplex). Alternatively, you can deploy solutions using stsadm and PowerShell cmdlets. Finally, you can create your own custom deployment steps (IDeploymentStep) and deployment configurations (ISharePointProjectExtension) by implementing the core SharePoint library interfaces and adding these to your solutions and the deployment build lifecycle. This is all well explained in SharePoint 2010 Development with Visual Studio by Addison Wesley.

All of this allows for custom deployment and retraction of a SharePoint solution, with a bias toward GUI-based deployment and certainly manual configuration of sites. Manual configuration is particularly problematic in an enterprise environment that wishes to use continuous integration.

There is a missing piece of the puzzle that could aid automation in moving solutions through environments: the idea of migrations. Migrations are simply a convenient way for you to alter the SharePoint site in a structured and organised manner. While you could activate features through the GUI by telling operations or administrators, you would then be responsible for writing documentation, and you’d also need to keep track of which changes need to be run against each environment the next time you deploy. Instead, migrations work against the SharePoint API and are versioned. Each SharePoint instance has a record of which migrations have been run, and only new migrations are applied to the site.
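
To illustrate the idea only (this is not migratordotnet’s API), a versioned migration boils down to something like the following PowerShell sketch, with the last applied version recorded in a property bag; the site URL and feature names are hypothetical.

# Conceptual sketch only (not migratordotnet): versioned changes applied in
# order, with the last applied version recorded in the root web property bag.
# The site URL and feature names are hypothetical.
Add-PsSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = (Get-SPSite "http://intranet/sites/myapp").RootWeb
$current = [int]$web.AllProperties["MigrationVersion"]   # 0 if never run

# Each migration is a version plus a change made against the SharePoint API
$migrations = @(
    @{ Version = 1; Up = { Enable-SPFeature -Identity "MyApp_Navigation" -Url $web.Url } },
    @{ Version = 2; Up = { Enable-SPFeature -Identity "MyApp_Search" -Url $web.Url } }
)

foreach ($migration in $migrations | Where-Object { $_.Version -gt $current }) {
    & $migration.Up
    $web.AllProperties["MigrationVersion"] = $migration.Version
    $web.Update()
}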

Current Approach: without migrations and bias toward GUI

Table 1 outlines the current installation approach: scripted deployments via PowerShell. Provisioning occurs as part of SharePoint. However, there is also a lot more configuration required that is currently performed manually through the GUI. This leads to problems in installations across multiple environments. It is also costly in terms of time for training, documentation and problem solving.

Table 1: Previous State for SharePoint installations (Deployment | Provisioning | Configuration | Scripted | GUI | Automated | Manual)

In contrast, Table 2 splits deployment into deployment and migrations and has a bias towards scripting and automation over GUI and manual configuration.


Table 2: New State for SharePoint installations (Deployment | Migrations | Provisioning | Configuration | Scripted | GUI | Automated | Manual)

Approach With Migrations and bias toward scripting

We approach this problem by isolating the installation of SharePoint solutions into provisioning and configuration phases, as per Table 3. Provisioning occurs when a new “feature” is deployed; this occurs in the Event Receiver code callbacks.

Configuration is actually an implicit concept in SharePoint. Activation is post-provisioning and can occur on features once provisioned. Configuration on the other hand is the main activity of site administrators in SharePoint – this is the gluing together of the system. For most SharePoint installations, these two activities are usually performed through the GUI by people. As such, this is not conducive to enterprise systems where new code and configurations need to be moved through many environments.


Table 3: Installation process split between provisioning and configuration (Provisioning | Configuration)

Table 4 outlines the difference between scripted tasks and GUI-based tasks, and that there should be a preference toward scripting. Scripted tasks tend to be written in C# using the SharePoint API (or as cmdlets) to perform tasks. They merely reproduce tasks that would normally be done in the GUI. In most cases, the functionality is the same. However, in a percentage of cases we find that each approach has its own quirks that need to be ironed out. This is very useful for finding issues early on.


Table 4: Tasks should be scripted more than through the GUI (Provisioning | Configuration | Scripted | GUI)

Table 5 outlines the distinction between automated and manual tasks. To aid maintainability through environments, tasks must be scripted for continuous integration automation. However, in practice not all tasks should be exclusively automated: many scripted tasks must also be able to be run within continuous builds and be run manually.


Table 5: Automation over Manual – but manual still required (Provisioning | Configuration | Scripted | GUI | Automated | Manual)

Finally, in Table 6, for all of this work we have split the installation process into two approaches: deployments and migrations. Deployments are scripted in PowerShell (and msbuild) and have the ability to compile, package and deploy the SharePoint solutions. Deployments trigger provisioning in the SharePoint system by design of SharePoint. What is then missing is the ability to (re)configure the system per deployment. Migrations do this job. They apply a set of changes per installation – in fact, they are a little more powerful than this: they apply a set of changes in order per installation and keep these versioned. Moreover, they can work transactionally and retract changes upon errors (and this can be either automated or done manually). It is based on a technique used for database schema changes – in fact, we have extended an open source library to do the job for us.

Deployments also extend the simplistic notion of only needing a wsp for deployment, and include creating zip packages and extracting and deploying solutions via PowerShell and the SharePoint API. The result is not only the standard deployment – the wsp uploaded into the solution store and the binaries deployed – but also, as part of the deployment, the PowerShell scripts may provision solutions that are usually provisioned through the GUI by the administrator. This will often be a prerequisite for a migration.

Migrations are versioned actions that run after the deployment. Migrations may activate, install/upgrade or configure a feature (or act upon the 14 hive). Many of these tasks have previously been performed through the GUI.

Table 6: Installation is now completed via deployment and migrations (Deployment | Migrations | Provisioning | Configuration | Scripted | GUI | Automated | Manual)

Changes in the code base

Having this strategy means some changes in the code base. Here are some of the things we see:

Migration Classes: We now have the configuration code implemented as part of the code base – this includes the ability to make configurations roll forward or backwards

Automation: Configuration can now be performed by build agents

Migration Framework: We have used migratordotnet to perform the migration but did have to extend the library to cater for SharePoint

PowerShell scripts: migrations can be run either automatically as part of installations or run independently by operations – we have also used psake as a build DSL around PowerShell

So far, we haven’t included any code samples. These are to come.

Categories: Uncategorized