
SharePoint TDD Review

November 26th, 2010

I’ve been looking around at what others have written about testing SharePoint, and before I do a post on how I have gone about it, here’s a quick review. It seems there are the usual positions:

  • don’t do it because it doesn’t add value and is difficult given the state of the product, community, developers and product owners
  • don’t do it because it unnecessarily expands the code base
  • do it using mocking frameworks (TypeMock or Pex/Moles)
  • don’t mock because there are too many dependencies; you really need to test in the environment, and unit testing in this way is misleading
  • do it but don’t mock out what are actually integration tests – so appropriately layer your code and tests

So where I got to is that:

  • we need to write SharePoint using layering (SOLID)
  • we need to put an abstraction in front of SharePoint in our code (interfaces or late-bound lambdas – or both)
  • we need to write SharePoint API code in ways that are still accessible to developers who are at home in the SharePoint world – so the code that uses SharePoint is kept together (in practice, this would look like driving the API calls out of WebParts and Event Receivers and into their own classes – think ports and adapters technique)
  • we need to layer our tests also (unit and integration – see my posts on test automation pyramid)
  • use mocks in unit tests – so no dependencies (which requires abstractions)
  • don’t mock (SharePoint) in integration tests (steel threads/walking skeletons will require this, but that is about getting the code that isolates SharePoint to work)
  • you will be required to use DI (but that doesn’t mean IoC) – this will be easy if you layer your code and follow the rule of handing dependencies in on constructors
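To make the rules above concrete, here is a minimal sketch of what that layering might look like. All the names (IProductRepository, SpProductRepository, ProductService) are hypothetical illustrations of mine, and the SharePoint calls assume the 2010 server object model:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.SharePoint; // SharePoint 2010 server object model

// Hypothetical names for illustration only - a sketch of the layering,
// not code from a real project.
public interface IProductRepository
{
    IEnumerable<string> GetProductTitles(string listName);
}

// The adapter: the only class that touches the SharePoint API, driven out
// of WebParts and Event Receivers per the ports-and-adapters idea.
public class SpProductRepository : IProductRepository
{
    private readonly string _siteUrl;

    public SpProductRepository(string siteUrl)
    {
        _siteUrl = siteUrl;
    }

    public IEnumerable<string> GetProductTitles(string listName)
    {
        using (var site = new SPSite(_siteUrl))
        using (var web = site.OpenWeb())
        {
            foreach (SPListItem item in web.Lists[listName].Items)
                yield return item.Title;
        }
    }
}

// The business logic: depends only on the abstraction, with the dependency
// handed in on the constructor - so a unit test can pass in a fake.
public class ProductService
{
    private readonly IProductRepository _repository;

    public ProductService(IProductRepository repository)
    {
        _repository = repository;
    }

    public int CountProducts(string listName)
    {
        return _repository.GetProductTitles(listName).Count();
    }
}
```

A unit test would construct ProductService with a hand-rolled fake IProductRepository (no SharePoint dependency at all); an integration test would exercise SpProductRepository against a real site.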

So writing good (layered and testable) code is the core issue. Or in the words of Freeman and Price, let’s write maintainable code over code that is easy to write. SharePoint examples are all about the marketing machine showing just how easy it is to use. That, though, isn’t maintainable code from the TDD perspective.

Here’s my most interesting, immediate experience in unit testing SharePoint – we couldn’t use MSTest. I kid you not. It wouldn’t work on 64-bit machines. We found others had had the problem. Go figure.

General case against:

It’s a basic argument that I have a lot of sympathy for (except when I have to be the developer who puts his name against the system and has to do support, because this position gives me no protection). This comment shows a really good understanding of the complexities.

So, now let’s take a typical SharePoint project. Of course there is a range and gamut of SharePoint projects, but let’s pick the average summation of them all. A usual SharePoint project involves many things:

  • Customer communication and prototyping – usually much much more than a typical .NET custom dev project.
  • Plenty of attention to the IT Pro side, where you decide logical and physical layouts of your servers, networks, and your site collections. Custom dev .NET projects are usually simpler to setup and plan.
  • A significant effort in branding – usually greater than your typical .NET project, simply because the master pages and CSS and JS of SharePoint are much more complex.
  • Focus on scalability, when balancing between requirements, and best practices, and experience!
  • Writing some code
  • Establishing roles within your team, which differs given the nature of a SharePoint project. Usually this is more involved in an SP project than a custom .NET dev project, simply because there is a higher overlap between ITPro and Dev in the case of SP.
  • Training required – your business users will need training on the project – again, usually more significant than a simple .NET custom dev project.
  • Architecture, and functionality bargaining per RUP for COTS – where there are many ways of achieving a goal, and the way picked is a delicate balance between budget, demands, and technology.

Thus in a typical SharePoint project, the portion where TDD is actually applicable is very small – the writing-code part. In most, not all, SharePoint projects, we write code as small bandaids across the system, to cover that last mile – we don’t write code to build the entire platform; in fact the emphasis is to write as little code as possible, while meeting the requirements. So already, the applicability of TDD as a total percentage of the project is much smaller.

Now, let’s look at the code we write for SharePoint. These small bandaids, which can be independent of each other, are comprised of some C#/VB.NET code, but a large portion of the code is XML files. This large portion of XML files, especially the most complex parts, defines the UI – something TDD is not good at testing anyway. Yes, I know attempts have been made, but in most scenarios the answer is usually no. So, TDD + SharePoint 2007? Well – screw it! Not worth it IMO.

This is supported in Designing Solutions for Microsoft® SharePoint® 2010 (p. 229):

SharePoint developers have traditionally been slow to adopt modern approaches to software testing. There are various reasons for this. First, solution development for Microsoft® SharePoint® is often a specialized, fairly solitary process that isn’t automated or streamlined to the same degree as mainstream software development. Second, the ASP.NET programming model that underpins SharePoint development does not naturally lend itself to automated testing. Finally, most of the classes and methods in the SharePoint API do not implement interfaces or override virtual methods, making them difficult to “stub out” using conventional mocking techniques. However, automated, robust testing is an increasingly essential part of enterprise-scale SharePoint development.


From Comment 5 at 2/18/2009 6:56 PM

When I’m doing SharePoint development the last thing I think of (or one of the last) is the SharePoint infrastructure. For me, there are two key things that I always do. First, I build an abstraction layer over SP. I treat SP as a service and talk to it on my terms, not theirs. Second, I have a second set of integration tests that really are not that important to the main thrust of the TDD work I do, and that’s for testing the SP interactions. There are two levels of testing here: a) does my code retrieve what I want from SP and give me the results I want, and b) does SP do what I think it should. It’s like doing a write-read confirmation of a SQL call vs. making sure SQL is set up correctly and the tables/fields are there.

From Peter at 2/18/2009 9:12 PM

One of the things Bob Martin mentioned was that you can’t test all your code. You have a core set of logic that you test heavily, hopefully getting 100% coverage of this CORE logic, and a volatile set of code, which you shove away in an untested cubbyhole. In my mind, our small SharePoint projects are almost all volatile code, with very little “core logic.”

In Designing Solutions for Microsoft SharePoint 2010 they recommend:

  • Testability. Can you test your classes in isolation? If your code is tightly coupled to a user interface, or relies on specific types, this can be challenging.
  • Flexibility. Can you update or replace dependencies without editing and recompiling your code?
  • Configuration. How do you manage configuration settings for your solution? Will your approach scale out to an enterprise-scale deployment environment?
  • Logging and exception handling. How do you log exceptions and trace information in the enterprise environment? Is your approach consistent with that of other developers on the team? Are you providing system administrators with reliable information that they can use to diagnose problems effectively?
  • Maintainability. How easy is it to maintain your code in a code base that is constantly evolving? Do you have to rewrite your code if a dependent class is updated or replaced?

Different types of tests

Probably the best SharePoint-based explanation is in Designing Solutions for Microsoft SharePoint 2010. Here I will reproduce in full the explanations of unit and integration testing, then the final part, which backs up my experience that you can’t do integration testing using MSTest (although this is apparently fixed in Visual Studio 2010 SP1 Beta):

Unit Testing
Unit tests are automated procedures that verify whether an isolated piece of code behaves as expected in response to a specific input. Unit tests are usually created by developers and are typically written against public methods and interfaces. Each unit test should focus on testing a single aspect of the code under test; therefore, it should generally not contain any branching logic. In test-driven development scenarios, you create unit tests before you code a particular method. You can run the unit tests repeatedly as you add code to the method. Your task is complete when your code passes all of its unit tests.
A unit test isolates the code under test from all external dependencies, such as external APIs, systems, and services. There are various patterns and tools you can use to ensure that your classes and methods can be isolated in this way, and they are discussed later in this section.
Unit tests should verify that the code under test responds as expected to both normal and exceptional conditions. Unit tests can also provide a way to test responses to error conditions that are hard to generate on demand in real systems, such as hardware failures and out-of-memory exceptions. Because unit tests are isolated from external dependencies, they run very quickly; it is typical for a large suite consisting of hundreds of unit tests to run in a matter of seconds. The speed of execution is critical when you are using an iterative approach to development, because the developer should run the test suite on a regular basis during the development process.
Unit tests make it easier to exercise all code paths in branching logic. They do this by simulating conditions that are difficult to produce on real systems in order to drive all paths through the code. This leads to fewer production bugs, which are often costly to the business in terms of the resulting downtime, instability, and the effort required to create, test, and apply production patches.

Integration Testing
While unit tests verify the functionality of a piece of code in isolation, integration tests verify the functionality of a piece of code against a target system or platform. Just like unit tests, integration tests are automated procedures that run within a testing framework. Although comprehensive unit testing verifies that your code behaves as expected in isolation, you still need to ensure that your code behaves as expected in its target environment, and that the external systems on which your code depends behave as anticipated. That is the role of integration testing.
Unlike a unit test, an integration test executes all code in the call path for each method under test – regardless of whether that code is within the class you are testing or is part of an external API. Because of this, it typically takes longer to set up the test conditions for an integration test. For example, you may need to create users and groups or add lists and list items. Integration tests also take considerably longer to run. However, unlike unit tests, integration tests do not rely on assumptions about the behavior of external systems and services. As a result, integration tests may detect bugs that are missed by unit tests.
Developers often use integration tests to verify that external dependencies, such as Web services, behave as expected, or to test code with a heavy reliance on external dependencies that cannot be factored out. Testers often also develop and use integration tests for more diverse scenarios, such as security testing and stress testing.
In many cases, organizations do not distinguish between integration and unit testing, because both types of tests are typically driven by unit testing frameworks such as nUnit, xUnit, and Microsoft Visual Studio Unit Test. Organizations that employ agile development practices do, however, make this distinction, since the two types of tests have different purposes within the agile process.

Note: In the Visual Studio 2010 release, there is a limitation that prevents you from running integration tests against SharePoint assemblies using Visual Studio Unit Test. Unit tests created for Visual Studio Unit Test must be developed using the Microsoft .NET Framework 4.0 in Visual Studio 2010, whereas SharePoint 2010 assemblies are based on .NET Framework 3.5. In many cases, this is not an issue – .NET Framework 4.0 assemblies are generally compatible with .NET Framework 3.5 assemblies, so you can run a .NET Framework 4.0 test against a .NET Framework 3.5 assembly. However, the way in which SharePoint loads the .NET common language runtime (CLR) prevents the runtime from properly loading and running the tests within Visual Studio Unit Test.
This limitation prevents you from running integration tests with SharePoint within Visual Studio Unit Test. Integration tests execute real SharePoint API logic instead of substituting the logic with a test implementation. Two isolation tools discussed in the following sections, TypeMock and Moles, will continue to work because they intercept calls to the SharePoint API before the actual SharePoint logic is invoked. You can execute integration tests using a third-party framework such as xUnit or nUnit. Coded user interface (UI) tests against SharePoint applications will run without any issues from within Visual Studio 2010.
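As an illustration of that last point, an integration test run under nUnit (rather than Visual Studio Unit Test) might look like the following sketch. The site URL and list name here are assumptions of mine for illustration; the test talks to a real SharePoint farm, so it is slow and needs environment setup:

```csharp
using Microsoft.SharePoint; // SharePoint 2010 server object model
using NUnit.Framework;

[TestFixture]
public class ProductListIntegrationTests
{
    // Assumed test environment - these values are illustrative only.
    private const string SiteUrl = "http://localhost";
    private const string ListName = "Products";

    [Test]
    public void Can_read_items_from_the_real_list()
    {
        // No mocking here: this exercises the real SharePoint API, so it
        // verifies both our code's assumptions and SharePoint's behaviour.
        using (var site = new SPSite(SiteUrl))
        using (var web = site.OpenWeb())
        {
            SPList list = web.Lists[ListName];
            Assert.That(list.ItemCount, Is.GreaterThanOrEqualTo(0));
        }
    }
}
```

Because it runs under nUnit, this sidesteps the CLR-loading limitation described in the note above.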

Against Mocking:

From Eric Shupps at 2/18/2009 7:30 PM

While I agree that mocked objects do have value, I believe that the overall value is much less in an environment like SharePoint. You absolutely cannot duplicate the functionality of the core system, no matter how hard you try – reflection only takes you so far. Most of what Isolator does has no bearing on what actually happens in a live system. Fake objects are fine in tightly controlled environments but they just don’t do the trick in SharePoint. I do, however, encourage you to keep innovating and producing better and better tools for the development community.

For Mocking via IOC:

From Paul at 2/19/2009 12:11 AM

Let me give a real code example. Take this:

private Product[] GetProductsOnSpecialFromList(string url, string listName)
{
       var results = new List<Product>();
       var webProvider = IOC.Resolve<IWebSiteProvider>();
       var list = webProvider.FindWeb(url).FindList(listName);
       foreach (var item in list)
       {
              // Add an item to the products list
              // Only include products in the InStock state
              // Only include products which are on special
              // More and more business rules
       }
       return results.ToArray();
}

What do we have here? Business logic. That’s the kind of stuff I want to do TDD against. But it talks to SharePoint? Well, it talks to an interface. Behind the interface could be SharePoint, or it could be a mock object. I don’t care. It’s called “loose coupling”. Can I test it? Absolutely. And it’s dead easy. I can test all kinds of permutations on my business logic. In fact, I can even tell what happens when a URL is invalid or a site is offline – my mock IWebProvider would throw an exception, and I’d test that my business logic responded appropriately.

Now, eventually, behind that interface, you might want to test SharePoint. For that, you could make a decision – do I write an integration test, or do I not care? Either way, that code should only account for 5% of your code base. You can do TDD for the other 95%. In short, I suspect you find testing hard because you write tightly-coupled, hard-to-maintain code that mixes business logic/behavior with SharePoint crap. If you wrote loosely coupled code, you’d find testing easy.
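A unit test against Paul’s example could then use a hand-rolled fake behind the interface, with no mocking framework at all. Everything below is illustrative: IWebSite, ProductCatalogue (assuming the method is exposed on such a class) and the IOC.Register call are my assumptions about how such a container might be wired, not Paul’s actual code:

```csharp
using System;
using NUnit.Framework;

// Illustrative fake: simulates the offline-site condition that is hard
// to produce on a real farm.
public class OfflineWebSiteProvider : IWebSiteProvider
{
    public IWebSite FindWeb(string url)
    {
        throw new InvalidOperationException("Site is offline");
    }
}

[TestFixture]
public class ProductBusinessLogicTests
{
    [Test]
    public void Offline_site_surfaces_as_an_exception_to_the_caller()
    {
        // Hypothetical registration call - the exact API depends on the
        // container in use.
        IOC.Register<IWebSiteProvider>(new OfflineWebSiteProvider());

        var catalogue = new ProductCatalogue();
        Assert.Throws<InvalidOperationException>(
            () => catalogue.GetProductsOnSpecialFromList("http://anysite", "Products"));
    }
}
```

The business rules in the foreach loop could be exercised the same way, by a fake provider that returns canned list items.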




General discussion on TDD around SharePoint

Sample Code:
