Wednesday, December 29, 2010

TF266044: One or more machines are not ready to run workflows.

When configuring a new lab environment for a client, I encountered the following error:

Environment message: Type=Error; Message=TF266044: One or more machines are not ready to run workflows. For more information, see the individual machine errors.;

Machine messages:

Machine name: XXXXXX

Machine message: Type=Error; Message=Error occurred while configuring TFSBuildServiceHost with Lab configuration. ExceptionType:Microsoft.TeamFoundation.Build.Client.BuildServiceHostAlreadyExistsException. ExceptionMessage: A build service host already exists for computer Specify a different computer name and try again.

I found the solution in the following blog post. Although our TFS environment wasn’t upgraded from Beta 2, the fix still works if you encounter this error message.

Tuesday, December 28, 2010

Export mapping files generated by Fluent NHibernate

If you are using NHibernate, you have probably also heard about Fluent NHibernate, which allows you to create your object-relational mapping from a mix of conventions and code instead of error-prone XML files. Although you’re using a completely different API, under the hood the same old XML files still exist. Sometimes it is handy to view the NHibernate mapping files that Fluent NHibernate generates. It turns out this is quite easy: just add a call to the ExportTo() method and the mappings will be written to the location you specify.

    private static ISessionFactory CreateSessionFactory()
    {
        string outputDir = Path.Combine(Environment.CurrentDirectory, "Mappings");

        return Fluently.Configure()
            .Database(SQLiteConfiguration.Standard.InMemory().ShowSql())
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<SessionFactory>()
                            .ExportTo(outputDir))
            .BuildSessionFactory();
    }

Remark: Make sure the directory you export to already exists; ExportTo() does not create it for you.
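Since ExportTo() does not create that folder for you, you can guard against it in code. The helper below is my own sketch, not part of Fluent NHibernate; Directory.CreateDirectory does nothing if the folder already exists, so it is safe to call unconditionally:

```csharp
using System;
using System.IO;

class MappingExportSetup
{
    // Returns the mappings output directory, creating it when missing.
    public static string EnsureMappingDirectory()
    {
        string outputDir = Path.Combine(Environment.CurrentDirectory, "Mappings");

        // CreateDirectory is idempotent: it is a no-op if the folder already exists.
        Directory.CreateDirectory(outputDir);
        return outputDir;
    }
}
```

Call EnsureMappingDirectory() once before building the session factory and pass the returned path to ExportTo().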

Monday, December 27, 2010

Distributing check-in policies across your team

One of the nicest features of Team Foundation Server is check-in policies. They allow you to validate specific rules before each check-in, and what makes this even nicer is that you can easily write your own. There is one catch, however: the check-in policy (read: DLL) needs to be installed on each developer machine. This matters because policies a developer has not installed are simply not run during a check-in.

Before you start inventing complex deployment strategies to get the latest version of these policies onto the developer machines, note that a feature for exactly this is already included in the TFS Power Tools.

After installing the Power Tools, a new node called Team Members is added to Team Explorer. Right-click the Team Members node and choose Personal Settings.

From there you can see all the options for the collaboration features, including one interesting option called Install downloaded custom components.

This extension checks source control for a path called $projectname/TeamProjectConfig/CheckinPolicies, and inside that folder looks for 1.0, 2.0, or 3.0 subfolders for VS2005, VS2008, and VS2010 add-ins respectively. Every DLL found in those folders is automatically downloaded to the right location and made available on the client computers.

Remark: You can also include DLLs containing custom controls for work item editing; they should be placed in a folder called $projectname/TeamProjectConfig/CustomControls.

Sunday, December 26, 2010

Impress your colleagues with your knowledge about… PDB files

Most developers know that PDB files help in some way with debugging, but that's about it. They are a dark art, completely understood only by a few evil magicians. Let me help you understand what PDB files are and how they can make your debugging experience a lot easier. First read the following 3 important rules and never forget them!

Rule 1 – PDB files are as important as source code

First and foremost, PDB files are as important as source code! Debugging on a production server without the matching PDB files for the deployed build can cost you tons of money. Without the matching PDB files, you just made your debugging challenge nearly impossible.

Rule 2 – As a development shop, I should have a Symbol Server

At a minimum, every development shop must set up a Symbol Server. A Symbol Server stores the PDBs and binaries for all your public builds. That way, no matter which build someone reports a crash or problem against, you have the exact matching PDB file for that public build available to the debugger. Both Visual Studio and WinDBG know how to access Symbol Servers, and if the binary is from a public build, the debugger will fetch the matching PDB file automatically.
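Pointing the debuggers at a Symbol Server is a matter of setting the symbol path. A typical _NT_SYMBOL_PATH value caches symbols locally and falls back to Microsoft's public symbol server; the cache folder and internal share names below are examples, substitute your own:

```
set _NT_SYMBOL_PATH=SRV*C:\Symbols*\\your-symbol-server\symbols*http://msdl.microsoft.com/download/symbols
```

Visual Studio reads this variable, and you can configure the same locations interactively under Tools > Options > Debugging > Symbols.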

Rule 3 – A Source Server is a Symbol Server's best friend

A Symbol Server is not that useful without one extra step: running the Source Server tools across your public PDB files, which is called source indexing. The indexing embeds the version control commands to pull the exact source files used in that particular public build. Thus, when you are debugging that public build, you never have to worry about finding the right source files. If you are using TFS 2010, the build server includes Source Indexing and Symbol Server copying as build tasks out of the box.

Now that you know these 3 rules, let's have a look at the PDB file itself. A .NET PDB contains only two pieces of information: the source file names with their line numbers, and the local variable names. All other information is already in the .NET metadata, so there is no need to duplicate it in the PDB file.

When you load a module into the process address space, the debugger uses two pieces of information to find the matching PDB file. The first is obviously the name of the file: if you load ABC.DLL, the debugger looks for ABC.PDB. The extremely important part is how the debugger knows it has the exact matching PDB file for this binary. That's done through a GUID embedded in both the PDB file and the binary. If the GUIDs do not match, no source-level debugging is possible.

Knowing how the debugger determines the matching PDB file, the last question is where the debugger looks for PDB files. You can watch this load order yourself in the Visual Studio Modules window (the Symbol File column) while debugging. The first place searched is the directory the binary was loaded from. If the PDB file is not there, the debugger next tries the hard-coded build directory embedded in the Debug Directories of the PE file. If the PDB file is in neither location and a Symbol Server is set up on the machine, the debugger looks in the Symbol Server cache directory. Finally, if it still hasn't found the PDB file, it asks the Symbol Server itself.
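If you want to see which PDB a given binary will ask for, you can dump its Debug Directories from a Visual Studio command prompt; the output includes the embedded GUID and the hard-coded PDB path mentioned above (the file name here is just an example):

```
dumpbin /headers MyApp.dll
```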

I hope this information helped you understand PDB files and hopefully you start understanding and using their full potential.

Friday, December 24, 2010

WCF vNext: Linq to WCF

One of the great new features coming to WCF vNext is the ability to expose a service through an IQueryable interface. This introduces the rich query model of OData to your own WCF services. How does this work?

Making the service queryable

On the server side, your service operation should return an IQueryable<T>. Annotate the operation with the new [QueryComposition] attribute. Once you do that, your service becomes queryable using the OData uri format.

    [WebGet(UriTemplate = "")]
    [QueryComposition]
    public IQueryable<Customer> Get()
    {
        return customers.AsQueryable();
    }

The Get method above returns an IQueryable of customers. With query composition enabled, the host will now accept requests like “http://localhost/customers?$filter=Country%20eq%20'Belgium'”, which says “find me all the customers from Belgium”.
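Assuming the preview bits follow the standard OData URI conventions, the same service also understands sorting and paging operators, for example:

```
http://localhost/customers?$orderby=CustomerName
http://localhost/customers?$top=10&$skip=20
```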

Querying the service, LINQ to WCF

On the client side, Microsoft added a CreateQuery<T> extension method which you can use with the new HttpClient to create a WebQuery<T>. Once you have that query, you can apply a Where or an OrderBy. When you start to iterate over the result, the client automatically issues a GET request to the server, using the correct URI based on the filter. The results come back properly ordered and filtered according to your query.

Below is a snippet that shows querying our previously created Customer resource:

    public IEnumerable<Customer> GetBelgianCustomers()
    {
        var address = "http://localhost/customers";
        var client = new HttpClient(address);
        var customers = client.CreateQuery<Customer>();

        return customers
            .Where(c => c.Country == "Belgium")
            .OrderBy(c => c.CustomerName);
    }

Thursday, December 23, 2010

Boost the performance of your ASP.NET and WCF applications

By default, both IIS and WCF are somewhat restrictive in their settings. Executing a large number of concurrent calls will not have much effect, because by default only 2 concurrent connections per host are allowed.

However, there are some simple configuration changes you can make in machine.config and IIS to give your web applications a significant performance boost. These are simple, harmless changes, but they make a lot of difference in terms of scalability.
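One example of such a change: the 2-connection default comes from the connectionManagement section, which you can raise in machine.config (or an application's web.config). The value below is only illustrative; tune it for your own workload:

```xml
<configuration>
  <system.net>
    <connectionManagement>
      <!-- Raise the number of concurrent outgoing connections per host (default is 2). -->
      <add address="*" maxconnection="24" />
    </connectionManagement>
  </system.net>
</configuration>
```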

To learn how to do this, read this article “Quick ways to boost performance and scalability of ASP.NET, WCF and Desktop Clients” written by Omar Al Zabir.

Wednesday, December 22, 2010

ThreadStatic and ThreadLocal<T>

For a long time I used the ThreadStatic attribute to make the value of a static or instance field local to a thread (i.e. each thread holds an independent copy of the field). Although this did the trick for a long time, the ThreadStatic attribute has some disadvantages:

  • the ThreadStatic attribute doesn’t work with instance fields; it compiles and runs but does nothing
  • fields always start with the default value

With the release of .NET 4, Microsoft introduced a new class specifically for thread-local storage of data – the ThreadLocal<T> class:

    ThreadLocal<int> _localField = new ThreadLocal<int>(() => 1);

So why should you choose the ThreadLocal<T> class?

  • Thanks to the use of a factory function, the values are lazily evaluated; the factory function only executes on the first call from each thread
  • you have more control over the initialization of the field and can initialize it with a non-default value
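A minimal sketch of both points (the class and method names are mine, for illustration): each thread gets its own copy, lazily initialized to 1 by the factory on that thread's first access.

```csharp
using System;
using System.Threading;

class ThreadLocalDemo
{
    // The factory runs once per thread, on that thread's first access to Value.
    private static readonly ThreadLocal<int> _counter = new ThreadLocal<int>(() => 1);

    // Mutates the value on a worker thread, then returns (workerValue, mainValue).
    public static Tuple<int, int> Run()
    {
        int workerValue = 0;
        Thread t = new Thread(() =>
        {
            _counter.Value += 10;      // only the worker thread sees 11
            workerValue = _counter.Value;
        });
        t.Start();
        t.Join();

        // The main thread still gets its own lazily initialized copy (1).
        return Tuple.Create(workerValue, _counter.Value);
    }
}
```

Run() returns (11, 1): the worker's increment never leaks into the main thread's copy.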

Tuesday, December 21, 2010

How to be a bad programmer?

People learn the most from their mistakes. So when talking about what defines a good programmer versus a bad one, it is sometimes more interesting, and easier, to discuss what makes someone a bad programmer instead of a good one.

Giorgio Sironi wrote a great article about “How to be a worse programmer?”

A must read!


Monday, December 20, 2010

Silverlight 5: Microsoft’s answer to the “Silverlight is dead” discussion

The last few weeks there was a lot of buzz around the future of Silverlight. Although there were some official comments, the rumors kept going. For the remaining skeptics, what better answer could there be than the announcement of Silverlight 5?

At the Silverlight FireStarter event, Microsoft announced the timeline for Silverlight 5 in 2011. Silverlight 5 was the main subject of Scott Guthrie’s keynote, where Microsoft demoed many of the coming new features and capabilities. Silverlight 5 will be in beta in the first half of 2011 and ship early in the second half of 2011.

Some of the impressive improvements (note especially the tooling improvements):

Silverlight 5 Media improvements:

  • Hardware Decode & Presentation of H.264 performance improvements using GPU support
  • Trickplay with fast-forward and rewind support w/normal audio pitch
  • Improved power awareness
  • Remote-control support
  • Digital Rights Management advancements

Application Development improvements:

  • Smoother UI experiences with smoother animation
  • Text improvements
    • Multi-column text & linked container text
    • Text clarity improved
    • OpenType support enhanced
  • Support for Postscript vector printing
  • Added support for double-click and combobox
  • MVVM and Databinding enhancements
  • Networking and WCF enhancements
    • Reduced network latency using a background thread
    • WS-Trust support
  • Performance Improvements
    • XAML parser improvements
    • Support for 64-bit OSes
  • Graphics Improvements
    • GPU API
    • Direct rendering on GPU
    • Hardware acceleration on Internet Explorer 9
  • New class of trusted applications
    • Host HTML content as a browser control
    • Read/Write to users My Documents folder
    • Launch Microsoft Office and other programs
    • Ability to call into application COM components gaining access to system capabilities and devices
    • Full keyboard support in full screen
    • Call unmanaged code with PInvoke
  • Out-of-browser trusted applications enhancements
    • Call unmanaged code with PInvoke
    • Child Windows support
  • Tool improvements
    • Visual Studio profiling support for CPU, memory, thread contention
    • Visual Studio Team Test support

Sunday, December 19, 2010

Hostname can't support more than 1 level subdomain.

Last week I wanted to test some new Windows Azure Servicebus functionality. So I started by creating a simple WCF service to host on the cloud. After configuring my service settings in the web.config, I started the service and was confronted with the following error message:

Hostname can't support more than 1 level subdomain.

It took me some time to figure out the root cause of the problem. I had created a namespace on After checking with Fiddler what was going on I realized that although I was using the appfabriclabs environment, the authentication was still passing on to with the error message above as a consequence.

After creating a service namespace through the application ran successfully.

Saturday, December 18, 2010

Using XML namespaces in WPF

When referencing controls from another assembly in XAML, you probably use the xmlns:myAlias="clr-namespace:MyNamespace;assembly=MyAssembly" syntax.

Last week I discovered that you also have another option: you can use a URI instead of a namespace reference, thanks to the XmlnsDefinition attribute. (Read more about this attribute on MSDN.) It allows you to map a XAML namespace to one or more assembly namespaces.

So how do you use it?

  1. Open the AssemblyInfo.cs file under the Properties folder of your project.
  2. Add the following line for each namespace in your assembly you want to map:
    [assembly: AssemblyTitle("WPF Namespace Sample")]
    [assembly: AssemblyDescription("")]
    [assembly: AssemblyConfiguration("")]
    [assembly: AssemblyCompany("")]
    [assembly: AssemblyProduct("WPF Namespace Sample")]
    [assembly: AssemblyCopyright("Copyright © 2010")]
    [assembly: AssemblyTrademark("")]
    [assembly: AssemblyCulture("")]
    [assembly: XmlnsDefinition("http://myApp/schemas/2010/xaml", "WPF.Samples.Controls")]
    [assembly: XmlnsDefinition("http://myApp/schemas/2010/xaml", "WPF.Samples.Commands")]

You can then use this reference inside another project:

    <UserControl
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:wpfSample="http://myApp/schemas/2010/xaml">

Friday, December 17, 2010

Build times for TFS Team Build

After I created an application to monitor builds across multiple team projects, the customer came back with a second request: because they had the feeling that some builds took a long time to complete, they asked us to update the application to include build timings.

A colleague took my application and extended it with some extra code. The full code is below:

    using System;
    using System.Diagnostics;
    using System.Linq;
    using Microsoft.TeamFoundation.Build.Client;
    using Microsoft.TeamFoundation.Client;

    class Program
    {
        static void Main(string[] args)
        {
            // The url to the tfs server
            Uri tfsUri = new Uri("<TFS URL>");
            TfsTeamProjectCollection tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(tfsUri);
            IBuildServer bs = tfs.GetService<IBuildServer>();
            WriteQueuedBuilds(bs);
            Console.ReadLine();
        }

        private static void WriteQueuedBuilds(IBuildServer bs)
        {
            IQueuedBuildSpec qbSpec = bs.CreateBuildQueueSpec("*", "*");
            qbSpec.CompletedWindow = TimeSpan.FromDays(25);
            IQueuedBuildQueryResult qbResults = bs.QueryQueuedBuilds(qbSpec);

            Console.WriteLine("Queued Builds");
            Debug.WriteLine("Queued Builds");

            foreach (IQueuedBuild qb in qbResults.QueuedBuilds.OrderByDescending(a => a.QueueTime))
            {
                string status = qb.Status.ToString();
                string def;
                if (qb.BuildDefinition != null)
                    def = qb.TeamProject + @"\" + qb.BuildDefinition.Name;
                else if (qb.Build != null)
                    def = qb.TeamProject + @"\" + qb.Build.BuildDefinition.Name;
                else
                    def = qb.TeamProject + @"\<unknown>";

                string pri = qb.Priority.ToString();
                string datequeued = qb.QueueTime.ToString();
                string requestedBy = qb.RequestedBy;
                string buildDetails = string.Empty;
                string finishTime = string.Empty;
                string starttime = string.Empty;
                if (qb.Build != null)
                {
                    if (qb.Build.BuildFinished)
                    {
                        buildDetails = "finished " + qb.Build.FinishTime.Subtract(qb.QueueTime).TotalMinutes + " minutes after queue";
                        finishTime = qb.Build.FinishTime.ToString();
                    }
                    starttime = qb.Build.StartTime.ToString();
                }
                string controller = qb.BuildController.Name;

                if (qb.RequestedBy != qb.RequestedFor)
                {
                    requestedBy = qb.RequestedBy + " (for " + qb.RequestedFor + ")";
                }

                Console.WriteLine("{0} {1} {2} {3} {4} {5} {6} {7} {8}", controller, status, def, pri, datequeued, requestedBy, starttime, finishTime, buildDetails);
                Debug.WriteLine("{0}\t{1}\t{2}\t{3}\t{4}\t{5}\t{6}\t{7}\t{8}", controller, status, def, pri, datequeued, requestedBy, starttime, finishTime, buildDetails);
            }
        }
    }

Thursday, December 16, 2010

Windows Azure Platform 30 Day Pass

Microsoft launched a new offer to get started with Windows Azure: the Windows Azure platform 30 day pass. The great news is that no credit card is required. The only thing you need to do is click on the following link and use the promo code to get going:

The Windows Azure platform 30 day pass includes the following resources:

Windows Azure

  • 4 small compute instances
  • 3GB of storage
  • 250,000 storage transactions

SQL Azure

  • Two 1GB Web Edition databases


AppFabric

  • 100,000 Access Control transactions
  • 2 Service Bus connections

Data Transfers (per region)

  • 3 GB in
  • 3 GB out

Have a cloudy day!

Wednesday, December 15, 2010

Visual Studio 2010 SP1 Beta released

Last week Microsoft released a beta version of Visual Studio 2010 Service Pack 1. Although a lot of blog posts mentioned the fact that the beta was released, I didn’t find much information about the exact content and features that will be included in Service Pack 1.

So here is an aggregation of some blog posts which together give you a full overview of all the goodness that’s coming:

Tuesday, December 14, 2010

Great Windows Phone 7 productivity story

As a .NET developer by day, choosing a Windows Phone seems more obvious than choosing an iPhone or Android phone. However, if you don’t use Microsoft products every day, the choice is a lot harder.

So if you are a developer planning to create applications on one of the mobile platforms, definitely read this great story comparing development productivity between iOS, Android, WP7 and the mobile web:

Definitely a great proof of the power of the Windows Phone 7 development experience!

Monday, December 13, 2010

New Visual Studio 2010 / TFS 2010 VPCs available

Most of the time when I have to do a demo about Team Foundation Server, I use our own TFS (test) environment. However, some clients want to see some specific functionality or have no Internet access available for me. In that case I fall back to the standard TFS 2010 demo VPCs that Microsoft provides.

Last week Microsoft released a new version of these VPCs. This new version contains the latest feature packs, power tools, and Windows updates. This refreshed VM will stop working on June 1, 2011.

What’s new in this version?

  • Visual Studio 2010 Feature Pack 2
  • Team Foundation Server 2010 Power Tools (September 2010 Release)
  • Visual Studio 2010 Productivity Power Tools
  • Test Scribe for Microsoft Test Manager
  • Visual Studio Scrum 1.0 Process Template
  • All Windows Updates through December 8, 2010
  • Lab Management GDR (KB983578)
  • Visual Studio 2010 Feature Pack 2 pre-requisite hotfix (KB2403277)
  • Microsoft Test Manager hotfix (KB2387011)
  • Minor fit-and-finish fixes based on customer feedback

Please note that this VM does not include Visual Studio Lab Management 2010 capabilities. The Lab Management team has released a separate VHD which has this capability.

Download links:

Microsoft® Visual Studio® 2010 and Team Foundation Server® 2010 RTM virtual machine for Windows Virtual PC

Microsoft® Visual Studio® 2010 and Team Foundation Server® 2010 RTM virtual machine for Windows Server 2008 Hyper-V

Microsoft® Visual Studio® 2010 and Team Foundation Server® 2010 RTM virtual machine for Microsoft® Virtual PC 2007 SP1

Sunday, December 12, 2010

Prevent hanging build from blocking your TFS build server

Last week I had a question from a customer who complained that all builds on their build server were blocked because of one build that sometimes fails. The reason why this build failed is something for another blog post; in this post I’ll focus on how to prevent a hanging build from blocking your build server.


If you open up your build definition, go to the Process tab and expand the Advanced node, you’ll find the Agent Settings node. If you expand it further, you see that you can specify the following parameters:

Maximum Execution Time

Type a time span value in hh:mm:ss format. For example, the build will fail with a time-out error if you specify a value of 04:30:15 and the build agent has not completed its work after 4 hours, 30 minutes, and 15 seconds. Specify a value of 00:00:00 if you want to give the build agent unlimited time to process the build. (By default this value is 00:00:00 and this is the reason why a build can keep blocking your build agent).

Maximum Wait Time

Type a time span value in hh:mm:ss format. For example, the build will fail with a time-out error if you specify a value of 01:30:45 and the build has not been assigned to a build agent after 1 hour, 30 minutes, and 45 seconds. Specify a value of 00:00:00 if you want to give the build controller unlimited time to find a build agent to process this build definition. (By default this value is 00:00:00, which means a build can wait in the queue indefinitely.)

These 2 settings together bound the total amount of time one specific build may take.

Saturday, December 11, 2010

Saving NHibernate objects with assigned id’s

One of the great features of NHibernate is that it manages persistence for us. You just attach an object to the session and NHibernate figures out whether the object was added or changed. But how does NHibernate know the difference between a new and an existing object?

By default it uses the value assigned to the unsaved-value attribute in the id mapping. If the id of our object equals the unsaved-value, NHibernate treats the object as new and issues an INSERT statement. If the id differs from the unsaved-value, NHibernate generates an UPDATE statement instead.


    <hibernate-mapping default-cascade="none" xmlns="urn:nhibernate-mapping-2.2">
      <class name="Test.Data.Domain.Category, Test.Data" table="Categories" lazy="true">
        <id name="CategoryID" type="System.Int32" column="CategoryID" unsaved-value="0">
          <generator class="native" />
        </id>
      </class>
    </hibernate-mapping>

Sounds easy, but what if you are using a composite key? In that case the unsaved-value attribute makes no sense. If we have a look at the documentation, NHibernate gives us a second option:

A version or timestamp property should never be null for a detached instance, so Hibernate will detect any instance with a null version or timestamp as transient, no matter what other unsaved-value strategies are specified. Declaring a nullable version or timestamp property is an easy way to avoid any problems with transitive reattachment in Hibernate, especially useful for people using assigned identifiers or composite keys!

So if you leave your version column empty, NHibernate will always detect the object as new.
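For example, a mapping along these lines adds a version property to a composite-key entity (the class and column names are illustrative, not from the post); make the corresponding C# property a nullable int so that null can signal "never saved":

```xml
<class name="Test.Data.Domain.OrderLine, Test.Data" table="OrderLines" lazy="true">
  <composite-id>
    <key-property name="OrderID" column="OrderID" />
    <key-property name="ProductID" column="ProductID" />
  </composite-id>
  <!-- A null version marks the instance as transient (new). -->
  <version name="Version" column="Version" />
</class>
```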

Friday, December 10, 2010

Impress your colleagues with your knowledge about…the volatile keyword

Sometimes when working with C# you discover hidden gems. Some of them are very useful; for others it is a little bit harder to find a good way to benefit from their functionality. One of those hidden gems that I discovered some time ago is the volatile keyword.

The volatile keyword indicates that a field might be modified by multiple threads that are executing at the same time. Fields that are declared volatile are not subject to compiler optimizations that assume access by a single thread. This ensures that the most up-to-date value is present in the field at all times.

The volatile modifier is usually used for a field that is accessed by multiple threads without using the lock statement to serialize access.

The following example demonstrates how an auxiliary or worker thread can be created and used to perform processing in parallel with that of the primary thread.


    using System;
    using System.Threading;

    public class Worker
    {
        // This method is called when the thread is started.
        public void DoWork()
        {
            while (!_shouldStop)
            {
                Console.WriteLine("Worker thread: working...");
            }
            Console.WriteLine("Worker thread: terminating gracefully.");
        }

        public void RequestStop()
        {
            _shouldStop = true;
        }

        // Keyword volatile is used as a hint to the compiler that this data
        // member is accessed by multiple threads.
        private volatile bool _shouldStop;
    }

    public class WorkerThreadExample
    {
        static void Main()
        {
            // Create the worker thread object. This does not start the thread.
            Worker workerObject = new Worker();
            Thread workerThread = new Thread(workerObject.DoWork);

            // Start the worker thread.
            workerThread.Start();
            Console.WriteLine("Main thread: starting worker thread...");

            // Loop until the worker thread activates.
            while (!workerThread.IsAlive) ;

            // Put the main thread to sleep for 1 millisecond to
            // allow the worker thread to do some work.
            Thread.Sleep(1);

            // Request that the worker thread stop itself.
            workerObject.RequestStop();

            // Use the Thread.Join method to block the current thread
            // until the object's thread terminates.
            workerThread.Join();
            Console.WriteLine("Main thread: worker thread has terminated.");
        }

        // Sample output:
        // Main thread: starting worker thread...
        // Worker thread: working...
        // Worker thread: working...
        // Worker thread: working...
        // Worker thread: working...
        // Worker thread: working...
        // Worker thread: working...
        // Worker thread: terminating gracefully.
        // Main thread: worker thread has terminated.
    }

For more information:

Thursday, December 9, 2010

Software can not be manufactured

As developers we all agree with the title of this post. Still, a lot of desperate managers and business owners keep pretending that software development is a manufacturing process at heart.


Requirements specifications are created by analysts; architects turn these specifications into a high-level technical vision. Designers fill out the architecture with detailed design documentation, which is handed to robot-like coders, who sleepily type in the design’s implementation. Finally, the quality inspector receives the completed code, which doesn’t receive her stamp of approval unless it meets the original specifications. This sounds an awful lot like the typical waterfall methodology if you ask me!

It is no wonder that managers want software development to be like manufacturing. Managers understand how to make manufacturing work, we all do. We have decades of experience in how to build physical objects efficiently and accurately. So, applying what we’ve learned from manufacturing, we should be able to optimize the software development process into the well-tuned engine that our manufacturing plants have become.

Unfortunately, the manufacturing analogy doesn’t work. Things change in business, and businesspeople know that software is soft and can be changed to meet those changing requirements. This means architecture, designs, code, and tests must all be created and revised in a fashion more agile than the leanest manufacturing processes can provide. And there we have the magic word: “Agile”. In today’s rapidly changing environment, flexibility is key, and it is only achievable through agile processes.

What do you think?

Wednesday, December 8, 2010

Deleting Team Foundation Server Team Projects

If you want to remove a team project from Team Foundation Server because it is no longer required, you can use the TFSDeleteProject command-line tool. This tool can also be used if components remain undeleted after an unsuccessful team project creation.

TFSDeleteproject [/q] [/force] [/excludewss] /collection:URL TeamProjectName

You can find the TFSDeleteProject command-line tool in Drive:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE on any client computer that runs Team Explorer.
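A typical invocation looks like this (the collection URL and project name are placeholders, substitute your own):

```
TFSDeleteProject /collection:http://yourserver:8080/tfs/DefaultCollection MyOldProject
```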

Remark: TFSDeleteProject permanently destroys the team project, after which it cannot be recovered. Back up all important project data before using TFSDeleteProject.

Tuesday, December 7, 2010

31 days of Windows Phone

Interested in the Windows Phone 7? But you don’t know where to start?

First, download the free Windows Phone 7 developer tools. The following is installed with the download:

  • Visual Studio 2010 Express for Windows Phone – Free edition of VS 2010 for Phone development.
  • Express Blend 4 for Windows Phone – Free version of Blend for Windows Phone 7 Development.
  • Silverlight for Windows Phone 7 – Rich framework for building great applications for Windows Phone 7.
  • XNA Game Studio for Windows Phone 7 – Rich framework that enables you to build great 2D and 3D games for Windows Phone 7.
  • Windows Phone Emulator – A hardware accelerated emulator that allows you to run and debug your applications and games without requiring a phone.
  • Phone Registration Tool – When you get a device, this allows you to “unlock” the device so you can run/debug your application on it, using your Marketplace account.

Afterwards, read the 31 Days of Windows Phone 7 blog series by Jeff Blankenburg. And then it’s all up to your creativity!


Monday, December 6, 2010

NHibernate 3.0 GA released

The GA (General Availability, i.e. final) version of NHibernate 3.0 was released yesterday. Go get it!

Most important improvements are the ability to use lambda expressions and a full-blown LINQ provider. Plans for version 3.1 include additional bug fixes and patches, as well as enhancements for the new LINQ provider.

Free e-books for .NET programmers

“There is no such thing as a free lunch.”

Most of the time this is true, but sometimes you find a lot of information for free!

I noticed this blog post by Anoop Madhusudanan where he mentions 7 freely available e-books for .NET programmers and architects.

If you need a (cheap) gift under the Christmas tree, you’ve found it :-)


Sunday, December 5, 2010

Visual Studio 2010 Extensions: Colored Console Application Template

With the release of Visual Studio 2010, creating and finding extensions became a lot easier thanks to the built-in Extension Manager.

One of the great extensions I discovered is the Colored Console Application Template.


Creating a Console application in Visual Studio was always easy. But the standard Console Application project is rather … empty. The moment you needed a mature console application, you probably started adding features like:

  • Functional style command line parser
  • Console Coloring
  • Help

With the Colored Console Application Template you no longer have to add these features yourself. You get them out of the box for free, along with a lot of other features.


You can download the template directly from Visual Studio through the Extension Manager or go here and install it yourself.

Saturday, December 4, 2010

Error adding test case, there is no test with specified id.

Last week I was testing the Coded UI Test feature of Visual Studio 2010. But when I ran the test in a Lab Management virtual environment, it returned an error. In the error log I found the following message:

Error adding test case [xx] to test run: There is no test with specified Id {xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}.

I found out that the error was caused by the fact that I was running the test against a build that didn’t contain the coded UI test associated with the test case. The problem was that I had scheduled a new test run in Microsoft Test Manager without changing the build.

To solve the error I queued a build, verified that the build succeeded, then changed the build associated with the test plan. So be sure that your test plan is always up to date with the correct build.

Friday, December 3, 2010

Visual Studio 2010 Feature Pack 2

Visual Studio 2010 Feature Pack 2 is now available to MSDN subscribers. It introduces some new testing features in Visual Studio 2010:

  • Use Microsoft Test Manager to capture and playback action recordings for Silverlight 4 applications.
  • Create coded UI tests for Silverlight 4 applications with Visual Studio 2010 Premium or Visual Studio 2010 Ultimate.
  • Edit coded UI tests using a graphical editor with Visual Studio 2010 Premium or Visual Studio 2010 Ultimate.
  • Use action recordings to fast forward through manual tests that need to support Mozilla Firefox 3.5 and 3.6.
  • Run coded UI tests for web applications using Mozilla Firefox 3.5 and 3.6 with Microsoft Visual Studio 2010 Premium or Visual Studio 2010 Ultimate.

For more details, see

Microsoft also created some videos on the new Testing features in Visual Studio 2010 Feature Pack 2:

Thursday, December 2, 2010

Check-in Policy override feature

In Team Foundation Server, you have the concept of check-in policies. This feature allows you to define a set of checks that have to succeed before a developer can check in code. But a developer can always override these check-in policies and check in anyway.

A lot of customers ask if they can block users from overriding the policies. To be clear: you cannot disable this feature. However, you can get alerts when someone overrides a policy:

  • Click Team --> Alerts Explorer.
  • Add a CheckinEvent.
  • Set Alert Definition to PolicyOverrideComment<>''

Wednesday, December 1, 2010

Debugging SQL with SQL Management Studio

I think we all agree that testing stored procedures and functions on the database tier can be time-consuming: they are hard to debug, and it is sometimes difficult to get clarity on what is happening. To help you understand what’s going on, you can use the built-in debugging features of Microsoft SQL Management Studio. These allow you to see exactly what is going on and step through your logic in a similar fashion as in Visual Studio.

To get you going, check the post “Debugging SQL Queries, Functions & Stored Procedures with SQL Management Studio’s Integrated Debugger” by Doug Rathbone.

Tuesday, November 30, 2010

Windows Azure SDK updated

Today the Azure team released SDK 1.3 and the updated tools for Visual Studio.

Download the SDK here:

New for version 1.3:

  • Virtual Machine (VM) Role (Beta): Allows you to create a custom VHD image using Windows Server 2008 R2 and host it in the cloud.
  • Remote Desktop Access: Enables connecting to individual service instances using a Remote Desktop client.
  • Full IIS Support in a Web Role: Enables hosting Windows Azure web roles in an IIS hosting environment.
  • Elevated Privileges: Enables performing tasks with elevated privileges within a service instance.
  • Virtual Network (CTP): Enables support for Windows Azure Connect, which provides IP-level connectivity between on-premises and Windows Azure resources.
  • Diagnostics: Enhancements to Windows Azure Diagnostics enable collection of diagnostics data in more error conditions.
  • Networking Enhancements: Enables roles to restrict inter-role traffic, fixed ports on InputEndpoints.
  • Performance Improvements: Significant performance improvements for local machine deployment.

    Lab management deployment script arguments

    Lab Management 2010 allows you to automate the build-deploy-test workflow. By following a simple wizard, you can create a build of your code, run deployment scripts to deploy that build to a test environment, and run tests inside the environment, all by just triggering a new build from Visual Studio.

    The deployment scripts run as part of the workflow can be anything from a batch file to an MSI or a PowerShell script. There are a few built-in arguments that can be passed to these scripts:


    • $(BuildLocation) – The location from which the binaries are picked up for deployment. It depends on the option chosen on the “Build” page of the “Lab Workflow Parameters” wizard: if the “Team Build” option is chosen, the drop location of the selected build is used; if a build location is specified explicitly, that location is used.
    • $(InternalComputerName_<VMName>) – The bare host name of a specific virtual machine inside the environment. Example: consider an environment named “NightlyBuild” with virtual machines named “WebServer” and “DBServer”. To refer to the computer name of the “WebServer” virtual machine, specify $(InternalComputerName_WebServer).
    • $(ComputerName_<VMName>) – Similar to the $(InternalComputerName_<VMName>) macro, but expands to the FQDN instead of the bare host name.
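    As an illustration, the lab workflow wizard could pass these macros as arguments to a simple batch file (the script name, share, and folder names below are hypothetical; the macros are expanded before the script runs):

```
REM deploy.cmd - copies the web application from the drop folder to the web server.
REM Configured in the wizard as: deploy.cmd "$(BuildLocation)" "$(InternalComputerName_WebServer)"
REM %1 = expanded build location, %2 = expanded web server host name
xcopy "%1\_PublishedWebsites\MyWebApp" "\\%2\wwwroot\MyWebApp" /s /y /i
```

    Passing the macros as script arguments keeps the batch file itself free of environment-specific names.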

    Monday, November 29, 2010

    Change the placement policy of your System Center Virtual Machine Manager configuration

    Last week a colleague was creating a new test environment using the Team Foundation Server Lab Management features when we encountered the following error:

    Error when creating a new virtual machine on <machinename>:

    TF259115: Team Foundation Server could not find any suitable host to deploy the virtual machine: <machinename>.

    Contact your administrator to fix the issues on the hosts below. (Hosts are listed in brackets)

    Memory requirement of the virtual machine(s) exceeds the available host resources. The placement policy for this host group is set to be conservative and hence virtual machines that are in stopped state are also accounted as consuming host resources. Your administrator can change this policy by running the TfsLabConfig tool.

    As our Lab Management environment has a large set of virtual machines that are in a stopped state, we quickly ran out of memory resources. We can change this behavior so that non-running machines are excluded from the memory calculation. To do so, log on to your Team Foundation application server and run the following command from the command line:

    TfsConfig.exe lab /settings /collectionName:defaultCollection /list

    This command lists the current configuration settings for your lab environment. You will see that it is using a conservative placement policy. Run the following command to change the policy:

    TfsConfig.exe lab /settings /collectionName:defaultCollection /hostGroup /edit /name:"All hosts" /labenvironmentplacementpolicy:aggressive

    Sunday, November 28, 2010

    Slow connection when connecting to Team Foundation Server from Visual Studio 2010

    At a client we were testing our newest line of development machines. After installing Visual Studio 2010, we noticed that performance was VERY slow when connecting to Team Foundation Server 2010. Performing a get latest of a large solution took several hours(!).

    Some investigation showed that it does not appear to be a TFS issue per se but rather something lower in the .NET Framework stack, having to do with automatic proxy discovery through the WPAD protocol.

    You can work around the issue by turning off proxy auto-discovery on a per-application basis using a .exe.config setting.

    The .exe.config for Visual Studio is %VSINSTALLDIR%\Common7\IDE\devenv.exe.config, and it already contains a section for System.Net. After adding the defaultProxy element, that section looks like this:

    <system.net>
       <defaultProxy enabled="false" />
       <settings>
          <ipv6 enabled="true"/>
       </settings>
    </system.net>

    Issue solved!

    Saturday, November 27, 2010

    WPF and Winforms “airspace problem”: a solution

    In a previous post, I talked about the “airspace problem”, an issue you can encounter when you are using WPF and WinForms in the same application. I created a small sample application that shows the issue and I’ve also added a possible solution.

    If you run the application, you’ll see the following user interface:


    This user interface contains the following important elements:

    • A WPF Grid control that is not visible by default
    • A WPF button
    • A WindowsFormsHost control containing a user control with a calendar control inside

    When you click the button, the WPF Grid is made visible. You can see that the WindowsFormsHost control is still shown on top of the grid (recognizable by the brown border on the left and the bottom).


    Now try again, but first select the “Solve Airspace problem” checkbox. This time the grid is correctly shown on top of the WindowsFormsHost.


    Download the code and the sample application here.

    Friday, November 26, 2010

    Learn Python the hard way

    They always say that to be a good programmer, you have to learn a new programming language every year. I am still finding my way in the Ruby world, but for people who are looking for the next challenge, I can recommend “Learn Python The Hard Way”, a free e-book for people who want to learn Python.

    It is really, really(!) basic but I found it useful to get the syntax in my fingers without having to think too much about more complex programming problems.

    Thursday, November 25, 2010

    Build fails after upgrading to Visual Studio 2010

    After upgrading our solution and projects to Visual Studio 2010, our builds started to fail with the following error:

    C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\Microsoft.Common.targets (1917,9):
    error MSB3086: Task could not find "LC.exe" using the SdkToolsPath "" or the registry key "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v7.0A".
    Make sure the SdkToolsPath is set and the tool exists in the correct processor specific location under the SdkToolsPath and that the Microsoft Windows SDK is installed

    For a reason I don’t know, it expects the 7.0A SDK, and indeed this SDK was not installed on the build server. With that knowledge, I copied the directory "Program Files\Microsoft SDKs\Windows\v7.0A" from my development machine to my build server. I then exported the "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v7.0A" registry key from my development machine and merged it into the registry of the build server. Problem solved.

    Wednesday, November 24, 2010

    Sharepoint Schema version error

    At a client, the Team Foundation Server 2010 environment was working fine until SharePoint suddenly stopped working. In the event log, I found the following error message:

    The schema version ( of the database SharePoint_AdminContent_d3718c5d-5516-41e1-ba7d-6e5a3c21ba75 on DatabaseServerX is not consistent with the expected database schema version ( on ApplicationServerY. Connections to this database from this server have been blocked to avoid data loss. Upgrade the web front end or the content database to ensure that these versions match.

    I found out that Windows Update had installed some patches that upgraded the SharePoint application front end to a higher version. Unfortunately, this made the front end and the underlying database incompatible.

    I fixed the problem by executing the following commands on the application server:

       stsadm -o upgrade -inplace -url <central admin url>
       iisreset

    Tuesday, November 23, 2010

    Team Foundation Server 2010 Extensibility

    A lot of customers ask me which parts of Team Foundation Server and the Team Explorer Client are extensible. And my answer is always the same: everything!

    To give you some ideas and examples, go to the TFS SDK page and download some of the available samples. One of the coolest is the “Extending Team Projects” sample: it extends Team Projects and Team Explorer in TFS 2010 and includes a Project Creation Wizard plug-in and a Team Explorer plug-in.

    Project Creation Wizard

    This sample uses a custom process template that declares a special group ID that includes a task list from a separate XML file. That XML file includes a set of web links arranged in a folder structure. Those links are passed to the Project Creation Wizard plug-in as the new Team Project is getting created. The plug-in also adds a new page to the wizard that lets the user enter additional web links and, optionally, to add a link to the newly created Team Project in Web Access.


    This page derives from the TeamProjectWizardPage class which provides some handy methods to manage the state of the navigation buttons at the bottom of the wizard. The LinksProjectComponentCreator takes the links entered by the user, merges them with the links specified in the process template and stores an XML representation of the data in a property on the Team Project.

    Team Explorer

    Once your new Team Project has been created, you’ll see a new folder in Team Explorer called Links that shows the links that were specified in the sample process template as well as the links you entered when you went through the Project Creation Wizard.


    More information and a complete walkthrough here.

    Monday, November 22, 2010

    Windows Workflow Foundation 4: State Machines

    With the release of .NET Framework 4, Microsoft completely rewrote the Windows Workflow Foundation (WF) implementation. One of the consequences was that the original workflow types (state machine and sequential) were replaced by new sequential and flowchart implementations, leaving state machines out of the picture.

    With the Microsoft WF State Machine Activity Pack CTP 1 Microsoft reintroduces a state machine implementation based on Windows Workflow Foundation in .NET Framework 4 (WF 4). The implementation contains not only a state machine runtime but also a graphical state machine designer.

    It’s a really great and robust CTP for everyone who needs state machine workflows. Thank you, Microsoft.


    Windows Azure Connect

    A few weeks ago, I blogged about Project Sydney. In the meantime Microsoft has released a first CTP under its official name: Windows Azure Connect.

    “Windows Azure Connect provides a simple and easy-to-manage mechanism to setup IP-based network connectivity between on-premises and Windows Azure resources. This capability makes it easier for an organization to migrate their existing applications to the cloud by enabling direct IP-based network connectivity with their existing on-premises infrastructure. For example, a company can deploy a Windows Azure application that connects to an on-premises SQL Server database, or domain-join Windows Azure services to their Active Directory deployment. In addition, Windows Azure Connect makes it simple for developers to setup direct connectivity to their cloud-hosted virtual machines, enabling remote administration and troubleshooting using the same tools that they use for on-premises applications.”

    You can register for the CTP here. If you want to know more about it, check this great blog post:

    View builds across multiple projects

    One very annoying thing in Team Foundation Server 2010 is that you can only see the builds for one team project in Team Explorer. This is especially a problem when you are running many builds and want to know which ones are already queued or running.

    Luckily this is really easy to create yourself using the TFS object model.  Create a simple console application in Visual Studio and add the following code:


    using System;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.Build.Client;

    class Program
    {
        static void Main()
        {
            var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://myTFS/MyCollection"));
            var buildServer = tfs.GetService<IBuildServer>();

            // "*" for both team project and definition name queries the queued
            // builds across every team project in the collection.
            var qbSpec = buildServer.CreateBuildQueueSpec("*", "*");
            var qbResults = buildServer.QueryQueuedBuilds(qbSpec);

            foreach (var qb in qbResults.QueuedBuilds)
            {
                var def = qb.TeamProject + @"\" + qb.BuildDefinition.Name;
                var requestedBy = qb.RequestedBy;
                if (qb.RequestedBy != qb.RequestedFor)
                {
                    requestedBy = qb.RequestedBy + " (for " + qb.RequestedFor + ")";
                }
                Console.WriteLine("{0} {1} {2} {3} {4}", qb.Status, def, qb.Priority, qb.QueueTime, requestedBy);
            }
        }
    }

    Sunday, November 21, 2010

    Logging Visual Studio Actions

    Like almost every piece of software, Visual Studio behaves strangely from time to time: weird error messages, crashes,… I’ve seen it all. A first help in tracing the source of the problem is to start Visual Studio with logging enabled. You can do this by starting Visual Studio with the /log switch:

    devenv.exe /log [filename]

    The [filename] argument is optional; if it is not specified, the log is called ActivityLog.xml by default. The path to the file is:


    An XML stylesheet (XSL) comes with the XML data, so if you view the file in your browser you will see this nice view:


    Saturday, November 20, 2010

    From developer to designer: Kuler

    Most developers are really great at writing code, but if you ask them to create a user interface, they always come up with the same boring, standard WinForms/WPF user interface. We all seem to suffer from a lack of aesthetic skills.

    To help you improve the look and feel of your windows/web application, I have one really nice tool to add to your toolbox: Kuler

    “Kuler is a web application for generating color themes that can inspire any project. No matter what you're creating, with Kuler you can experiment quickly with color variations and browse thousands of themes from the Kuler community”

    One of the features I really like is the ability to create a color theme from an image. If there is a specific painting, photo, … you like, it can be used as a starting point for your own color theme. Try it yourself…


    Friday, November 19, 2010

    HTTP 403 when opening Sharepoint page

    When I tried to open the SharePoint portal for a Team Foundation Server 2010 team project, I received the following 403 error:


    I had all the necessary rights and was connected to our domain, so I didn’t understand why this message was shown. It took me some time to find out that someone had changed the IIS server settings, switching HTTPS and SSL to Required. Therefore I got an error when I connected to the website over HTTP. A more meaningful error message would have saved me a lot of time…

    Thursday, November 18, 2010

    Could not open a Silverlight project

    When I tried to open a Silverlight project inside Visual Studio 2010, it failed with the following error message:

    “The imported project "C:\Program Files\MSBuild\Microsoft\Expression\Blend\3.0\Silverlight\Microsoft.Expression.Blend.Silverlight.targets" was not found.”

    When I tried to open the project on another machine it worked fine, so I realized it had to be an installation issue. After comparing the two systems, I noticed that the Blend 4.0 SDK wasn’t installed on the first machine. After installing the SDK, everything worked perfectly.

    Wednesday, November 17, 2010

    Enterprise Library 5 Fluent Configuration

    One of the big disadvantages of Enterprise Library was that its configuration was very XML-centric. In Enterprise Library 5 a new fluent configuration API has been added, which allows you to configure your Enterprise Library settings without writing tons of XML. The API makes configuring Enterprise Library very intuitive and easy to learn.

    To use the fluent configuration API, you create a ConfigurationSourceBuilder, the main class for building a runtime configuration. Each feature in Enterprise Library, such as the application blocks, provides extension methods for this class, so the API can be used in the same manner everywhere. Thanks to the use of extension methods, this is very intuitive and easy. The ConfigurationSourceBuilder class is located in the Microsoft.Practices.EnterpriseLibrary.Common.Configuration DLL, which you need to reference. To use the fluent configuration extension methods of a particular application block, you also need to add a reference to that application block’s DLL.

    A sample:

    var configBuilder = new ConfigurationSourceBuilder();
    configBuilder.ConfigureData()
        .ForDatabaseNamed("Northwind")
            .ThatIs
            .ASqlDatabase()
            .WithConnectionString(ConnectionString)
            .AsDefault();

    var configSource = new DictionaryConfigurationSource();
    configBuilder.UpdateConfigurationWithReplace(configSource);
    EnterpriseLibraryContainer.Current = EnterpriseLibraryContainer.CreateDefaultContainer(configSource);

    Remark: If you already have an Enterprise Library configuration in your config file, you can merge the configuration you created at runtime into it or update it.

    Tuesday, November 16, 2010

    The [TestCategory] attribute

    One of the lesser-known testing features added in Visual Studio 2010 is the TestCategory attribute. Comparable to the Category attribute in NUnit, it allows you to group your tests into functional categories. Using test categories is now the preferred way of running groups of tests, and you no longer need to deal with test lists (.vsmdi files), which are tedious to maintain and very difficult to merge.

    You can use these test categories to optimize the build experience when using Team Build 2010. In the Build Definition there is a new option that allows you to include, exclude, combine,… multiple test categories in a build. Read more about it on MSDN.
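    As a quick sketch, a test decorated with categories might look like this (the class, method, and category names are made up for illustration):

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerRepositoryTests
{
    // A test can carry multiple categories, so it can be picked up
    // by both an "Integration" run and a "Nightly" run.
    [TestMethod]
    [TestCategory("Integration")]
    [TestCategory("Nightly")]
    public void SavingACustomer_WritesARowToTheDatabase()
    {
        // ... arrange, act, assert ...
    }
}
```

    If I remember correctly, you can also filter on categories from the command line with MSTest.exe, e.g. mstest /testcontainer:MyTests.dll /category:"Integration".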

    Monday, November 15, 2010

    Daddy, when I grow up I want to be a small basic developer


    15 years ago I wrote my first program in Turbo Pascal. Nowadays, if you have children and want them to learn programming, check out Microsoft Small Basic, which combines a simple (but powerful!) language and a rich set of libraries with a friendly development environment.

    The Small Basic language draws its inspiration from an early version of BASIC, but it is actually based on the .NET Framework. Like the early BASIC variants it is modeled on, Small Basic is imperative and doesn’t expose beginners to concepts like scopes, types, object orientation, etc.

    Even though it is based on the .NET Framework, it really is small and consists of just 14 keywords. In fact, there really isn’t a type system. You can create string and numeric constants and assign them to variables. Operations performed on these variables will be interpreted according to the content. All variables are global and are always initialized, so they can be used before they're assigned.

    It’s very easy to build simple games and even share them with your friends using the Silverlight player. I couldn’t resist writing some Small Basic programs myself…