Monday, April 30, 2012

Using oData feeds in Windows 8

The oData integration in Windows 8 Metro-Style applications is not yet at the level of other kinds of applications. But to get you going, Microsoft released a preview version of the client libraries for Windows 8 Metro-Style.


How to use this version?

As the “Add Service Reference” feature for oData in Visual Studio 2012 is not available yet, you’ll have to generate the client
types manually.

  • Open a command prompt as administrator and navigate to %windir%\Microsoft.NET\Framework\v4.0.30128
  • Run the following command:
  • Afterwards you can create a Metro Style application and import the generated file.
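The command referenced in the steps above didn't survive in this post. As a hedged sketch, generating the client types is typically done with DataSvcUtil.exe from that framework folder (the service URL and output file name below are placeholders):

```shell
REM Sketch only - replace the URI and output file with your own values
DataSvcUtil.exe /uri:http://localhost:1234/MyDataService.svc /out:MyDataServiceClient.cs /DataServiceCollection /Version:2.0
```

The /DataServiceCollection switch generates types that support data binding, which is what you typically want in a Metro-Style app.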

More information here:

Friday, April 27, 2012

Compile C# code from your browser

You want to create a quick sample or test some C# code? You no longer have to open up Visual Studio; the only thing you need is a browser, thanks to this tool!
“ is an interactive web interface for the .NET compiler powered by the Roslyn Project. It allows you to quickly and easily execute snippets of C# code from your browser. You can use it to learn C#, share and test code, reproduce bugs, explore fixes, and to demonstrate and collaborate on ideas. It's fast and simple. You can use it from any device* with a web browser, without installing any additional plugins. Use it when you don't have time for a full IDE!”

Thursday, April 26, 2012

Configuring the Scrum for Team System template correctly

I really like the Scrum for Team System template for Team Foundation Server. The only problem is that it’s kind of hard to configure correctly. Only if all parameters are set correctly will the template work and the correct reports be generated.
To simplify my job as a TFS administrator, I customized the template a bit to get a head start. When you install this customized version, the correct iteration tree is created, some work items are added and the relationship between these work items is set. So no more complaints by users who cannot get this template working, and more time for me to do some useful stuff.
Until a developer (yes, I’m talking about you Dennis) decides, immediately after the Team Project was created, to remove all these preconfigured parts. Back to ‘No data available’ in all reports…
So just as a reminder to myself, what are the steps you need to take to configure the template correctly:
This is the expected iteration tree (Release -> Work Stream -> Sprint -> Team Sprint):
Planning Scope
Create the following work items and set the iteration path accordingly:
  • Release: Project\Release
  • Sprint: Project\Release\Work Stream\Sprint
  • Team Sprint: Project\Release\Work Stream\Sprint\Team Sprint
  • Sprint Backlog Items: Project\Release\Work Stream\Sprint\Team Sprint
Create the following links between these work items:
  • Release (implemented by ->) Sprint (implemented by ->) Team Sprint
More information here:

Wednesday, April 25, 2012

SQL Server 2012 Best Practices Analyzer

The Microsoft SQL Server 2012 BPA is a diagnostic tool that performs the following functions:
  • Gathers information about a server and a Microsoft SQL Server 2012 instance installed on that server
  • Determines if the configurations are set according to the recommended best practices
  • Reports on all configurations, indicating settings that differ from recommendations
  • Indicates potential problems in the installed instance of SQL Server
  • Recommends solutions to potential problems
Download it here:
Before you are able to run it, you need the following dependencies installed:

Tuesday, April 24, 2012

Unit testing Async code

.NET 4.5 introduces the async and await keywords, allowing you to write asynchronous code in a synchronous manner.
But how do I test this asynchronous code? Do I have to add Thread.Sleep() and other kind of erroneous structures everywhere to be able to validate this code?
Visual Studio 11 makes this easy by supporting async testing out-of-the-box. You can now mark a test method with the “async” keyword and use the “await” keyword within the test method body.
A sample:
[TestMethod]
public async Task TestAsyncFunctionality()
{
    var result = await GoOutAndCallMySlowWebservice();
    Assert.IsNotNull(result);
}

That’s it!

WCF Data Services: Add service reference fails

While preparing some training material about WCF Data Services v5.0, I had the following problem:
After creating my dataservice, I tried to add a reference to it from my client project. I right-clicked on the client project and chose ‘Add service reference…’. I clicked ‘Discover’ and there was my DataService in the list of available services. But when I tried to add the reference I got the following error message:
“The document at the url http://localhost:46400/OnlineShopService.svc/ was not recognized as a known document type.
The error message from each known type may help you fix the problem:
- Report from 'XML Schema' is 'The root element of a W3C XML Schema should be <schema> and its namespace should be ''.'.
- Report from 'DISCO Document' is 'Discovery document at the URL http://localhost:46400/OnlineShopService.svc/ could not be found.'.
  - The document format is not recognized.
- Report from 'WSDL Document' is 'There is an error in XML document (1, 40).'.
  - <service xmlns=''> was not expected.
Metadata contains a reference that cannot be resolved: 'http://localhost:46400/OnlineShopService.svc'.
The remote server returned an unexpected response: (405) Method Not Allowed.
The remote server returned an error: (405) Method Not Allowed.
If the service is defined in the current solution, try building the solution and adding the service reference again.”
So I browsed to the DataService metadata myself (http://localhost:46400/OnlineShopService.svc/$metadata) where I got the following error back:
It seems that WCF Data Services doesn’t like a namespace that starts with an underscore (‘_’). Changing the namespace to MyWebShop.AllAnySupport.Models instead of _1.AllAnySupport.Models solved the issue.

TF237124: Work Item is not ready to save

For a project I had to create some TFS Work Items programmatically using the TFS API. However saving the work item resulted in the following error message:
TF237124: Work Item is not ready to save
This error message doesn’t expose any useful information. What’s important to know is that a lot of rules can be configured that determine whether a work item is valid. If one of these rules is violated, saving the work item will result in the above error message. Therefore it’s important to validate the work item prior to saving it. The Validate() method returns an ArrayList of invalid fields.
ArrayList result = wi.Validate();

So what’s inside this arraylist? A quick sample:
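The original screenshot is missing here; as a sketch (assuming the Microsoft.TeamFoundation.WorkItemTracking.Client API), the list contains Field objects whose Status property tells you what is wrong:

```csharp
// Sketch only: requires a connected WorkItem instance (wi)
ArrayList invalidFields = wi.Validate();

foreach (Field field in invalidFields)
{
    // Status is a FieldStatus value, e.g. InvalidListValue
    Console.WriteLine("{0}: {1}", field.ReferenceName, field.Status);
}
```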


This will give you all the information you need to know why this work item cannot be saved. (In this case the field System.State has an InvalidListValue.)

Thursday, April 19, 2012

Principles of Writing Consistent, Idiomatic JavaScript

Looking for some style and coding guidelines for your JavaScript code?

Have a look at the Principles of Writing Consistent, Idiomatic JavaScript on GitHub.

The idea of this guideline is that:

“All code in any code-base should look like a single person typed it, no matter how many people contributed.”

Some other useful guides:

Anyone with some other good coding guidelines for JavaScript development?

Wednesday, April 18, 2012

TFS 11 Power Tools beta

Microsoft shipped a beta version of the Team Foundation Server Power Tools that work with VS 11 and are optimized to work with TFS 11.

This Power Tools release is designed to work with a VS 11 client (or Team Explorer 11) but you won’t get the VS integration in a VS 2010 or earlier client.

This beta gives you the most useful Power Tools up and running with VS 11/TFS 11. Some features of the previous Power Tools have been removed because they have been integrated into the product; at the moment no new features have been added.

The features included in this release are:

  1. TFS Power Tools
    1. TFPT Command Line
    2. The Team Explorer extensions – All are included except work item templates.
    3. Windows Shell Extension – It’s been enhanced to support local workspaces meaning no more read-only files and need to checkout, better offline support and everything else that comes with local workspaces.
    4. Process Template Editor – We’ve added support for all of the new process features in TFS 11, however some of the implementation is placeholder – for instance, editing the agile project management configuration involves editing XML.
    5. Best Practices Analyzer
    6. Test Attachment Cleaner
    7. TFS Powershell commands
  2. Build Extensions
  3. MSSCCI (32-bit) & MSSCCI (64-Bit)

Tuesday, April 17, 2012

WCF service error: This collection already contains an address with scheme https. There can be at most one address per scheme in this collection.

After moving some WCF services from one server to another, I was welcomed by the following error message:
"This collection already contains an address with scheme https. There can be at most one address per scheme in this collection."
“Googling” the error soon brought me a solution, but what caused this error message?
The solution
You can resolve this error by changing the web.config file.
With ASP.NET 4.0, add the following lines to your web.config:
 <serviceHostingEnvironment multipleSiteBindingsEnabled="true" />

With ASP.NET 2.0/3.0/3.5, add the following lines to your web.config (the prefix filter tells WCF which base address to use; fill in your preferred base address):
   <serviceHostingEnvironment>
     <baseAddressPrefixFilters>
       <add prefix=""/>
     </baseAddressPrefixFilters>
   </serviceHostingEnvironment>

The reason
IIS has web sites, which are containers for virtual applications, which in turn contain virtual directories. The application in a site can be accessed through one or more IIS bindings.

IIS bindings provide two pieces of information – binding protocol and binding information. Binding protocol defines the scheme over which communication occurs, and binding information is the information used to access the site.

Binding protocol – HTTP
Binding Information – IPAddress , Port, Hostheader
The problem is that the IIS server was configured with multiple bindings for the same scheme, which results in multiple base addresses per scheme. This causes trouble for WCF: a WCF service hosted under a site allows only one base address per scheme.
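To check which bindings a site actually has (the site name below is just an example), IIS's appcmd tool can list them:

```shell
REM List the site and its bindings (site name is an example)
%windir%\system32\inetsrv\appcmd.exe list site "Default Web Site"
```

The output shows each binding as protocol/bindingInformation (e.g. https/*:443:), so duplicate https entries are easy to spot.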

Monday, April 16, 2012

Adobe Shadow: Testing your mobile apps

Adobe Shadow



“Adobe® Shadow is a new inspection and preview tool that allows front-end web developers and designers to work faster and more efficiently by streamlining the preview process, making it easier to customize websites for mobile devices.”

  • Synchronized Browsing and Refreshing (Device Pairing) — Wirelessly pair multiple iOS and Android devices to your computer. With Shadow, you browse in Chrome, and all connected devices will stay in sync.
  • Remote inspect your code — Remote inspection and debugging capabilities. Target a device for remote inspection and using familiar development tools, make changes to your HTML, CSS and JavaScript and see your device update instantly.
  • Localhost URL support — Shadow now works correctly with URLs containing localhost. On Mac OS X, it also works with machine.local.
  • Adobe Edge integration — If you are using Adobe Edge, the Preview in Browser command now works with Shadow. You will see your animations previewing in Chrome, and on all of your Shadow devices. Note: This requires Chrome to be the default browser.
  • HTTP Authentication support — Now you can browse to URLs that require HTTP authentication, and see the login/password form on Shadow devices. If your URL contains the login id/password, Shadow devices will authenticate without the form. (e.g., using a URL like:
  • Improved workflows for sticky caches — You can now use a Refresh gesture on Shadow devices (tap on the page, hold, and pull down until you see the "Release to refresh..." message). This will reload your page, using the freshest assets.
  • URL Monitoring — Pages/apps that change URL parameters or navigate to new states using '#' anchors now work correctly. Shadow monitors the Address Bar in Chrome, and sends updates to Shadow devices as they happen.
  • Amazon Kindle Fire support — Shadow is available in the Amazon Appstore for Android, and will be installed easily on Kindle Fire devices.

Remark: As you may notice, Windows Phone isn’t supported (yet).

How does it work?

After installing the Adobe Shadow app on your mobile devices and on your local machine, you can browse to a web page on your laptop/desktop browser and have it automatically pushed to all of your devices without actually touching the device. The only caveat is that all your devices should be connected to the same (WiFi) network.

For the detailed instructions, check out the following blog post:

Friday, April 13, 2012

SQL Azure Tools: Azure User Management Console

The management options for SQL Azure are rather limited; a friendly interface was missing for a lot of functionality. AUMC is a simple application that helps you manage the users and logins of an Azure SQL database.

Go check it out!


Thursday, April 12, 2012

NHibernate: DateTime support

By default, if you load a datetime value from the database through NHibernate, the DateTimeKind will be set to ‘Unspecified’. For a specific requirement on a recent project, the DateTimeKind needed to be set to ‘Local’ instead.
So I dived into NHibernate to discover a rich set of Date/Time related functionality. Let’s create a simple object:
public class SampleEntity
{
    public SampleEntity()
    {
        CreatedOn = DateTime.Now;
    }

    public virtual Guid Id { get; private set; }
    public virtual DateTime CreatedOn { get; set; }
}

If we save this entity to the database and load it again, we see there’s a difference:

Original entity:
Id: 9245fe4a-d402-451c-b9ed-9c1a04247482
CreatedOn: 2012-04-08 11:57:22 PM (Local)

Reloaded entity:
Id: 9245fe4a-d402-451c-b9ed-9c1a04247482
CreatedOn: 2012-04-08 11:57:22 PM (Unspecified)

When creating the entity, we initialized the CreatedOn property with DateTime.Now (which is always the local time). After saving the object to the database and reloading it, the DateTimeKind has changed to ‘Unspecified’. This is because the database does not store whether the DateTime value is Local or UTC; NHibernate has no way of knowing which one it is, hence Unspecified.
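A small standalone sketch of the DateTimeKind behaviour described above (no NHibernate involved, just the BCL):

```csharp
using System;

class DateTimeKindDemo
{
    static void Main()
    {
        DateTime local = DateTime.Now;
        Console.WriteLine(local.Kind);               // Local

        // Simulate what happens on a database round trip:
        // the ticks survive, but the Kind information is lost.
        DateTime roundTripped = new DateTime(local.Ticks);
        Console.WriteLine(roundTripped.Kind);        // Unspecified

        // DateTime.SpecifyKind restores the Kind explicitly, which is
        // essentially what NHibernate's LocalDateTime type does for you.
        DateTime restored = DateTime.SpecifyKind(roundTripped, DateTimeKind.Local);
        Console.WriteLine(restored.Kind);            // Local
    }
}
```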

To solve this, NHibernate 3 includes two new DbTypes. In our mapping file, we can specify one of the following types:

<property name="CreatedOn" type="LocalDateTime"/>
<property name="CreatedOn" type="UtcDateTime"/>

By doing this we are explicitly telling NHibernate whether the database stores Local or UTC times.
For more information I recommend this blog post by James Kovacs.

Wednesday, April 11, 2012

TFS Integration Tools – March 2012 Release

Last month Microsoft released a new version of the TFS Integration Tools on the Visual Studio Gallery (with support for both Team Foundation Server 11 and Team Foundation Service).

“The TFS Integration Tools is a project developed by the Team Foundation Server (TFS) product group and the Visual Studio ALM Rangers to integrate Team Foundation Server with third party systems for migration and synchronization of data.”

The list of fixed bugs is long (I recognize too many of them ;-)):

  • TFS authorization error syncing with a hosted TFS
  • When in multi session mode, integration tools appear to have a memory leak in TfsIntegrationService.exe
Version Control
  • ClearCaseDetailedHistoryAdapter with the SnapshotStartPoint option create folders in TFS for files in ClearCase
  • Cloaked filter pairs should always be applied after non-cloaked filter pairs or a migration to TFS VC will fail
  • Detail History Adapter runs the lshistory command one level higher in the tree than specified by the filter pair causing problems
  • Infinite loop when migrating a changeset that contains a cyclic rename
  • lshistory command fails when there is a space in the ClearCase filter pair path
  • Merge+Branch operations in sync resulted in VC data corruption
  • Moving some files under a deleted folder causes the files to not be deleted on the other side
  • Resolving VC namespace conflicts results in System.InvalidOperationException: Sequence contains no elements
  • TFS VC Adapters throw NullReferenceException in ProcessChangeGroup() after initialization of MigrationProvider fails
  • VC Conflict Detection does not compare all of the possibly conflicting change groups after an error prevents the sync from reaching a sync point
  • VC Migration/Sync: A runtime error occurs rather than a named conflict for all Exceptions thrown when checking into TFS
  • VC named conflicts should stop the VC session until the conflicts are resolved (and not just stop the current round trip)
Work Item Tracking
  • Adapter doesn't deal with the DB2 date format returned by some ClearQuest 2003 APIs
  • CheckBypassRulesPermission always throws conflict when using a Hosted TFS service in a WIT session
  • ClearQuest items don't migrate from ClearQuest servers that are not configured to use UTC times
  • ClearQuestMigrationConfigGenerator.exe fails to start on 64bit - should be removed from release
  • Conflict types TFSCyclicLinkConflictType and TFSMulitpleParentLinkConflictType should both support skip resolution actions
  • CQ UserMappings that ships with TFS should work on out-of-the-box 2-way sync
  • ServerDiff Wit command does not work when the ClearQuest credentials are stored in the Windows credentials cache
  • TFS11->TFS11 AddWorkItemTest fails because WIT Server Diff says System.ChangedDate is different
  • WIT field mapping: Using multiple mapped values with "@@MISSINGFIELD@@" fails the configuration validation - it should be allowed
Internal Dogfooding
  • Dogfood Sync (VC): conflict detection is skipped in certain scenario causing content mismatch
  • Dogfood Sync (VC): VC sync blocked by transient TFS condition (bad gateway) that can generate a named conflict that must be resolved to continue
  • Dogfood Sync (WIT): After resolving a WIT Edit/Edit conflict newer revs of the accepted work item will be migrated before the conflict work item rev is migrated
  • Dogfood Sync (WIT): Basic conflict detection phase is extremely slow on Dogfood WIT sync
  • Dogfood Sync (WIT): Runtime error blocking WIT sync
  • Dogfood Sync (WIT): Work items changes backlogged on a conflict that occurred when migrating the mirrored work item are not unblocked when the conflict is resolved
  • Dogfood Sync: The Tfs2012ShellAdapter UI doesn't work correctly for configuration or conflict resolution
  • Dogfood: WITServerDiff command does not compare links properly and reports other expected differences
  • Dogfood: WITServerDiffJob should support the "ServerDiff WIT" command line options as configurable settings
  • Dogfood: WITServerDiffJob takes OutOfMemoryException

Tuesday, April 10, 2012

Running NCover on a 64bit machine

At a customer we are not using the Visual Studio Code Coverage feature to measure the unit test code coverage. Instead we are using NCover. As we are migrating from CruiseControl.NET to TFS Build, we installed NCover on our fresh new build server.

Running the build did not result in an all-green report but in the following error:


“System.DllNotFoundException: Unable to load DLL 'NCover.Lib.x64.dll': The specified module could not be found. (Exception from HRESULT: 0x8007007E)”


On the NCover forums, we found a solution for our problem. We had accidentally installed the x86 version of NCover on our 64-bit build server. Although it might work in some cases, it is not a supported configuration. So we removed the x86 version and did a clean install of the x64 version of NCover. This version includes both the 64-bit and 32-bit installations.

Friday, April 6, 2012

Check your website security using ASafaWeb

Troy Hunt, the creator of the great ebook about the OWASP Top 10 specifically targeted at .NET developers, has built a great tool to check your website security: ASafaWeb.

What’s ASafaWeb?
“ASafaWeb is the Automated Security Analyser for ASP.NET Websites. The purpose of ASafaWeb is to make scanning for common configuration vulnerabilities in live ASP.NET websites dead easy. To that effect, you don't need anything more than a URL to get started and ASafaWeb will head off and report on anything it can find which is remotely detectable.”
Whilst this is unequivocally a basic tool, it will still find configuration flaws in many web sites. The sort of flaws it finds are things like custom errors being off, YSODs with stack traces being returned, tracing still on, debug mode enabled and many, many more.
How does it work?
  • Enter the URL of your application and click Scan. Can it be any easier? (I tried it with my employer's website.)
  • Once the scan has completed, you get a nice summary report and details about all the scanned parts, problems and possible ways to fix them. Nice!

Thursday, April 5, 2012

How to check if an MsBuild property exists?

I’m converting a Nant build script to MsBuild.
In Nant I have the following statement to check if a specific build property exists:
<if test="${property::exists('application.environment')}">

To do the same thing in MsBuild you have to use the condition attribute:

       <ApplicationEnvironment Condition="'$(ApplicationEnvironment)' == ''">UAT</ApplicationEnvironment> 
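As a fuller sketch of how this fits into a project file (the property name and UAT default are taken from the snippet above; the target is illustrative): unlike NAnt's property::exists(), MSBuild treats an undefined property as an empty string, so the Condition both checks for existence and supplies a default.

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- Use UAT as the default when the property was not passed in -->
    <ApplicationEnvironment Condition="'$(ApplicationEnvironment)' == ''">UAT</ApplicationEnvironment>
  </PropertyGroup>

  <Target Name="ShowEnvironment">
    <Message Text="Deploying to $(ApplicationEnvironment)" />
  </Target>
</Project>
```

Running msbuild with /p:ApplicationEnvironment=PROD overrides the default; without the switch, UAT is used.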

Wednesday, April 4, 2012

SOAP message properties are ignored in WCF

Last week, we discovered a strange bug inside WCF.
We have the following datacontract which represents a surface size expressed in Ha, A and Ca:
[DataContract(Namespace = Constants.SCHEMANAME + "Perceel")]
public class SurfaceSize
{
    [DataMember(IsRequired = false)]
    public int? Ha { get; set; }

    [DataMember(IsRequired = false)]
    public int? A { get; set; }

    [DataMember(IsRequired = false)]
    public decimal? Ca { get; set; }
}

We noticed that only the Ha value was saved into the database. We looked at the NHibernate mapping files, the domain mapping code, the queries… everything looked okay. So we took a look at the only place that was left: the message communication itself.

The following message data was sent to us:

We discovered that changing the order of the parameters inside the message solved the problem:


No idea why WCF ignores the parameters when the order has changed. We are not using the Order attribute on our DataContracts.

Anyone who has a clue?

Tuesday, April 3, 2012

Running your build server on Azure(continued)

I talked about running your build server on Azure before. Although this solution worked, it was a cumbersome and time-consuming process. You had to install, manage and patch the build machines yourself.

Last week at VS Live, Brian Harry announced and demonstrated a better solution: a new cloud-based build service for Team Foundation Service on Azure. With this new service, you can just use a pool of build machines that are managed in the cloud (though you can still install and manage build machines if you like). And, of course, you can do more than just build – like with on-premises TFS, you can run a default workflow that includes compilation, testing, etc., or you can create a custom workflow that does whatever you like.

The new build service works by maintaining a pool of Azure VM roles that can expand and shrink as needed. When you start a build, a VM is allocated from the pool to run your build. Your build is run, the build output is copied off the build machine, then the VM is restored and returned to the pool for someone else to use.

This new build service should be enabled for all new and existing accounts on Team Foundation Service.

You can create a new build definition, queue a build, etc the same way you would if you managed the infrastructure. The one difference is that you need to pick the “Hosted Build Controller” in your build definition rather than a local build controller. And (because we don’t have UNC shares in the cloud), you configure the build output (Drop folder) to be a path in version control.

To make it work inside Visual Studio 2010, you have to install the following patch:


More information and the official announcement:

Remark: While I was typing this blog post, the hosted build controller appeared to be offline. So it seems there are still some issues (but of course it’s pre-release software).

Monday, April 2, 2012

Check-in policies in Team Explorer Everywhere

At a customer both the .NET and Java developers are using Team Foundation Server for Source Control, Work Item management, … To optimize the development process and enforce some quality checks, we use the available check-in policies. This forces the developers to check that their code compiles, the code analysis results are successful, a work item is selected, and so on…

We enable these check-in policies immediately after creating a new Team Project. So we were a little bit surprised when the Java developers came and told us that the check-in policies were not applied when working inside RAD or Eclipse.

First we thought it was a bug in the Team Explorer Everywhere plug-in. But in the end we discovered that the policies you set in Team Explorer in Visual Studio are not applied in the Eclipse plug-in.

See comment below from MS:

“Policies that you define by using Team Web Access or Team Explorer in Visual Studio are not applied when you check in by using the Team Foundation Server plug-in for Eclipse or the Cross-platform Command-Line Client for Team Foundation Server.”

So this means that when setting up the Java projects in Eclipse (RAD, Spring, …) you must also set up the policies there, and not only in the Visual Studio Team Explorer.