Wednesday, April 28, 2010

Check if full-text search is enabled

Recently I needed to check whether full-text search functionality is installed and enabled on a Microsoft SQL Server 2008 database.

An easy way to do this is the FullTextServiceProperty property. FullTextServiceProperty is a T-SQL function that returns information about the full-text service installed on the related Microsoft SQL Server instance. You can use it to check whether the full-text search component is installed on the current SQL Server instance and to get the status of the full-text service.

The FullTextServiceProperty function takes the property name as its input parameter.

The property relevant for this check is IsFullTextInstalled:

  • IsFullTextInstalled : Passing this property to FullTextServiceProperty tells you whether the full-text component is installed on the related instance of Microsoft SQL Server. Possible return values are 1 (full-text is installed), 0 (full-text is not installed) and NULL (invalid input or an error occurred).

A sample:

    SELECT FullTextServiceProperty('IsFullTextInstalled')
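To check whether full-text indexing is actually enabled on a specific database (as opposed to installed on the instance), the DATABASEPROPERTYEX function can be used in the same way:

```sql
-- Returns 1 if full-text indexing is enabled for the current database,
-- 0 if it is not, and NULL on invalid input.
SELECT DATABASEPROPERTYEX(DB_NAME(), 'IsFullTextEnabled');
```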

NHibernate and Linq

For those who never look at the NHibernate trunk: since December 2009 a new and improved Linq provider has been available.

Using the new Linq provider is pretty simple. It all hangs off a Query() extension method on ISession, so you can do things like the following:

    from c in session.Query<Customer>() select c

I tried it out for some time and every query I could imagine worked.
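For instance, richer queries compose as you would expect. A minimal sketch (the Customer entity and its Name and City properties are made up; this assumes an open ISession):

```csharp
using System.Collections.Generic;
using System.Linq;
using NHibernate;
using NHibernate.Linq;

public static IList<Customer> LondonCustomers(ISession session)
{
    // Query() returns an IQueryable<Customer>; the where/orderby
    // clauses are translated to SQL when the query executes.
    return (from c in session.Query<Customer>()
            where c.City == "London"
            orderby c.Name
            select c).ToList();
}
```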

Great work!

Subscribe to events using reflection

When you use reflection to load and run assemblies, you cannot use language features like the C# += operator or the Visual Basic AddHandler statement to hook up events. This MSDN document shows how to link an existing method to an event by getting all the necessary types through reflection, and how to create a dynamic method using reflection emit and link it to an event.

A sample:

    // Load an assembly, for example using the Assembly.Load
    // method. In this case, the executing assembly is loaded, to
    // keep the demonstration simple.
    Assembly assem = Assembly.GetExecutingAssembly();

    // Get the type that is to be loaded, and create an instance
    // of it. Activator.CreateInstance has other overloads, if
    // the type lacks a default constructor. The new instance
    // is stored as type Object, to maintain the fiction that
    // nothing is known about the assembly. (Note that you can
    // get the types in an assembly without knowing their names
    // in advance.)
    Type tExForm = assem.GetType("ExampleForm");
    Object exFormAsObj = Activator.CreateInstance(tExForm);

    // Get an EventInfo representing the Click event, and get the
    // type of delegate that handles the event.
    EventInfo evClick = tExForm.GetEvent("Click");
    Type tDelegate = evClick.EventHandlerType;

    // If you already have a method with the correct signature,
    // you can simply get a MethodInfo for it.
    MethodInfo miHandler =
        typeof(Example).GetMethod("LuckyHandler",
            BindingFlags.NonPublic | BindingFlags.Instance);

    // Create an instance of the delegate. Using the overloads
    // of CreateDelegate that take MethodInfo is recommended.
    Delegate d = Delegate.CreateDelegate(tDelegate, this, miHandler);

    // Get the "add" accessor of the event and invoke it late-bound,
    // passing in the delegate instance. This is equivalent
    // to using the += operator in C#, or AddHandler in Visual
    // Basic. The instance on which the "add" accessor is invoked
    // is the form; the arguments must be passed as an array.
    MethodInfo addHandler = evClick.GetAddMethod();
    Object[] addHandlerArgs = { d };
    addHandler.Invoke(exFormAsObj, addHandlerArgs);


Structuring your workitems using the Scrum for Team System Template

If you are using the great Scrum For Team System template inside Visual Studio 2010, there are some important things to remember.

First of all, try to use the included ScrumMaster WorkBench tool. This makes managing releases, sprints, sprint backlog items, … a lot easier. If you are not using this tool, it's important to know that the template is really particular about how you structure your iterations and link them to workitems. If you do this incorrectly, you end up with empty reports.

To make sure that everything works correctly, structure your workitems like this:

  • Release Work Items (Level 2)
    Must be associated with iteration path nodes at the second level of depth.
  • Sprint Work Items (Level 4)
    Must be associated with iteration path nodes at the fourth level of depth.
  • Team Sprint Work Items (Level 5)
    Must be associated with iteration path nodes at the fifth level of depth.
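A hypothetical iteration tree that satisfies these rules (the node names are made up; only the depths matter):

```
MyProject                      level 1 (team project)
└── Release 1                  level 2: associate Release work items here
    └── Release 1 Sprints      level 3 (grouping node)
        └── Sprint 1           level 4: associate Sprint work items here
            └── Team Blue      level 5: associate Team Sprint work items here
```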

Wednesday, April 21, 2010

Enterprise Library 5 Released

The long-awaited release of Enterprise Library 5 is available on the Codeplex site. Although the feature list doesn't look that different from previous versions, all the Application Blocks have undergone a massive refactoring to work with Unity, dropping the old ObjectBuilder library. The configuration tool is now also implemented in WPF.

Another nice addition is the new programmatic configuration support via a fluent interface, which makes writing configuration in code more intuitive. This is especially useful if you want to get rid of the massive amount of XML configuration and start introducing some Convention over Configuration.
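A hedged sketch of what that fluent interface looks like for the Logging block (method names are taken from the Enterprise Library 5 documentation; the category, listener name and file path are made up, and the exact namespaces should be verified against the release):

```csharp
using Microsoft.Practices.EnterpriseLibrary.Common.Configuration;
using Microsoft.Practices.EnterpriseLibrary.Logging;

var builder = new ConfigurationSourceBuilder();

// Configure the Logging Application Block entirely in code,
// no XML configuration involved.
builder.ConfigureLogging()
       .LogToCategoryNamed("General")
          .SendTo.FlatFile("Log File Listener")
             .ToFile(@"C:\logs\trace.log");

// Push the configuration into a source and build the container from it.
var configSource = new DictionaryConfigurationSource();
builder.UpdateConfigurationWithReplace(configSource);
EnterpriseLibraryContainer.Current =
    EnterpriseLibraryContainer.CreateDefaultContainer(configSource);

var writer = EnterpriseLibraryContainer.Current.GetInstance<LogWriter>();
writer.Write("Hello from fluent configuration", "General");
```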

You can download Enterprise Library 5 from here.

Friday, April 16, 2010

MSDN Live Meeting - VSTS 2010 Update

On Tuesday, April 27, I’m giving a live webcast about Managing Projects with Microsoft Visual Studio Team System 2010.

During this webcast, you will discover that by combining Visual Studio Team Foundation Server with Visual Studio, you can apply proven practices to manage your application's lifecycle, from understanding customer needs through code design and implementation to deployment. You can use the instrumentation in these tools to trace requirements to checked-in code, builds and test results.

More information and registration here:

Changing workitem types in TFS 2010

In TFS 2008 you could export and import workitem types using the witexport and witimport tools. In TFS 2010 these tools are gone, replaced by the single witadmin tool.
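For example, exporting and re-importing the Bug work item type now looks like this (the collection URL and project name are placeholders):

```
witadmin exportwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /n:Bug /f:Bug.xml
witadmin importwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /f:Bug.xml
```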



Viewing and changing Excel Reports in TFS 2010

One of the nice new features in TFS 2010 is the Excel Reports. They give you an easy-to-change view on top of the TFS warehouse. But you need the necessary rights before you can start playing with these Excel Reports.

  • Viewing an Excel report:
    • Member of the Team Foundation Valid Users security group.
  • Editing an Excel report:
    • Member of the TfsWarehouseDataReader security role in SQL Server Analysis Services. (More info here)
    • Contributor permissions in SharePoint Products for the team project.

Create (less ugly) HTML documents from Word

In earlier versions of Microsoft Word, the Save as HTML command created basic HTML documents based on your Word formatting commands.

Microsoft provides an HTML filter that you can download, install, and use alone or with Microsoft Word to strip Office-specific codes from HTML documents, creating much cleaner coding. Word 2002 and later incorporate this filter as an option by default; from the File menu (or Office Button menu in Word 2007), select Save As... to save your document. Then, under "Save as type:", select Web Page, Filtered.

Refresh the TFS 2010 Warehouse

By default the TFS warehouse is rebuilt every hour. You can however change this default behaviour: you can either change the interval of the warehouse refresh or refresh the cube manually. I knew how to do this on our TFS 2008 server, but the procedure has changed in the TFS 2010 environment.

Change the interval

To change the interval, follow these steps:

1. Login on the Application Tier

2. Browse to http://servername:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx . You get a list of all available web services.

3. Click on the ChangeSetting webservice

4. You will see a new page where you can enter the setting and its new value. Enter RunIntervalSeconds in the settingId field and the number of seconds in the newValue field

5. Click on Invoke to change the setting

Manually refresh the cube

You can also choose to refresh the cube once. To do that execute the following steps.

1. Open the list of web services again with http://servername:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx

2. Select the ProcessWarehouse web service from the list

3. Enter in the collectionName field the name of the Project Collection you want to refresh. The default collection is named DefaultCollection, but change this to the collection in use on your TFS environment

4. Click on Invoke

5. Go back to the list of webservices and click on the ProcessAnalysisDatabase webservice, and enter in the processType textbox the value Full.

6. Click on Invoke

7. The data warehouse is now refreshed. This can take some time when you have a large database. To check the status of the warehouse update, you can check the GetProcessingStatus webservice. You can leave the parameters for the webservice blank to retrieve all information. The warehouse update is ready when all jobs have the “Idle” job status.
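The same calls can be scripted instead of clicked through in the browser. A sketch using curl from the application tier (this assumes the HTTP POST protocol is enabled for the web service, which by default is only the case when calling from localhost; the parameter names are the ones shown on the service pages):

```
curl -d "collectionName=DefaultCollection" http://localhost:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx/ProcessWarehouse
curl -d "processType=Full" http://localhost:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx/ProcessAnalysisDatabase
```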

MSBuild Explorer

Browsing through your MSBuild files can be a painful experience. Last week I found the MSBuild Explorer tool that can make this job a lot easier.

MSBuild Explorer provides the following main areas of functionality.

  • Exploring MSBuild files: explore the makeup of your MSBuild files, showing all properties, item groups, imports and targets, colour coded to indicate whether they are Initial, Default or both. Targets are shown in a treeview with DependsOnTargets as subnodes.
  • Favourites: allows you to save the way you execute MSBuild files.
  • Quick Run: allows you to quickly edit and run MSBuild scripts.
  • New MSBuild File: creates a new file based on the template you have configured.

Check it out!

Start external program

To debug a Visual Studio Addin I configured the Addin project to start a new instance of Visual Studio when I run the addin inside VS.

  • Open the project properties
  • Go to the Debug tab.
  • In the Start Action part, select the Start external program option and browse to devenv.exe.

After checking my code into Team Foundation Server, I noticed that another developer had to reconfigure this setting. The reason is that this setting goes into your user options file.

I got bored having to re-set this value again and again, so I searched for a solution. Turns out that this setting goes into a file named after your project file plus the ".user" extension. This file is just a fragment of an MSBuild file, and looks something like:

    <?xml version="1.0" encoding="utf-8"?>
    <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
       <PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Debug|AnyCPU'">
          <StartAction>Program</StartAction>
          <StartProgram>C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe</StartProgram>
          <StartArguments>/rootSuffix Exp</StartArguments>
       </PropertyGroup>
    </Project>

So what you can do is copy the entire PropertyGroup to your main project file, delete the .user file, and check in your change.


It’s always fun when you get a challenging requirement. Today I had to implement a feature to create read-only Word documents. At first I was looking at using the Protect Document features to implement this functionality.


But a colleague found that there is a simpler option available: you can mark a document as final. Doing this using VSTO is nothing more than setting the Document.Final property to true. However, our journey wasn’t complete yet, because once you change this property to true, a dialog box is shown to the user and I couldn’t get rid of it.

So in the end, I changed my implementation to use OpenXML and solved it by adding a CustomPropertiesPart to the WordDocument.
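Roughly, that OpenXML approach can be sketched as follows (a hedged sketch, not our exact code: the file path is made up, and the _MarkAsFinal custom property is how Word persists Mark as Final; verify the details against the OpenXML SDK documentation):

```csharp
using DocumentFormat.OpenXml.CustomProperties;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.VariantTypes;

// Open the document and make sure it has a custom properties part.
using (WordprocessingDocument doc =
    WordprocessingDocument.Open(@"C:\temp\report.docx", true))
{
    CustomFilePropertiesPart part = doc.CustomFilePropertiesPart
        ?? doc.AddCustomFilePropertiesPart();
    if (part.Properties == null)
        part.Properties = new Properties();

    // Word reads this property as the "Mark as Final" flag,
    // so the document opens read-only without showing a dialog.
    part.Properties.AppendChild(new CustomDocumentProperty(
        new VTBool("true"))
    {
        FormatId = "{D5CDD505-2E9C-101B-9397-08002B2CF9AE}",
        PropertyId = 2,
        Name = "_MarkAsFinal"
    });
    part.Properties.Save();
}
```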

Tuesday, April 13, 2010

Changing the workitem list type in Excel

By default, if you export a workitem query result to Excel, you always get back a flat list. Execute the following steps to change this flat list to a tree list:

  1. On the Team tab in Office Excel, in the Work Items group, click Add Tree Level.

  2. In the Convert to Tree List dialog box, click Parent-Child or other type of link (only tree link types are listed), and then click Yes.

Team Foundation changes the list structure to a tree list.

TFS 2010 Work Item Link Queries

To extend our ALM offering, we are building some custom tools that interact with TFS 2010 work item tracking via the client object model.

I noticed one very strange thing while implementing some query features. When querying for links, we specify "FROM WorkItemLinks" in the WIQL and then use Query.RunLinkQuery() to get the results. The results consist of an array of WorkItemLinkInfo objects. These contain only the link type, and the source and target work item IDs. So specifying anything other than "SELECT [System.Id], [System.Links.LinkType]" has no effect.

But if we look at the built-in link-based queries in VS2010 and examine their WIQL, there are always tons of fields in the SELECT clause even though they specify "WorkItemLinks" in their FROM clause. If I run one of those queries from the client object model, those fields aren’t returned.

I’m speculating that Team Explorer does some kind of parsing that converts the WorkItemLinks query into a combination of WorkItems and WorkItemLinks queries.

Anyone who knows if I’m correct?
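The two-step pattern we ended up with can be sketched like this (types are from the TFS 2010 client object model; the server URL, project name and WIQL are hypothetical):

```csharp
using System;
using System.Linq;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

TfsTeamProjectCollection tpc = TfsTeamProjectCollectionFactory
    .GetTeamProjectCollection(
        new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
WorkItemStore store = tpc.GetService<WorkItemStore>();

// The link query only returns link info: source id, target id, link type.
string linkWiql =
    "SELECT [System.Id], [System.Links.LinkType] " +
    "FROM WorkItemLinks " +
    "WHERE [Source].[System.TeamProject] = 'MyProject' " +
    "AND [System.Links.LinkType] = 'System.LinkTypes.Hierarchy-Forward'";
Query linkQuery = new Query(store, linkWiql);
WorkItemLinkInfo[] links = linkQuery.RunLinkQuery();

// To get field values, run a second flat query batched over the target ids.
int[] targetIds = links.Select(l => l.TargetId)
                       .Where(id => id != 0)
                       .Distinct()
                       .ToArray();
Query detailQuery = new Query(store,
    "SELECT [System.Id], [System.Title] FROM WorkItems", targetIds);
WorkItemCollection items = detailQuery.RunQuery();
```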

Migrate data between Azman stores

At a couple of customers we are using Microsoft Authorization Manager (AzMan). This gives us great flexibility in managing the security aspects of our applications. But one very important feature is missing: the possibility to migrate data between AzMan stores. Follow this link to learn more about AzMan.

While there is no production tool meant to provide this, a sample application azmigrate.exe is available in the Windows SDK. This tool can export the AzMan store allowing migrations from AD to XML or from XML to AD.

To get azmigrate.exe you must download the Windows SDK and once it is installed you need to:

1) Compile the AzMigrate project using Visual Studio:

a. Select File > Open > Project/Solution and open AzMigrate.vcproj, located under %programfiles%\Microsoft SDKs\Windows\v7.0\Samples\Security\Authorization\AzMan\AzMigrate.

b. Change the Solutions Configurations drop-down in the toolbar from Debug to Release. (Don’t forget to do this. I got some compilation errors in Debug mode.)

2) In Visual Studio select Build > Build Solution, and get the compiled binary from %programfiles%\Microsoft SDKs\Windows\v7.0\Samples\Security\Authorization\AzMan\AzMigrate\Win32\Release.

Once you’ve compiled azmigrate.exe, here’s the syntax needed to export the AzMan store from Active Directory to an XML file in the C:\Temp folder:

AzMigrate.exe "msxml://C:/Temp/AzmanStore.xml" "msldap://,CN=Program Data,DC=contoso,DC=com" /o /l=C:\temp\export.log /v

The exported XML file can now be loaded in azman.msc directly, or you can import the file into another Active Directory domain where the Domain Functional Level has been raised to Windows Server 2003 by running this command:

AzMigrate.exe "msldap://,CN=Program Data,DC=child,DC=contoso,DC=com" "msxml://C:/Temp/AzmanStore.xml" /o /l=C:\temp\import.log /v

My TechDays 2010 Presentations are available

Two of my presentations are available for download.

Friday, April 9, 2010

ASP.NET MVC 2 Diagnostics

After installing the ASP.NET MVC 2 bits, I also took a look at the MVC Futures library. This library contains a bunch of features that might be included in a future version of ASP.NET MVC.  One of the things I found in there was a file called MvcDiagnostics.aspx.

I guessed that it would give me some diagnostics features, but I had no idea how to get this file working. Until I found this blog post where Brad Wilson explains how this file can help us get some diagnostic information about ASP.NET MVC:

Uploading large files in IIS 7

If you are using IIS 7 and you try to upload a file that is too big you will receive this error:

The request filtering module is configured to deny a request that exceeds the request content length.

In IIS 6 you could raise the request limit by changing the maxRequestLength attribute (a value in kilobytes) in the web.config:

    <system.web>
       <httpRuntime maxRequestLength="1000000"/>
    </system.web>

To get the request limit raised in IIS 7, you have to set the following value (in bytes) in the web.config:

    <system.webServer>
       <security>
          <requestFiltering>
             <requestLimits maxAllowedContentLength="1000000000" />
          </requestFiltering>
       </security>
    </system.webServer>

Becoming a RESTafarian

Lately I’ve been playing around a lot with REST and the REST toolkit for WCF.

One very good series of blog posts to get you started:

Step by step guide: Install Certificate Services on Windows Server 2008 R2

As I keep forgetting how to set up this stuff, I’m really glad I found this blog post:

This blog also contains a lot of other interesting step by step guides. One to remember!

Extreme Programming in a nutshell

J.D. Meier succeeds in giving a brief overview of the activities, artifacts, practices,… in Extreme Programming:

NoSQL: Getting Started with MongoDB and NoRM

As the mainstream .NET world is finally finding its way to Object Relational Mapping, the Alt.NET guys are already looking at the next logical step: document databases.

Howard van Rooijen wrote a very good introduction to help you take your first steps to a NoSQL world:

As always don’t forget; “The right tool for the right job…”

Problems configuring Team Foundation Server 2010 to use a Fully Qualified Domain Name

As transparency is important for us, we expose our Team Foundation Server environment to our customers. Of course we use a public domain name to let our customers access the TFS environment.

First I configured our DNS Server internally to ensure that the FQDN points to an internal IP Address when users are connected to the network (to avoid having to route out to the internet and back). Then for external access, I added an A record for the FQDN to the DNS zone file for our domain.

Unfortunately, when I opened the web access, the links to the SharePoint portal and the report website were still pointing to the internal server name.

So I opened the very nice TFS Admin Console, selected the Application Tier, then clicked on Change URLs. I changed the Public Url field to the FQDN and let the Server Url continue to use the machine name.

Next I selected Application Tier > SharePoint Web Applications, then selected the top row in the SharePoint Web Applications list box. By clicking 'Change SharePoint Web Application' and putting the FQDN in the Friendly Name and Web Application Url fields, our SharePoint configuration was fine.

At last I selected Application Tier > Reporting and clicked Edit. I updated the Warehouse Database, Analysis Services Database, Report Server Url and Report Manager Url so that they also used the FQDN.

After this was completed I restarted all services. But when I tried to connect to Team Foundation Server everything failed!

It took me a lot of time to figure out that I had to disable the loopback check on the TFS server. This is required because otherwise the TFS Admin Console will throw errors, as it won't be able to access your FQDN.
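For reference, disabling the loopback check comes down to adding a registry value on the server and rebooting (the standard DisableLoopbackCheck workaround):

```
reg add HKLM\SYSTEM\CurrentControlSet\Control\Lsa /v DisableLoopbackCheck /t REG_DWORD /d 1 /f
```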

After a server restart I was finally good to go!

What’s new in WPF 4?

As Microsoft is putting all its energy into Silverlight, little is left for its (little) brother WPF. Still, there are some important improvements that make WPF 4 a lot more useful than the previous version.

For an overview of the changes, have a look at this MSDN article:

Passing parameters to a Reporting Services Report

Today I had to call a Reporting Services Report by passing the parameters using the query string. As I was not sure if this was possible, I looked on SQL Server Books Online where I found this article confirming that it was possible to pass report parameters by including them in the url.

The only thing that's not super clear in that article is that the URL you normally use to browse reports doesn't accept report parameters. You have to address the report through the ReportServer endpoint instead, and then you can add the URL parameters like this:
http://myserver/ReportServer?/Reports/OrdersByCustomer&CustomerID=ALFKI

Saturday, April 3, 2010

Nice Quote

"Every system is perfectly designed to get the results it gets."

- Dr Paul Batalden