
Posts

Showing posts from April, 2010

Check if full-text search is enabled

Recently I needed to check whether full-text search functionality is installed and enabled on a Microsoft SQL Server 2008 database. An easy way to do this is the FullTextServiceProperty function. FullTextServiceProperty is a T-SQL function that returns information about the full-text service installed on the related Microsoft SQL Server instance. You can use it to check whether the full-text search component is installed on the current SQL Server instance and to get the status of the full-text service. The function takes a property name as its input parameter. Several properties can be used with the FullTextServiceProperty function. IsFullTextInstalled: if you pass this property as an input to FullTextServiceProperty, the returned value indicates whether the full-text component is installed on the related instance of Microsoft SQL Server. Possible return values are 1, 0 and NULL
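A minimal sketch of how you could run this check from .NET; the connection string and the surrounding console program are assumptions for illustration, the T-SQL itself is just a call to FULLTEXTSERVICEPROPERTY:

using System;
using System.Data.SqlClient;

class FullTextCheck
{
    static void Main()
    {
        // Hypothetical connection string; point it at the instance you want to inspect.
        var connectionString = "Data Source=.;Initial Catalog=master;Integrated Security=True";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT FULLTEXTSERVICEPROPERTY('IsFullTextInstalled')", connection))
        {
            connection.Open();
            object result = command.ExecuteScalar();

            // 1 = installed, 0 = not installed, NULL (DBNull) = invalid input.
            if (result is DBNull)
                Console.WriteLine("Full-text status could not be determined.");
            else if (Convert.ToInt32(result) == 1)
                Console.WriteLine("Full-text search is installed.");
            else
                Console.WriteLine("Full-text search is not installed.");
        }
    }
}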

NHibernate and Linq

For the people who never look at the NHibernate trunk: since December 2009 a new and improved Linq provider is available. Using the new Linq provider is pretty simple. It all hangs off a Query() extension method on ISession, so you can do things like the following:

from c in session.Query<Customer>() select c

I tried it out for some time and every query I could imagine worked. Great work!
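A slightly bigger sketch of what that looks like in practice; the Customer entity with Name and City properties is a hypothetical mapping used for illustration:

// Assumes an open ISession and a mapped Customer class with Name and City properties.
using System.Linq;
using NHibernate.Linq;   // brings the Query<T>() extension method on ISession into scope

var customersInBrussels =
    (from c in session.Query<Customer>()
     where c.City == "Brussels"
     orderby c.Name
     select c)
    .ToList();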

Subscribe to events using reflection

When you use reflection to load and run assemblies, you cannot use language features like the C# += operator or the Visual Basic AddHandler statement to hook up events. This MSDN document ( http://msdn.microsoft.com/en-us/library/ms228976.aspx ) shows how to link an existing method to an event by getting all the necessary types through reflection, and how to create a dynamic method using reflection emit and link it to an event. A sample:

// Load an assembly, for example using the Assembly.Load
// method. In this case, the executing assembly is loaded, to
// keep the demonstration simple.
//
Assembly assem = Assembly.GetExecutingAssembly();

// Get the type that is to be loaded, and create an instance
// of it. Activator.CreateInstance has other overloads, if
// the type lacks a default constructor. The new instance
// is stored as type Object, t
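Since the MSDN sample is cut off above, here is a minimal, self-contained sketch of the same idea; the Worker class, its Done event and the OnDone handler are hypothetical names used for illustration:

using System;
using System.Reflection;

public class Worker
{
    public event EventHandler Done;

    public void Finish()
    {
        EventHandler handler = Done;
        if (handler != null) handler(this, EventArgs.Empty);
    }
}

public static class Program
{
    // The existing method we want to wire up to the event at runtime.
    private static void OnDone(object sender, EventArgs e)
    {
        Console.WriteLine("Done event raised.");
    }

    public static void Main()
    {
        // Pretend the type was obtained purely through reflection.
        Type workerType = typeof(Worker);
        object worker = Activator.CreateInstance(workerType);

        // Find the event and build a delegate of the matching type
        // that points at our existing handler method.
        EventInfo doneEvent = workerType.GetEvent("Done");
        MethodInfo handlerMethod = typeof(Program).GetMethod(
            "OnDone", BindingFlags.NonPublic | BindingFlags.Static);
        Delegate handler = Delegate.CreateDelegate(
            doneEvent.EventHandlerType, handlerMethod);

        // The reflection equivalent of the C# += operator.
        doneEvent.AddEventHandler(worker, handler);

        workerType.GetMethod("Finish").Invoke(worker, null);
    }
}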

Structuring your workitems using the Scrum for Team System Template

If you are using the great Scrum for Team System template inside Visual Studio 2010, there are some important things to remember. First of all, try to use the included ScrumMaster WorkBench tool. This makes managing releases, sprints, sprint backlog items, … a lot easier. If you are not using this tool, it’s important to know that the template is really particular about how you structure your iterations and link them to workitems. If you do this incorrectly, you end up with empty reports. To make sure that everything works correctly, structure your workitems like this:
Release Work Items (Level 2): must be associated to iteration path nodes at the second level of the iteration tree.
Sprint Work Items (Level 4): must be associated to iteration path nodes at the fourth level.
Team Sprint Work Items (Level 5): must be associated to iteration path nodes at the fifth level.

Enterprise Library 5 Released

The long-awaited release of Enterprise Library 5 is available on the CodePlex site. Although the current feature list doesn’t look that different from previous versions, a massive refactoring was done on all the Application Blocks to make them work with Unity, dropping the old ObjectBuilder library. Also, the configuration tool is now implemented in WPF. Another nice addition is the new programmatic configuration support via a fluent interface, which makes writing configuration in code more intuitive. This is especially useful if you want to get rid of the massive amount of XML configuration and start introducing some Convention over Configuration. You can download Enterprise Library 5 from here.

MSDN Live Meeting - VSTS 2010 Update

On Tuesday, April 27, I’m giving a live webcast about Managing Projects with Microsoft Visual Studio Team System 2010. During this webcast, you will discover that by combining Visual Studio Team Foundation Server with Visual Studio, you can apply proven practices to manage your application's lifecycle, from understanding customer needs through code design and implementation to deployment. You can use the instrumentation in these tools to trace requirements to checked-in code, builds and test results. More information and registration here: https://msevents.microsoft.com/CUI/WebCastEventDetails.aspx?EventID=1032447674&EventCategory=2&culture=en-US&CountryCode=US

Changing workitem types in TFS 2010

In TFS 2008 you could export and import workitem types using the witimport and witexport tools. In TFS 2010 these tools are gone and have been replaced by the witadmin tool.
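A quick sketch of the equivalent witadmin commands, with placeholder collection, project and workitem type names:

witadmin exportwitd /collection:http://yourserver:8080/tfs/DefaultCollection /p:YourProject /n:Bug /f:Bug.xml
witadmin importwitd /collection:http://yourserver:8080/tfs/DefaultCollection /p:YourProject /f:Bug.xml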

Viewing and changing Excel Reports in TFS 2010

One of the nice new features in TFS 2010 is the Excel Reports. They give you an easy-to-change view on top of the TFS warehouse. But you need to have the necessary rights before you can start playing with these Excel Reports.
Viewing an Excel report: member of the Team Foundation Valid Users security group.
Editing an Excel report: member of the TfsWarehouseDataReader security role in SQL Server Analysis Services (more info here) and Contributor permissions in SharePoint Products for the team project.

Create (less ugly) HTML documents from Word

In earlier versions of Microsoft Word, the Save as HTML command created basic HTML documents based on your Word formatting commands. Microsoft provides an HTML filter that you can download, install, and use alone or with Microsoft Word to strip Office-specific codes from HTML documents, creating much cleaner code. Word 2002 and later incorporate this filter as an option by default: from the File menu (or Office Button menu in Word 2007), select Save As... to save your document. Then, under "Save as type:", select Web Page, Filtered.

Refresh the TFS 2010 Warehouse

By default the TFS warehouse is rebuilt every hour. You can however change this default behaviour: you can either change the interval of the warehouse refresh or refresh the cube manually. I knew how to do this on our TFS 2008 server, but the procedure has changed in the TFS 2010 environment.
Change the interval
To change the interval, follow these steps:
1. Log in on the Application Tier.
2. Browse to http://servername:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx . You get a list of all available web services.
3. Click on the ChangeSetting web service.
4. You will see a new page where you can enter the setting and its new value. Enter RunIntervalSeconds as the settingId and the number of seconds as the newValue.
5. Click on Invoke to change the setting.
Manually refresh the cube
You can also choose to refresh the cube once. To do that, execute the following steps:
1. Open the list of web services again with http://servernam

MSBuild Explorer

Browsing through your MSBuild files can be a painful experience. Last week I found the MSBuild Explorer tool that can make this job a lot easier. MSBuild Explorer provides the following main areas of functionality:
Exploring MSBuild files: explore the makeup of your MSBuild files, showing all properties, item groups, imports and targets, which are colour coded to indicate whether they are Initial, Default or both. Targets are shown in a treeview with DependsOnTargets as subnodes.
Favourites: allows you to save the way you execute MSBuild files.
Quick Run: allows you to quickly edit and run MSBuild scripts.
New MSBuild File: creates a new file based on the template you have configured.
Check it out!

Start external program

To debug a Visual Studio Addin I configured the Addin project to start a new instance of Visual Studio when I run the addin inside VS: open the project properties, go to the Debug tab and, in the Start Action part, select the Start external program option and browse to devenv.exe. After checking my code into Team Foundation Server, I noticed that another developer had to reconfigure this setting. The reason is that this setting goes into your user options file. I got bored having to re-set this value again and again, so I searched for a solution. Turns out that this setting goes to a file named after your project file plus the ".user" extension. This file is just a fragment of an MSBuild file, and would look something like:

<?xml version="1.0" encoding="utf-8" ?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup Condition="&

VSTO Fun

It’s always fun when you get a challenging requirement. Today I had to implement a feature to create read-only Word documents. First I was looking at using the Protect Document features to implement this functionality, but a colleague found that there is a simpler option available: you can mark a document as final. Doing this using VSTO is nothing more than setting the Document.Final property to true. However, our journey wasn’t complete yet, because once you change this property to true a dialog box is shown to the user and I couldn’t get rid of it. So in the end, I changed my implementation to use OpenXML and solved it by adding a CustomPropertiesPart to the WordDocument.
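For reference, a minimal sketch of the VSTO/interop route described above; the wordApplication variable is an assumed, already running Word.Application instance:

using Word = Microsoft.Office.Interop.Word;

// Marks the active document as final (the same flag as "Mark as Final" in the UI).
// As described above, Word shows a dialog to the user when this flag is flipped.
Word.Document document = wordApplication.ActiveDocument;
document.Final = true;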

Changing the workitem list type in Excel

By default, if you export a workitem query result to Excel, you always get back a flat list. Execute the following steps to change this flat list to a tree list: on the Team tab in Office Excel, in the Work Items group, click Add Tree Level. In the Convert to Tree List dialog box, click Parent-Child or another type of link (only tree link types are listed), and then click Yes. Team Foundation changes the list structure to a tree list.

TFS 2010 Work Item Link Queries

To extend our ALM offering, we are building some custom tools that interact with TFS 2010 work item tracking via the client object model. I noticed one very strange thing while implementing some query features. When querying for links, we specify "FROM WorkItemLinks" in the WIQL and then use Query.RunLinkQuery() to get the results. The results consist of an array of WorkItemLinkInfo objects. These contain only the link type and the source and target work item IDs, so specifying anything other than "SELECT [System.Id], [System.Links.LinkType]" has no effect. But if we look at the built-in link-based queries in VS2010 and examine their WIQL, there are always tons of fields in the SELECT clause even though they specify "WorkItemLinks" in their FROM clause. If I run one of those queries from the client object model those fields aren’t returned. I’m speculating that Team Explorer does some kind of parsing that converts the WorkItemLinks query to a c
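A sketch of the pattern described above; it assumes you already have a WorkItemStore from the client object model (for example via teamProjectCollection.GetService<WorkItemStore>()), and the project name is a placeholder:

using System;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

string wiql =
    "SELECT [System.Id], [System.Links.LinkType] " +
    "FROM WorkItemLinks " +
    "WHERE [Source].[System.TeamProject] = 'MyProject'";

Query query = new Query(workItemStore, wiql);
WorkItemLinkInfo[] links = query.RunLinkQuery();

foreach (WorkItemLinkInfo link in links)
{
    // Only the IDs and the link type come back; the work items themselves
    // have to be fetched separately if you need more fields.
    Console.WriteLine("{0} -> {1} (link type {2})",
        link.SourceId, link.TargetId, link.LinkTypeId);
}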

Migrate data between Azman stores

At a couple of customers we are using the Microsoft Authorization Manager (AzMan). This gives us great flexibility in managing the security aspect of our applications. But one very important feature is missing: the possibility to migrate data between AzMan stores. Follow this link to learn more about AzMan. While there is no production tool meant to provide this, a sample application, azmigrate.exe, is available in the Windows SDK. This tool can export the AzMan store, allowing migrations from AD to XML or from XML to AD. To get azmigrate.exe you must download the Windows SDK and once it is installed you need to:
1) Compile the AzMigrate project using Visual Studio:
a. Select File > Open > Project/Solution.
b. Change the Solution Configurations drop-down in the toolbar from Debug to Release. (Don’t forget to do this. I got some compilation errors in Debug mode.)
c. Select AzMigrate.vcproj located under %programfiles%\Microsoft SDKs\Windows\v7.0\Sam

ASP.NET MVC 2 Diagnostics

After installing the ASP.NET MVC 2 bits, I also took a look at the MVC Futures library. This library contains a bunch of features that might be included in a future version of ASP.NET MVC. One of the things I found in there was a file called MvcDiagnostics.aspx. I guessed that it would give me some diagnostics features, but I had no idea how to get this file working. Until I found this blog post where Brad Wilson explains how this file can help us get some diagnostic information about ASP.NET MVC: http://bradwilson.typepad.com/blog/2010/03/diagnosing-aspnet-mvc-problems.html

Uploading large files in IIS 7

If you are using IIS 7 and you try to upload a file that is too big, you will receive this error: "The request filtering module is configured to deny a request that exceeds the request content length." In IIS 6 you could raise the request limit by changing the maxRequestLength attribute (specified in kilobytes) in the web.config:

<system.web>
  <httpRuntime maxRequestLength="1000000" />
</system.web>

To get the request limit raised in IIS 7, you have to set the following value (specified in bytes) in the web.config:

<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="1000000000" />
    </requestFiltering>
  </security>
</system.webServer>

Becoming a RESTafarian

Lately I’ve been playing around a lot with REST and the REST toolkit for WCF. One very good series of blog posts to get you started:
REST in WCF – Part I (REST Overview)
REST in WCF – Part II (AJAX Friendly Services, Creating The Service)
REST in WCF – Part III (AJAX Friendly Services, Consuming The Service)
REST in WCF – Part IV (HI-REST – Exposing a service via GET – Configuring the service)
REST in WCF – Part V (HI-REST – Exposing a service via GET – The ServiceContract and Implementation)
REST in WCF – Part VI (HI-REST – Consuming our GET service via AJAX)
REST in WCF – Part VII (HI-REST – Implementing Insert and Update)
REST in WCF – Part VIII (HI-REST – Implementing Delete)
REST in WCF – Part IX – Controlling the URI
REST in WCF – Part X – Supporting Caching and Conditional GET
REST in WCF – Part XI (Tunneling PUT through POST)
Exposing ATOM feeds from your services
Exposing ATOM feeds from your services – part II – customizin

Extreme Programming in a nutshell

J.D. Meier succeeds in giving a brief overview of the activities, artifacts, practices,… in Extreme Programming: http://blogs.msdn.com/jmeier/archive/2010/04/06/extreme-programming-xp-at-a-glance.aspx

NoSQL: Getting Started with MongoDB and NoRM

As the mainstream .NET world is finally finding its way to Object Relational Mapping, the Alt.NET guys are already looking at the next logical step: document databases. Howard van Rooijen wrote a very good introduction to help you take your first steps into the NoSQL world: http://howard.vanrooijen.co.uk/blog/2010/04/04/a-dotnet-developer-guide-to-mongodb-and-norm/ As always, don’t forget: “The right tool for the right job…”

Problems configuring Team Foundation Server 2010 to use a Fully Qualified Domain Name

As transparency is important for us, we expose our Team Foundation Server environment to our customers. Of course we use a public domain name to let our customers access the TFS environment. First I configured our DNS server internally to ensure that the FQDN points to an internal IP address when users are connected to the network (to avoid having to route out to the internet and back). Then, for external access, I added an A record for the FQDN to the DNS zone file for our domain. Unfortunately, when I opened the web access, the links to the SharePoint portal and the report website were still pointing to the internal server name. So I opened the very nice TFS Admin Console, selected the Application Tier, then clicked on Change URLs. I changed the Public Url field to the FQDN and let the Server Url continue to use the machine name. Next I selected Application Tier > SharePoint Web Applications, then selected the top row in the list within the SharePoint Web Applications lis

What’s new in WPF 4?

As Microsoft is putting all its energy into Silverlight, little is left for its (little) brother WPF. Still, there are some important improvements that make WPF 4 a lot more useful than the previous version. For an overview of the changes, have a look at this MSDN article: http://msdn.microsoft.com/en-us/library/bb613588(VS.100).aspx

Passing parameters to a Reporting Services Report

Today I had to call a Reporting Services report by passing the parameters using the query string. As I was not sure if this was possible, I looked in SQL Server Books Online, where I found this article confirming that it is possible to pass report parameters by including them in the URL. The only thing that's not super clear in that article is that when you just navigate to the server running SSRS, the URL probably looks something like this: http://myserver/Reports/Pages/Report.aspx?ItemPath=%2fReports%2fOrdersByCustomer If you just put the parameters on that URL it doesn't work. You have to change the URL to point at the report server instead: http://myserver/ReportServer?/Reports/OrdersByCustomer Then you can add URL parameters like this: http://myserver/ReportServer?/Reports/OrdersByCustomer&CustomerID=ALFKI
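Multiple parameters can be chained the same way; a hypothetical example that also asks the report server to render straight to PDF using the rs:Format URL access parameter: http://myserver/ReportServer?/Reports/OrdersByCustomer&CustomerID=ALFKI&rs:Format=PDF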