

Showing posts from May, 2010

Community Day 2010

If you are not registered yet for Community Day 2010 on June 24 at Utopolis Mechelen, now is the time! It will be a great event, as always, where all 14(!) Microsoft user groups will be present. Together with my colleague Gill Cleeren I will be giving the following VISUG session: Building an enterprise application with Silverlight and NHibernate [15.45 - 16.45]. See you all there!

HtmlHelpers and the Spark View Engine

As I was trying the Spark View Engine as an alternative to the WebForms view engine inside ASP.NET MVC, I at first didn't succeed in using HtmlHelpers inside my views. The problem was easy to fix: I had forgotten that the HtmlHelpers live in a separate namespace. After adding the correct namespace 'System.Web.Mvc.Html' to the SparkSettings, my views finally compiled.

var settings = new SparkSettings()
    .SetDebug(true)
    .AddAssembly("SparkDemo")
    .AddNamespace("System")
    .AddNamespace("System.Collections.Generic")
    .AddNamespace("System.Linq")
    .AddNamespace("System.Web.Mvc")
    .AddNamespace("System.Web.Mvc.Html");

ViewEngines.Engines.Add(new SparkViewFactory(settings));
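With the namespace registered, HtmlHelpers can be used in a .spark view just like in a WebForms view. A minimal sketch (the view name and fields below are made up for illustration, using Spark's ${ } expression syntax):

<!-- Index.spark -->
<p>
  ${Html.TextBox("Name")}
</p>
<p>
  ${Html.ActionLink("Back to list", "Index")}
</p>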

Parsing logfiles

For a client we are investigating a range of log files from different sources. To simplify this task we are projecting all the different log formats onto the same structured CSV file. We first started writing some custom log parsers, until I found this tool: Log Parser. An extract from the documentation: Log Parser 2.2 is a powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows operating system such as the Event Log, the Registry, the file system, and Active Directory. You tell Log Parser what information you need and how you want it processed. The results of your query can be custom-formatted in text based output, or they can be persisted to more specialty targets like SQL, SYSLOG, or a chart. Most software is designed to accomplish a limited number of specific tasks. Log Parser is different... the number of ways it can be used is limited only by the needs and imagination of the user.
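As an illustration of the kind of projection we do (the field names below assume the IIS W3C log format and are only an example; adjust them for your own sources), a single Log Parser command can read a set of IIS logs and write the selected columns to a CSV file:

LogParser.exe -i:IISW3C -o:CSV "SELECT date, time, cs-uri-stem, sc-status INTO out.csv FROM ex*.log"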

TypeDescriptor.GetProperties() vs Type.GetProperties()

While having fun with reflection in C#, I started to wonder when I should use TypeDescriptor.GetProperties() versus Type.GetProperties(). The difference is in what they return: obj.GetType().GetProperties() returns a System.Reflection.PropertyInfo[], whereas TypeDescriptor.GetProperties() returns a PropertyDescriptorCollection. The PropertyInfo class represents only the actual properties defined on the object. A PropertyDescriptor is either a custom concrete child of the PropertyDescriptor class (implemented by the type defining the custom descriptor), or an instance of ReflectPropertyDescriptor that uses the PropertyInfo class to provide dynamic invocation of the property. So for a class that does not define a custom descriptor, you functionally get the same objects back, though the PropertyDescriptor abstracts away the PropertyInfo. Where is this useful? The TypeDescriptor class is used in designers, so that they can interact with the design-time environment.
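A minimal sketch contrasting the two calls (the Person class is just an illustrative placeholder):

using System;
using System.ComponentModel;
using System.Reflection;

class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}

class Program
{
    static void Main()
    {
        // Reflection: the actual CLR properties, as PropertyInfo objects.
        PropertyInfo[] reflectionProperties = typeof(Person).GetProperties();

        // TypeDescriptor: PropertyDescriptors, which can be extended or replaced
        // at design time (for example through a custom TypeDescriptionProvider).
        PropertyDescriptorCollection descriptorProperties = TypeDescriptor.GetProperties(typeof(Person));

        foreach (PropertyInfo property in reflectionProperties)
            Console.WriteLine("PropertyInfo: " + property.Name);

        foreach (PropertyDescriptor descriptor in descriptorProperties)
            Console.WriteLine("PropertyDescriptor: " + descriptor.Name);
    }
}

For a plain class like this one, both loops print the same property names.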

Resolving schema conflicts TFS

After upgrading our Team Foundation Server environment, we ended up with two collections, each containing its own process templates, build server and so on… The concept of a Team Project Collection is a really great improvement in the TFS solution, but there is one caveat: schema conflicts can occur when a set of attributes for reportable fields differs across team project collections. When a schema conflict occurs, data that is associated with that schema cannot move into the data warehouse and the SQL Server Analysis Services data cube. You must correct all schema conflicts to unblock processing of the associated data for the warehouse and to enable the associated reports to display current data. All reportable data from all team projects that are defined in all project collections for a deployment of Visual Studio Team Foundation Server is written to a single relational data warehouse. Data from that warehouse is then processed and written to the cube. Collecting data into a si

Web Part Error: A Web Part or Web Form Control on this Page cannot be displayed or imported. The type is not registered as safe.

After upgrading our Team Foundation Server environment to 2010, we got the following error on the project dashboard web page of the TFS 2010 project portal: "Web Part Error: A Web Part or Web Form Control on this Page cannot be displayed or imported. The type is not registered as safe." I tried a lot of things (reinstalling the WSP solution, adding the controls to the web.config file, …) before I found this simple solution: after executing "Repair Connection" on all the SharePoint applications in the Team Foundation Server Administration Console, everything worked fine.

Scrum for Team System version 3 RTM is available

And just before the weekend begins, some good news: the Scrum for Team System version 3 RTM template is now available for download. To get your copy, click here. Check out the links below for more information on all the new features of SfTS v3:
PDC Presentation
Process Guidance
Crispin Parker's Blog
Simon Bennett's Blog

System.Data.SqlClient.SqlError: Cannot open backup device ''. Operating system error 5(Access is denied.)

To prepare a Team Foundation Server upgrade, I was creating backups of some databases. But I didn't get far, because the following error was shown in SQL Server Management Studio each time I tried to create a backup: System.Data.SqlClient.SqlError: Cannot open backup device 'd:\backups'. Operating system error 5(Access is denied.). (Microsoft.SqlServer.Smo) Even after logging in as Administrator, I got the same error. After some investigation I found out that it doesn't matter who *you* are logged in as; it is the service account of the SQL Server service that matters. So after changing the security permissions on the 'd:\backups' folder to give the service account (Network Service) write permissions, everything worked.
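A sketch of that permission change from an elevated command prompt, assuming the SQL Server service runs as Network Service (adjust the account if your instance uses a different service account):

icacls "d:\backups" /grant "NT AUTHORITY\NETWORK SERVICE":(OI)(CI)M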

Silverlight cross-domain trouble

The last few days I've been spending some time learning Silverlight. And as a newbie, I'm making all the beginner mistakes (and learning a lot from them). One of the obvious things you need to do if your Silverlight application calls out to external web services is to enable cross-domain calls. Most of the time you get an error message like this: "An error occurred while trying to make a request to URI 'http://localhost:1378/MyFirstSilverlightService.svc'. This could be due to attempting to access a service in a cross-domain way without a proper cross-domain policy in place, or a policy that is unsuitable for SOAP services. You may need to contact the owner of the service to publish a ……" To solve this you have to add a clientaccesspolicy.xml file to the root folder where your services are hosted and insert the following lines:

<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy
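A complete clientaccesspolicy.xml that allows calls from any domain looks like this (the wildcard values are for development only; tighten the domain uri and resource path before going to production):

<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-request-headers="*">
        <domain uri="*"/>
      </allow-from>
      <grant-to>
        <resource path="/" include-subpaths="true"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>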

MissingMethodException

During acceptance testing you always get the strangest errors logged, most of the time errors you can't reproduce in your development environment. A nice one I got this week is the following: Type: System.MissingMethodException, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089 Message: Method not found: Boolean System.Threading.WaitHandle.WaitOne(Int32). It turns out that on the customer's computer, a MissingMethodException is thrown when calling: bool signal = WaitHandle.WaitOne(0); The customer assured me they had the .NET Framework 2.0 installed. So how can this system method not be there? It turns out that this overload of the WaitOne method was introduced in .NET 2.0 SP2. However, SP2 is not installed by Windows Update and isn't even available as a separate download; it is only available as part of .NET Framework 3.5 Service Pack 1. There are two "fixes" for this problem: supply a boolean as the second argument
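A minimal sketch of that first workaround, using a ManualResetEvent as an illustrative wait handle:

using System.Threading;

ManualResetEvent handle = new ManualResetEvent(false);

// WaitOne(Int32) only exists as of .NET 2.0 SP2 and throws a
// MissingMethodException on machines that only have 2.0 RTM/SP1:
// bool signal = handle.WaitOne(0);

// WaitOne(Int32, Boolean) has been available since .NET 2.0 RTM:
bool signal = handle.WaitOne(0, false);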

MSF Agile 5.0 Tips

With the release of Visual Studio 2010 a new version of the MSF Agile process template was released. The 5.0 version is a complete rework and contains a lot of new features and concepts to better match the current trends in the Agile software development community. Understanding these changes can be a daunting task, so here are some tips from Aaron Bjork to get you going:
Tip #1 – Epics and Themes
Tip #2 – Simple User Story Titles
Tip #3 – Story Point Scales
Tip #4 – Plan Using Velocity
Tip #5 – Learn to Love Acceptance Criteria

NHibernate Basics

If you are looking for a good introduction to NHibernate, Bob Palmer has started a series of NHibernate tutorials as the first steps in a larger effort to help update the NHibernate reference documentation. For many users, starting out with NHibernate is less of a learning curve and more of a learning cliff, so the intent of this first article series is to gradually expose the user to the features of NHibernate, from basic mapping through advanced techniques. You can find the first three tutorials at these links:
Hello NHibernate! - Quickstart with NHibernate (Part 1)
Implementing a Repository with NHibernate - QuickStart with NHibernate (Part 2)
Mapping Object Relationships - QuickStart with NHibernate (Part 3)

Moving Team Project Collections between TFS servers

One of the great new functionalities in Team Foundation Server 2010 is the ability to take a team project collection and connect it to an entirely different TFS system. But there is a little more involved than you might think at first sight. Use this great guide by Aaron Block to help you move a team project collection without trouble: http://blogs.msdn.com/ablock/archive/2010/03/19/team-project-collection-deliver.aspx

VS2010 Powercommands

Last week Microsoft released the free PowerCommands for Visual Studio 2010 extension to the online gallery. So open up Visual Studio, go to the Extension Manager, search for PowerCommands, and just install it. PowerCommands adds dozens of useful commands to Visual Studio 2010. Below is a list of all the commands included in the PowerCommands for Visual Studio 2010 release:
Enable/Disable PowerCommands in Options dialog: this feature allows you to select which commands to enable in the Visual Studio IDE. Point to the Tools menu, then click Options. Expand the PowerCommands options, then click Commands. Check the commands you would like to enable. Note: all PowerCommands are enabled by default.
Format document on save / Remove and Sort Usings on save: the Format document on save option formats the tabs, spaces, and so on of the document being saved. It is equivalent to pointing to the Edit menu, clicking Advanced, and then clic

Using Unity Application Block – from basics to generics

If you are interested in the inversion of control principle and want to learn how to use the Unity Application Block, have a look at the following blog series:
Part 1: The very basics – Begin using Unity (code here)
Part 2: Registering other types and resolving them (code here)
Part 3: Lifetime Management (code here)
Part 4: Constructor and Property or Setter Injection (code here)
Part 5: Arrays (code here)
Part 6: Generics (code here)
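As a taster, a minimal sketch of the registration/resolution pattern the series starts with (ILogger and ConsoleLogger are illustrative placeholders, not types from the series):

using System;
using Microsoft.Practices.Unity;

public interface ILogger { void Write(string message); }

public class ConsoleLogger : ILogger
{
    public void Write(string message) { Console.WriteLine(message); }
}

class Program
{
    static void Main()
    {
        // Register the interface-to-implementation mapping...
        IUnityContainer container = new UnityContainer();
        container.RegisterType<ILogger, ConsoleLogger>();

        // ...and let the container build the instance when asked.
        ILogger logger = container.Resolve<ILogger>();
        logger.Write("Hello from Unity");
    }
}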

TF237002: Cannot open the document because Microsoft Excel 2007 or later, or one of its components is not installed.

One of my colleagues got the following error each time he tried to export work items to Excel from within the Team Explorer 2010 client: TF237002: Cannot open the document because Microsoft Excel 2007 or later, or one of its components is not installed. It took me some time to figure out what was wrong: during the installation of Office 2007 he had disabled the Office .NET Programmability Support. To enable it you need to modify your installed version of Office and install this option.
1. In Add/Remove Programs, locate your Office application and select it.
2. Click on the 'Change' button.
3. Select 'Add or Remove features' and click 'next'.
4. Select 'Choose advanced customization of applications' and click 'next' OR select something like 'Add .Net programmability support'.
5. In the tree view, expand 'Microsoft Office Excel' and make sure the .NET Programmability Support option is set to 'run from my comput

ReportViewer 2010

Between all the other big features in Visual Studio 2010 you could easily forget that the ReportViewer controls also received a lot of improvements. In server mode, the ReportViewer control can connect to and leverage the features of SQL Server 2008 R2 report servers, and it can still connect to previous Reporting Services server versions. In local mode, the new Visual Studio 2010 ReportViewer supports the 2008 RDLC schema, which includes features such as tablix, richly formatted text, gauges, and enhanced charts, previously introduced with Reporting Services 2008. Furthermore, a lot of work went into support for ASP.NET AJAX, improved support for standards mode and non-IE browsers, a new JavaScript API for use on the client, and many API updates to both the ASP.NET and WinForms controls. You can read more about these features on Brian's blog. The ReportViewer runtime works with .NET 3.5 as well as .NET 4.0. You can download the stand-alone runtime redistributable here.
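For example, pointing the WinForms control at a report server in remote mode looks roughly like this (the server URL and report path are placeholders):

using System;
using Microsoft.Reporting.WinForms;

// reportViewer is a ReportViewer control dropped on a Windows Forms form.
reportViewer.ProcessingMode = ProcessingMode.Remote;
reportViewer.ServerReport.ReportServerUrl = new Uri("http://myserver/reportserver");
reportViewer.ServerReport.ReportPath = "/SalesReports/MonthlySummary";
reportViewer.RefreshReport();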

Enterprise Library Configuration error: The type attribute does not exist on the element name.

Last week I finally had some time to upgrade an old project from Enterprise Library 3.1 to Enterprise Library 4.0. I got most issues solved immediately, but one issue took a lot more time to solve. When running the application I got the following error: System.Configuration.ConfigurationErrorsException, System.Configuration, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a Message: The type attribute does not exist on the element name. After comparing a newly created Enterprise Library 4 config file with the existing config file, I found one important difference: when you are using the Enterprise Library cache manager, you need to add a type="Microsoft.Practices.EnterpriseLibrary.Caching.CacheManager, Microsoft.Practices.EnterpriseLibrary.Caching" attribute to all the <add/> elements in the <cacheManagers/> collection. Or maybe it's just time to move on to Enterprise Library 5.0.
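For reference, a cacheManagers section with the type attribute in place looks roughly like this (the manager name, numeric settings and backing store name are illustrative, so check them against your own configuration):

<cachingConfiguration defaultCacheManager="Cache Manager">
  <cacheManagers>
    <add name="Cache Manager"
         type="Microsoft.Practices.EnterpriseLibrary.Caching.CacheManager, Microsoft.Practices.EnterpriseLibrary.Caching"
         expirationPollFrequencyInSeconds="60"
         maximumElementsInCacheBeforeScavenging="1000"
         numberToRemoveWhenScavenging="10"
         backingStoreName="NullBackingStore" />
  </cacheManagers>
</cachingConfiguration>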

Productive Programmer needs

This is the list of things I think are vital to let a programmer do their job effectively and efficiently:
Unlimited access to the Internet: as we cannot know everything, being able to search the Internet is just too useful.
Fast hardware: do you like to just wait while your source code is compiling?
Colleagues: nothing is as important as a colleague who can do a code review, help you solve an issue or give you a second opinion about a design idea.
Isolation: although communication and collaboration are key in software development, sometimes you just need a few hours on your own to get something done. No instant messaging, no phone calls, no meetings, just work.
And here are some more.

Move Reporting Services Reports

After moving a Team Project Collection from one TFS server to another, all my work items, source code, … were available. But I still had to move the Reporting Services reports. As I couldn't find a Microsoft way to solve this scenario, I looked for some custom tooling to get the job done. And I found this great Reporting Services Scripter. It allows you to move RDL files from one Reporting Services instance to another. Make sure you check out the different options, because I had to change some configuration settings before everything worked.

Unable to attach a Project Collection in TFS 2010

After creating a backup from an existing instance of TFS 2010 and restoring it in SQL Server using a different name, I started the Team Foundation Server Administration Tool. But the project collection attachment wizard did not recognize any available Team Project Collection DB. So what did I do wrong? It's important to understand that you cannot attach a collection database that has not been previously detached. This is mainly due to identity and configuration data that exists globally and needs to be moved down to the collection. Without this data the collection would not know who the users are that checked in code, among other things. So the correct steps to attach an existing project collection are:
Detach the project collection you want to move on the source TFS server.
Create a backup of the TFS_[CollectionName] database on the source database server.
Restore the database on the target database server. (Note: make sure that the database has a different name tha

Multiple TFS Build services on one machine

In TFS 2010, each Build Service supports zero to one Build Controllers and zero to n Build Agents. Unfortunately each build service can only serve one particular project collection. So if you need to provide build services for multiple Project Collections from a single build machine, it does not work out of the box. Jim Lamb describes a hack to enable this scenario: http://blogs.msdn.com/jimlamb/archive/2010/04/13/configuring-multiple-tfs-build-services-on-one-machine.aspx As this is not a supported configuration of the build service, use it at your own risk.

Office 2003 does not work with Team Explorer 2010

After installing Team Explorer 2010, the TFS Office add-in no longer works, because TFS requires Office 2007 or newer. Your company is not rolling out Office 2010 for another six months, so what do you do? Luckily there is a 'hacky' workaround to get the 2008 TFS Office add-ins back up and running with Excel 2003. John Nierenberg created a post describing the steps to get there. Notice that the functionality is rather limited when using the 2008 Office integration to connect to Team Foundation Server 2010: Excel with the 2008 add-in only has the ability to import flat queries and not tree queries. Project with the 2008 add-in will not round-trip the hierarchy or dependency link types. If you do a repair or update of Team Explorer, you may have to go through these steps again, as the installer will default to the newest version of the TFS Office add-ins. There are currently no plans to re-enable support for Office 2003.

How do I know that an XML element is empty?

Imagine that you have the following XML:

1: <books>
2:   <book/>
3:   <book></book>
4: </books>

How do you know the difference between line 2 and line 3 when parsing the XML? (And yes, there is a difference. Go read the XML specifications!) The IsEmptyElement property on the XmlReader class gives you this specific functionality.

while (reader.Read())
{
    if (reader.IsStartElement())
    {
        if (reader.IsEmptyElement)
            Console.WriteLine("<{0}/>", reader.Name);
    }
}

Yet another methodology

If you still haven't found a methodology you like, maybe have a look at Kanbanand. It was introduced by John Sonmez, who created a tasty mix of lean, scrum, extreme programming and some extra salt and pepper. Together this gives a nice list of process rules, development practices, infrastructure requirements and a lot of common sense. I can only agree that taking the things that work for you is always better than religiously following a methodology. So I hope to read about your methodology soon…

Error running install with custom arguments

For an ASP.NET MVC application, a web setup was needed to automate the installation procedure. One of the steps during the installation is a custom dialog that allows you to specify the database connection information. So I created a class library, added an installer class, wrote the code to show the custom form and added the DLL as a custom action to the web setup project. To pass the installation location on to the custom action I use the [TARGETDIR] argument. But no matter what I tried, I always got a FileNotFoundException. After some investigation I found my answer in the MSDN documentation: for Windows Installer properties such as [TARGETDIR] that return a directory, in addition to the brackets you must include quotation marks and a trailing backslash: /name="[TARGETDIR]\". After adding the trailing backslash to the [TARGETDIR] property everything works. I'm curious about the reason why they require this…
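For illustration (the parameter name 'targetdir' is just an example), the CustomActionData property on the custom action would then look like this, and the value can be read from the installer context inside the custom action class:

CustomActionData: /targetdir="[TARGETDIR]\"

// Inside the Installer-derived class of the custom action:
public override void Install(System.Collections.IDictionary stateSaver)
{
    base.Install(stateSaver);

    // Reads the directory passed in via CustomActionData.
    string targetDir = Context.Parameters["targetdir"];
}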

IntelliTrace iTrace Files

One really great feature in Visual Studio Team System 2010 is IntelliTrace. IntelliTrace captures the current state of the debugger at multiple points during a program's execution and, when F5 debugging, allows you to debug back in time to previous debug states in your program. This in and of itself is a very handy feature, but in this day and age it's often hard to have a bug with an easy and consistent repro that you can debug on a local dev box. The solution to this lack of a local repro is that IntelliTrace not only enhances your local debugging experience, but also saves all the collected debugger data points into a trace log file (.itrace extension) that can then be opened and debugged with Visual Studio later and on a different machine. An iTrace file allows a developer to debug in and around the point of failure after the fact. Ian Huff wrote a very interesting article explaining every little detail about these iTrace files. Check it out!

Moving to TFS 2010 ebook

Microsoft released some free chapters of the new Moving to TFS 2010 book. This book will help professional developers move from previous versions of Visual Studio. It will cover the features of Visual Studio 2010 through an application. It will go through a lot of the exciting new language features and new versions of the most popular technologies without putting the emphasis on the technologies themselves. It will instead put the emphasis on how you would get to those new tools and features from Visual Studio 2010. Download some free chapters: http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=560a5365-5c62-488a-91ed-a779e0e33ac4

Guidance Automation Extensions and Guidance Automation Toolkit 2010

The Guidance Automation Extensions + Guidance Automation Toolkit 2010 were released last week. The Guidance Automation Extensions (GAX) 2010 enables Visual Studio 2010 to run guidance packages, such as those included in Software Factories or third-party tools. You can use the Guidance Automation Toolkit (GAT) 2010 to create or customize guidance packages. Download Guidance Automation Extensions (GAX) 2010 The Guidance Automation Toolkit (GAT) 2010 is an extension to Visual Studio 2010 that allows developers to create rich, integrated developer environments that incorporate reusable assets such as frameworks, components, and patterns. The resulting guidance packages are composed of templates, wizards, and recipes that help developers build solutions in a way that is consistent with the architecture guidance. Download Guidance Automation Toolkit (GAT) 2010

Validating your architecture during a build

One of the new features in TFS 2010 is the Layer Diagram. This allows you to draw a graph with the different layers inside your application and define their dependencies. Afterwards you can link source elements (projects, files, …) to it. What makes this feature really nice is that you can let Visual Studio validate whether your code follows the dependencies as defined in the layer diagram. To enable this feature on your build server, right-click on your build definition and select Edit Build Definition. Go to the Process tab and add the following MSBuild argument to the MSBuild Arguments parameter in the Advanced node:

/p:ValidateArchitecture=true

More information here: http://blogs.msdn.com/ukvsts/archive/2010/03/11/validating-your-architecture-during-a-build.aspx

TFS 2010 Power Tools Released

After the release of TFS 2010, the TFS 2010 Power Tools couldn't wait to follow. So we've got a new release of:
TFS Power Tools April 2010 release: http://visualstudiogallery.msdn.microsoft.com/en-us/3e8c9b68-6e39-4577-b9b7-78489b5cb1da
TFS MSSCCI Provider 2010 release: http://visualstudiogallery.msdn.microsoft.com/en-us/bce06506-be38-47a1-9f29-d3937d3d88d6
TFS Build Extension Power Tool April 2010 release: http://visualstudiogallery.msdn.microsoft.com/en-us/2d7c8577-54b8-47ce-82a5-8649f579dcb6
A few features are extended, but the team didn't add any really big new stuff. An overview:
Process Template Editor (PTE): the Process Template Editor now supports all of the new 2010 features like link types, etc. They also added GUI support for a few TFS process features that have been there all along:
Support for defining link types
Support for work item type categories
Support for query folders
Support for new work item form controls: label, link labels,

Searching through source code in TFS

Although all source code is stored in a SQL Server database, there is no tooling out of the box to search through all this code. So in search of some tooling to find specific information in our TFS source repository, I stumbled on this tool: TFSSearchCode. TfsSearchCode searches for a string across text files under Team Foundation Server version control: you enter a string to search for, the tree node to search under, and a pattern of file names to match. It is developed in VB.NET on Microsoft .NET Framework 3.5. The tool is rather buggy, but if the amount of source code is limited, it seems to do its job. As I was not satisfied with this solution I continued my search. After another hour of googling I found out that using a search index tool (Lucene, SharePoint Search, …) was the best option. As SharePoint doesn't have an intrinsic protocol handler readily available for TFS, you have to resort to using SharePoint's file share mechanism to index the s

Change the regional settings of the Sharepoint site in a TFS Process Template

Last week I had an interesting customer request. For every Team Project SharePoint site they created in Team Foundation Server, the time zone information was wrong. Setting the "Default Time Zone" in the "Virtual Server Default Settings" for the "Default Web Site" makes no difference. I found out that TFS places two global templates in SharePoint, which are referenced from the Agile and CMMI process templates. These two global templates include both the Locale and Time Zone settings to be used for newly created TFS SharePoint sites. So how can we solve this? The first step is creating a new template with the correct regional settings. Therefore we have to create a new SharePoint site based on the existing templates and then change the regional settings. This site can then be saved as a new global template in SharePoint, which can then be used by the TFS process templates.
Use IE to navigate to the default web site on the TFS application tier
Create a new su