Friday, October 29, 2010

Could not create a new TFS project

Last week I had to create a new Team Project in Team Foundation Server. Easy job of course, until the project creation failed while adding a new Sharepoint site. In the error log, I found the following information:

The transaction log for database 'WSS_Content' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases" Query text (if available): "{?=call proc_CreateWeb(?,?,?,?,?,?,?,?,?,?,?,?,?,?)}

Aha, that’s a meaningful error message. I just had to shrink the transaction log. I looked at one of my previous posts (Truncate your SQL Server Transaction Logs) and tried to execute the following statement inside SQL Server Management Studio:

    USE WSS_Content
    GO

    BACKUP LOG WSS_Content WITH TRUNCATE_ONLY

    DBCC SHRINKFILE('WSS_Content_Log', 1)


But strangely enough this command failed. Instead the following error message was returned:


‘truncate_only' is not a recognized BACKUP option.


I found out that TRUNCATE_ONLY is no longer supported in SQL Server 2008. Instead I had to break the log chain by switching the database to the simple recovery model. I completed the job by executing these commands:

    USE WSS_Content
    GO

    -- Switching to simple recovery breaks the log chain
    ALTER DATABASE WSS_Content SET RECOVERY SIMPLE
    GO

    DBCC SHRINKFILE('WSS_Content_Log', 1)
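As the original error message suggested, you can also check why SQL Server cannot reuse the log space by querying the log_reuse_wait_desc column (a small diagnostic query; WSS_Content is the database from the error above):

```sql
-- Shows the recovery model and the reason why the log space cannot be reused
-- (LOG_BACKUP, for example, means a log backup is pending)
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'WSS_Content';
```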

Thursday, October 28, 2010

10 reasons to do CQRS

Rinat Abdullin posted a PDF that maps out 10 reasons to use Command-Query Responsibility Segregation in your development. These reasons cover the benefits of CQRS and the things it enables: Domain-Driven Design, Event Sourcing, cloud computing, etc.

10 Reasons to do CQRS

Download PDF | Permalink

Wednesday, October 27, 2010

Scrum for Team System Template and Sharepoint problems

After upgrading a TFS installation from 2008 to 2010 at one of my clients, the default.aspx in the SharePoint portal failed to load for every TFS site that was using the Scrum for Team System template. Instead I got a 404 error. I noticed that the problem only occurred for the earliest projects, which were created using the 1.2 version of the template. Projects created with the 2.x version kept working.


So I took a look in the log at C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\LOGS. Amid all the information I found the following error message:

Cannot get ghost document: 1033\SCRUM\default.aspx
Failed to find <ListTemplate> tag corresponding to ID "104", tried both onet.xml for site definition ID "10719" language "1033" and global site definition. Operation failed.
Failed to determine the setup path of the list schema for list template 104.

The issue shows itself as existing SFTS 1.2 projects returning a 404 error when navigating to the root of the Team Portal. It is caused by the removal of files required for the WSS template. I found the following blog post that describes the steps to fix the problem: http://consultingblogs.emc.com/sfts/archive/2008/04/28/sharepoint-fix-for-scrum-for-team-system-2-1-upgrades.aspx

However, after executing the steps mentioned in the blog post, the sites still failed to load. I restored the backup files again and tried another solution. I browsed directly to the configuration pages by adding ‘_layouts/settings.aspx’ to the site’s URL. There I chose to reset the style by clicking the Reset to site definition option. Afterwards I was finally able to load the sites (although the layout was really messed up).

Tuesday, October 26, 2010

Project Sydney: The mystery will be revealed soon

One of the questions we get a lot from customers is how they can connect their Azure application to an on-premises database (for legal considerations, …). A logical scenario, but not an easy one to answer. One possible answer to this question is Project Sydney, announced by Microsoft last year at PDC 2009. It would be a technology that enables customers to securely connect their on-premises and cloud servers. Some of the underlying technologies enabling it include IPsec, IPv6 and Microsoft’s Geneva federated-identity capability. It could be used for a variety of scenarios, such as allowing developers to fail over cloud apps to on-premises servers, or to run an app that is structured to span both on-premises and cloud servers.

Unfortunately it became really quiet after this first announcement. But now as PDC 2010 is approaching, I hope they’ll finally lift the curtain.

It’s an exciting time for cloud developers…

NHibernate Day Videos

If you couldn’t attend the European NHibernate Day, all the sessions are recorded and all the presentations are available for download linked from the agenda on the NHDay site.

There were 2 tracks: the main track, devoted entirely to NHibernate, and a second track whose topics were selected by the attendees.

Watch the videos here:

  • Main Room, with Ayende, Gianmaria Ricci and Alessandro Giorgetti
  • Small Room, with the other speakers

Monday, October 25, 2010

Enable Excel 2007 integration with TFS 2010

If you installed Office after installing Team Explorer 2010, it’s possible that the Excel button in Team Explorer and the Team Add-in in Excel are disabled.

To (re-)enable this, execute the following steps:

  1. Open Excel 2007.
  2. Click the Office button in the left corner. A menu is loaded.
  3. Click the Excel Options  button at the bottom of the menu.
  4. Select the Add-Ins tab on the left. A list of available add-ins is shown.
  5. At the bottom select COM Add-ins from the dropdown and press Go.
  6. Check the Team Foundation Add-in and click OK. The Team bar should be available.

Remark: TFS 2010 requires Office 2007 or higher for the Office integration. However, it’s still possible to use Office 2003 by combining Team Explorer 2008 with the Forward Compatibility pack to connect to TFS 2010 (although you’ll lose some functionality).

Sunday, October 24, 2010

TFS 2010 Management Pack

We had to wait a long time but now it’s finally there, the TFS 2010 Management Pack:

http://www.microsoft.com/downloads/en/details.aspx?FamilyID=97ca3b31-3653-4d60-bdad-3f2017febdc3&displaylang=en

It provides both proactive and reactive monitoring of Microsoft Team Foundation Server 2010. It monitors TFS components such as application tier server instances, team project collections, build servers, and proxy servers. It is a substantial upgrade from the TFS 2008 Management Pack.

Feature Summary
The monitoring provided by this management pack includes availability and configuration monitoring, performance data collection, and default thresholds. You can integrate the monitoring of Team Foundation Server components into your service-oriented monitoring scenarios.

  • Auto discovery of TFS components
  • Implements containment hierarchy, reflecting logical architecture of the Product
  • Implements a proper health model using Monitors
  • Contains tasks, diagnostic and recovery for certain failures
  • Provides events that indicate service outages
  • Provides alerts that show configuration issues and connected data source changes
  • Verification that all dependent services are running
  • Triggers targeted running of BPA against TFS Servers from Operator Console

Note: This management pack will only work with TFS 2010 servers. If you want to monitor previous versions of TFS, you will need to continue to use the older version of the management pack:

http://www.microsoft.com/downloads/en/details.aspx?FamilyID=28c745b5-28cc-474a-a5fd-944c246d7727&displaylang=en.

Saturday, October 23, 2010

WCF service not available

Last week, some of my integration tests suddenly started failing on our build environment. I noticed that these tests were using a WCF service, so I logged on to the web server and checked the event viewer. There I found the following error message:

It is not possible to run two different versions of ASP.NET in the same IIS process. Please use the IIS Administration Tool to reconfigure your server to run the application in a separate process.

What happened? Someone had installed a newer version of this WCF service, built against .NET Framework 4.0 instead of 3.5. But he forgot to update the application pool to run under ASP.NET 4.0, with the error above as a result.

Solving this was easy: I created a new application pool running under ASP.NET 4.0 and configured the web service to run in this process.

Friday, October 22, 2010

Test Results not visible in TFS 2010 Build Report

Last week I had a problem at a client who ran a TFS build with tests and code coverage, but the build summary report only showed No Test Results and No Code Coverage Results. The strange thing is that when I logged in and viewed the build, I could see the test results…

As you can guess, this problem is related to some missing security permissions. To be able to see test results you must have the View Test Runs permission, which can be assigned/revoked at the security group level per team project.

So setting this permission for the Contributors group solved the problem.

Thursday, October 21, 2010

Vingy, a search plug-in for Visual Studio

The first thing that most developers do when they have a problem is google it. But then you are overwhelmed by the number of results. So if you want less quantity and more quality, you start searching on sites like StackOverflow, CodeProject, MSDN, etc.

With Vingy – a simple but effective add-in for Visual Studio 2010 – you can search all these sites without leaving your Visual Studio environment.

Installing Vingy

Vingy is free and you can install it from this direct link or via the Extension Manager in Visual Studio 2010 (search for Vingy in the Online Gallery). Enjoy, happy coding!!

Let me know your feedback. This version is minimal, but I’ll add more meat based on your feedback. Also, planning to make it open source after a bit of code cleanup.

Using Vingy

You can bring up Vingy either by clicking View->Other Windows –> Vingy Search Window from the Visual Studio IDE, or just by highlighting some text in the document and then clicking Tools –> Search Selected Text (Ctrl + 1).

Wednesday, October 20, 2010

New Types and Members in the .NET 4 Framework

If you want to be absolutely sure that you didn’t miss a new type, property or method inside the .NET 4.0 framework, have a look at the following links. They provide a comprehensive list of the new namespaces, types, and members in the .NET Framework version 4 and provide links to the corresponding reference topics.

Tuesday, October 19, 2010

Create a load test result database

One of the nice features of Visual Studio 2010 Ultimate is load testing. It allows you to schedule a set of automated test scenarios using multiple criteria like the number of users, distribution, network latency, and so on. The results of these load tests are stored in a SQL database.

Before you can save the load test results to the database, a database schema must be created.

To create the load test database:
  1. Open a Visual Studio command prompt and browse to the following location: cd C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE
  2. In that folder, type the following command:

    SQLCMD /S localhost\sqlexpress /i loadtestresultsrepository.sql

Remark: The parameters are case sensitive. You must type an uppercase S and a lowercase i.

Monday, October 18, 2010

Hosting a .NET 4.0 REST service in IIS 6

Last week I had to deploy a WCF 4.0 REST service to an IIS 6 environment. When I ran the service locally, everything worked perfectly. In the route table I had configured that my service was accessible at http://localhost/servicename/.
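For context, the route registration for such an extensionless WCF 4 REST service typically looks something like this in Global.asax (a sketch; MyRestService stands in for the actual service class):

```csharp
using System;
using System.ServiceModel.Activation;
using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Makes the service reachable at http://localhost/servicename/
        // without needing a physical .svc file on disk
        RouteTable.Routes.Add(new ServiceRoute("servicename",
            new WebServiceHostFactory(), typeof(MyRestService)));
    }
}
```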

But when I tried to call the service in the server environment, I got nothing back. I knew that by default IIS expects that a request can be mapped to a physical file on disk. So I could have added an SVC file, but I didn’t need one on my local machine, and I wanted a better solution.

It took me some time but with these steps I got everything working as expected:

  1. Open the IIS manager. Right-click the virtual directory where the service is installed and select Properties. On the Virtual Directory tab, click the Configuration button. Insert a new wildcard mapping pointing to C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\aspnet_isapi.dll. Don’t forget to uncheck Verify that file exists.
  2. Go back to the Properties and open the Directory Security tab. Under Authentication and access control, click the Edit button. Uncheck Integrated Windows Authentication.
  3. Reset IIS (really important!).

Sunday, October 17, 2010

Create a backup of your SQL Azure database

One of the questions I get a lot from customers is how they can back up their SQL Azure database. You cannot just open SQL Server Management Studio, create a maintenance plan and schedule a backup every night.

So what do we have to do instead?

Let me first mention that in Azure your data is always stored on multiple instances on different, independent hardware in the data center. At any one time, there are three replicas of the data running – one primary replica and two secondary replicas. On top of that, Microsoft performs periodic offsite backups of the data in case of a catastrophic failure at the data center. This already limits the risk of losing your data in case of a hardware failure. But of course a user can always make a mistake and accidentally remove some important data.

So even on SQL Azure a good backup strategy is vital.

To perform a backup in SQL Azure, a transactional mechanism is used that copies the data without any downtime for the source database. The database is copied in full to a new database in the same data center. You can choose to copy to a different server (in the same data center) or to the same server with a different database name.

A new database created from the copy process is what they call ‘transactionally consistent’ with the source database at the point in time when the copy completes. This means that the snapshot time is the end time of the copy, not the start time of the copy. This is important to keep in mind when you plan your backup schedule.

Create a backup

A backup can be created by executing the following T-SQL:

    CREATE DATABASE destination_database_name
    AS COPY OF [source_server_name.]source_database_name

This command must be executed while connected to the master database of the destination SQL Azure server.

Monitoring the backup process

You can monitor the database copy in progress by querying a new dynamic management view called sys.dm_database_copies.

    SELECT * FROM sys.dm_database_copies
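You can also check the state of the new database in sys.databases; while the copy is in progress its state_desc shows COPYING, and it becomes ONLINE once the copy has completed:

```sql
-- Run on the destination server; substitute your own database name
SELECT name, state_desc
FROM sys.databases
WHERE name = 'destination_database_name';
```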

Permissions Required

Note: When you copy a database to a different SQL Azure server, the exact same login/password executing the command must exist on the source server and destination server. The login must have db_owner permissions on the source server and dbmanager on the destination server.

Saturday, October 16, 2010

Porting an existing application to SQL Azure

You finally finished your ASP.NET MVC application, but now you want to bring it to the cloud. How do you get started?

In this blog post by Steve Yi, he ports the MVC Music Store sample to Azure.  You’ll see that getting your (web) application running in the cloud is simple and straightforward.

Friday, October 15, 2010

Debugging a WCF Data Service

Creating a WCF Data Service is really easy thanks to the built-in item template. Unfortunately, when something goes wrong, a useless “The server encountered an error processing the request” message is returned.

There are 2 important settings that help you return useful error messages.

1. Add a WCF serviceDebug behavior with the IncludeExceptionDetailInFaults property set to true. This is especially useful when something fails during the WCF request pipeline and the WCF Data Service couldn’t be initialized.
    <system.serviceModel>
      <behaviors>
        <serviceBehaviors>
          <behavior>
            <!-- To receive exception details in faults for debugging purposes, set the value below to true. Set to false before deployment to avoid disclosing exception information -->
            <serviceDebug includeExceptionDetailInFaults="true"/>
          </behavior>
        </serviceBehaviors>
      </behaviors>
    </system.serviceModel>

2. Set UseVerboseErrors to true in the service configuration
    public class WcfDataService2 : DataService<DataContext>
    {
        // This method is called only once to initialize service-wide policies.
        public static void InitializeService(IDataServiceConfiguration config)
        {
            config.UseVerboseErrors = true;
            config.SetEntitySetAccessRule("*", EntitySetRights.All);
        }
    }

Thursday, October 14, 2010

SisoDb

In my journey through NoSQL (Not Only SQL) land I recently discovered the Simple Structure Oriented Db (SisoDb) project. It follows a completely different approach than most NoSQL database implementations by building on top of your existing SQL Server infrastructure. I’m still sceptical about the performance impact, but I have to agree that re-using all the great infrastructure that SQL Server provides – security, tables for the DBAs, replication, the scheduler, etc. – sounds very tempting.

How does it work?

SisoDb stores your POCO graphs as JSON, which makes it possible to go from a POCO to persistable JSON. For each entity there will be two tables (tables are created on the fly). One holds the Id of the entity and the JSON representation. The second table holds a key-value representation of each indexed property of the entity. If an entity contains complex types, the scalar properties of these custom classes will be stored and indexed too.
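To make that concrete, the on-the-fly tables for a hypothetical Customer entity could look roughly like this (an illustrative sketch, not SisoDb’s actual schema):

```sql
-- Holds the identity and the serialized JSON graph
CREATE TABLE CustomerStructure (
    SisoId uniqueidentifier NOT NULL PRIMARY KEY,
    Json nvarchar(max) NOT NULL
);

-- Holds one row per indexed (scalar) property, used for querying
-- (MemberPath could be e.g. 'Name' or 'Address.City')
CREATE TABLE CustomerIndexes (
    SisoId uniqueidentifier NOT NULL,
    MemberPath nvarchar(128) NOT NULL,
    StringValue nvarchar(max) NULL
);
```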

[image: SisoDb SQL tables]

The project is hosted here: http://code.google.com/p/sisodb/

Remark: the project is still in the prototype phase. When it has evolved further, I’ll probably post some performance tests.

Wednesday, October 13, 2010

TfsRedirect.aspx

If you are using Team Foundation Server together with its SharePoint and Reporting Services integration, you have probably noticed that it uses a special ASPX page, TfsRedirect.aspx, to link to other resources.

So what is this TfsRedirect and what does it do?

When you create a TFS dashboard site in SharePoint, there are several items on this dashboard that point to other locations, potentially on other servers:

  • Team Web Access
  • Process Guidance
  • Reports shown on the dashboard pages

TFS itself knows where to find these assets. And the locations can change if, for example, the TFS administrator moves the reports to a different server. Rather than hard-code the locations of these assets into the dashboard, the TFS team created a web page called TfsRedirect.aspx that knows where these different assets are located and redirects to the right page.

This is what the URL looks like on the Agile dashboard that shows the Burndown graph:

/sites/DefaultCollection/wss/_layouts/TfsRedirect.aspx?tf:Type=Report&tf:ReportName=Dashboards/Burndown&tf:ShowToolbar=0&Width=381pt&Height=180pt

Not easy to understand, as you can see. Luckily I found this post that explains how TfsRedirect works and how you can use it.

Tuesday, October 12, 2010

NHibernate 3.0 Cookbook

NHibernate is a really great ORM solution, but there isn’t a lot of information out there. As far as I know, only 2 NHibernate books were published before. First you had the NHibernate in Action book from Manning, and recently you had the NHibernate beginners guide from Packt. Both are really useful books and a must-read if you start using NHibernate.

So I was really happy last week to read that a new book about NHibernate has been published by www.packtpub.com. The book is called “NHibernate 3.0 Cookbook” and, as it covers NHibernate 3.0, it is really up to date.

Overview of NHibernate 3.0 Cookbook

  • Master the full range of NHibernate features
  • Reduce hours of application development time and get better application architecture and performance
  • Create, maintain, and update your database structure automatically with the help of NHibernate
  • Written and tested for NHibernate 3.0 with input from the development team, distilled into easily accessible concepts and examples
  • Part of Packt's Cookbook series: each recipe is a carefully organized sequence of instructions to complete the task as efficiently as possible

To make this even better, Packt is offering all members of the NHibernate community 20% off the book. To get your exclusive 20% discount when you buy through PacktPub.com, just enter the discount code NHIBCBK20 (case sensitive) in the shopping cart.

Click here to read more about the NHibernate 3.0 Cookbook.

This will be the next one on my reading list…

Monday, October 11, 2010

New release of the Windows Azure Platform Training Kit

Microsoft updated its Windows Azure Platform Training Kit with the latest information. If you don’t know the Azure training kit: it includes a comprehensive set of technical content – hands-on labs, presentations, and demos – designed to help you learn how to use the Windows Azure platform, including Windows Azure, SQL Azure and the Windows Azure AppFabric. This release includes several updates.

Here is what is new in the training kit:

  • Updated all hands-on labs and demo scripts for Visual Studio 2010, the .NET Framework 4, and the Windows Azure Tools for Visual Studio version 1.2 release
  • Added a new hands-on lab titled "Introduction to the AppFabric Access Control Service (September 2010 Labs Release)"
  • Added a new hands-on lab "Debugging Applications in Windows Azure"
  • Added a new hands-on lab "Asynchronous Workload Handling"
  • Added a new exercise to the "Deploying Applications in Windows Azure" hands-on lab to show how to use the new tools to directly deploy from Visual Studio 2010.
  • Added a new exercise to the "Introduction to the AppFabric Service Bus" hands-on lab to show how to connect a WCF Service in IIS 7.5 to the Service Bus
  • Updated the "Introduction to AppFabric Service Bus" hands-on lab based on feedback and split the lab into 2 parts
  • All of the presentations have also been updated and refactored to provide content for a 3 day training workshop.
  • Updated the training kit navigation pages to include a 3 day agenda, a change log, and an improved setup process for hands-on labs.

Link: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=413E88F8-5966-4A83-B309-53B7B77EDF78&displaylang=en

Sunday, October 10, 2010

Comparing Scrum templates for Team Foundation Server

One of the important concepts in Team Foundation Server is Process Templates. Process Templates allow you to easily customize and adapt Team Foundation Server to integrate a development/project management methodology of your choice. Out of the box you have 2 options available:

  1. MSF for Agile Software Development v5.0

  2. MSF for CMMI Process Improvement v5.0

Recently Microsoft added a third template, focusing on Scrum.

Of course there are also a lot of other (free) templates available. But which one is best suited to your needs?

Crispin Parker (one of the guys working on the Scrum for Team System template), created a feature comparison between the Microsoft templates and the Scrum for Team System template. So if you are using Scrum and want to integrate with TFS, check out this post.

NuPack – Ruby Gems for .NET

If you know Ruby, one of the great tools around is RubyGems, a package management system that makes the process of incorporating third-party libraries into your solutions as simple as possible. Compare this to the .NET experience, where you had to search for the correct assemblies and reference them yourself, and the Ruby world looked a lot nicer. Until last week, when Microsoft released NuPack, a package manager for .NET.

NuPack is a free open source package manager that makes it easy for you to find, install, and use .NET libraries in your projects. It works with all .NET project types. NuPack enables developers who maintain open source projects (for example, projects like Moq, NHibernate, Ninject, StructureMap, NUnit, Windsor, RhinoMocks, Elmah, etc) to package up their libraries and register them with an online gallery/catalog that is searchable.  The client-side NuPack tools – which include full Visual Studio integration – make it trivial for any .NET developer who wants to use one of these libraries to easily find and install it within the project they are working on.

NuPack handles dependency management between libraries (for example: library1 depends on library2). It also makes it easy to update (and optionally remove) libraries from your projects later. It supports updating web.config files (if a package needs configuration settings). It also allows packages to add PowerShell scripts to a project (for example: scaffold commands). Importantly, NuPack is transparent and clean – and does not install anything at the system level. Instead it is focused on making it easy to manage libraries you use with your projects.

Remark: The NuPack extensions only work with Visual Studio 2010

How to get started?

Download the latest build. Once you have downloaded it, double-click the NuPack.Tools.vsix file and click Install to add the extension to Visual Studio 2010. Afterwards, restart any Visual Studio instances you have running.


After installation, you’ll find a new Package Manager window under View –> Other windows –> Package Manager Console. This console uses PowerShell cmdlets to access the same features that the GUI does. The specific commands can be found here. The package manager console is the quickest way to update your solution with third-party packages.


Of course, you also have some nice Visual Studio integration. Right click on the References folder in a project and you’ll see this:

[image: Add Package Reference menu option]

Adding a complex package

Let’s test the power of the NuPack engine by adding a complex package like NHibernate. From the Package Manager Console start typing add-package nh<tab> and you’ll see a list of all the available packages. Select NHibernate.Core and press <enter>.


The console reports:

PM> add-package NHibernate.Core
'log4net (= 1.2.1)' not referenced. Retrieving dependency...
Done
'Iesi.Collections (= 1.0)' not referenced. Retrieving dependency...
Done
'Antlr (>= 3.0)' not referenced. Retrieving dependency...
Done
Successfully added 'log4net 1.2.1' to WpfApplication1
Successfully added 'Iesi.Collections 1.0' to WpfApplication1
Successfully added 'Antlr 3.1.1' to WpfApplication1
Successfully added 'NHibernate.Core 2.1.2' to WpfApplication1

What happened? The package manager checked all the required dependencies, validated for each dependency whether it was already included in the project, and added a reference where one was missing. If we look at the references for our project, we see that the required references have been added.


On the file system a new packages folder is created, which contains all the files and a nupkg file for each dependency. This nupkg file contains all the metadata and the binary file itself.
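The layout of that packages folder looks roughly like this (folder names derived from the console output above; the exact structure may differ):

```text
packages/
  NHibernate.Core.2.1.2/
    NHibernate.Core.2.1.2.nupkg
    lib/
      NHibernate.dll
  log4net.1.2.1/
    log4net.1.2.1.nupkg
    lib/
      log4net.dll
  ...
```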


For more information, have a look at the Get started page.

Saturday, October 9, 2010

Windows Azure Application Monitoring Management Pack

Microsoft announced last week the release of the Windows Azure Management Pack. It enables you to monitor the availability and performance of applications that are running on Windows Azure.

A list of the available features:

  • Discovers Windows Azure applications.
  • Provides status of each role instance.
  • Collects and monitors performance information.
  • Collects and monitors Windows events.
  • Collects and monitors the .NET Framework trace messages from each role instance.
  • Grooms performance, event, and the .NET Framework trace data from Windows Azure storage account.
  • Changes the number of role instances via a task.

You can download the Windows Azure Application Monitoring Management Pack here.

.NET Framework libraries only show structure but no code in Reflector

Last week I opened some .NET 4.0 assemblies in Reflector, but instead of showing me the actual disassembled code, it showed me an empty implementation.


I couldn’t understand why it wasn’t working until I realized that I was trying to reflect the reference assemblies located under C:\Program Files\Reference Assemblies\Microsoft\Framework. These are metadata-only files used by Visual Studio 2010 to support multi-targeting.

To fix the problem, you need to load the real assemblies from C:\Windows\Microsoft.NET\Framework\v4.x.x\ instead.


Friday, October 8, 2010

Operation could destabilize the runtime

This Friday, I got the scariest exception ever:

"System.Security.VerificationException: Operation could destabilize the runtime.”

Sounds like all hell could break loose. How did I get this exception?

I was trying out the Entity Framework integration for IBM DB2, and I was loading an entity which had a timestamp column in the database. The issue was caused by the Entity Framework query translator, which couldn’t figure out how this database type should be mapped to a DateTime in code.

A more meaningful exception message would have been nice. It took me a lot of time to figure out the reason for this error.

Thursday, October 7, 2010

I hope some managers will read this…

Jared Richardson published an article about a topic that I recognize all too well: You’re a bad manager, embrace it.

And if you’re reading Daily Dilbert, Dilbert’s manager is the best example.

VM Prep Tool for Visual Studio Lab Management


Before you can start creating virtual test environments in Visual Studio Lab Management, you need virtual machines (VMs) or templates to create these virtual environments. To enable testing, the build-deploy-test workflow and network isolation capabilities on these environments, you need to install the Visual Studio Test Agent, Visual Studio Lab Agent and Team Foundation Build Agent on the VMs and templates that are part of the lab environments.
Microsoft released a tool that automates this process of upgrading/installing Test, Build, and Lab Agents on a Virtual machine and configuring them. It also provides support for automating the process of creating templates with agents installed on them.

Before you can start using it, you’ll have to execute some steps:

Afterwards you can log in on the Virtual Machine you want to configure, browse to the file share and start the VMPreptoolUI.exe.

Wednesday, October 6, 2010

ClickOnce 3.5 and .NET 4.0 problems

Last week we discovered some strange behavior in the ClickOnce implementation for Visual Studio 2010.

We created a ClickOnce setup from the Visual Studio 2010 command line by using

msbuild /target:publish

This created a setup file, an application manifest and all the required files, as you would expect. Important to notice is that it’s a Visual Studio 2010 solution, but we are compiling against the .NET 3.5 framework (Target = .NET 3.5).

If we try to install this ClickOnce application on a system that does not have .NET 4.0 installed everything works.

However when we try to install the application on a .NET 4.0 system, the installation blows up with following error:

* Activation of \\dev\test.application resulted in exception. Following failure messages were detected:

                        + Deployment manifest is not semantically valid.

                        + Deployment manifest is missing <compatibleFrameworks>.

This error tells us that the manifest file is not valid because the <compatibleFrameworks> section is missing from the XML file. This section is required for a .NET 4.0 manifest but NOT for a .NET 3.5 manifest. As this application is a .NET 3.5 application, the installer shouldn’t complain.
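For comparison, a .NET 4.0 deployment manifest contains a section along these lines (values illustrative):

```xml
<compatibleFrameworks xmlns="urn:schemas-microsoft-com:clickonce.v2">
  <framework targetVersion="4.0" profile="Full" supportedRuntime="4.0.30319" />
</compatibleFrameworks>
```

It is exactly this section that the .NET 4.0 installer is demanding from our .NET 3.5 manifest.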

It seems like a .NET 4.0 system incorrectly assumes that it has to install the application by using the .NET 4.0 clickonce system.
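For reference, a .NET 4.0 deployment manifest contains a section along these lines (a sketch; the exact targetVersion, profile, and supportedRuntime values depend on your project, and a .NET 3.5 manifest legitimately omits the whole element):

```xml
<!-- Sketch of the <compatibleFrameworks> element a .NET 4.0 ClickOnce
     deployment manifest contains. A .NET 3.5 manifest does not contain
     it, which is why the error above is bogus for our application. -->
<compatibleFrameworks xmlns="urn:schemas-microsoft-com:clickonce.v2">
  <framework targetVersion="4.0" profile="Full" supportedRuntime="4.0.30319" />
</compatibleFrameworks>
```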

The strange thing is that if we create the build by using the Publish properties inside the Visual Studio 2010 IDE, everything works. This is especially annoying because we use the same command-line functionality on our build server to automatically create our application packages.

Tuesday, October 5, 2010

Report Builder Portal

One of the not-so-well-known tools under the SQL Server umbrella is the SQL Server Reporting Services Report Builder. The Report Builder is a report authoring environment for business users who prefer to work in an environment similar to Microsoft Office.

The tool is really powerful and a welcome addition to Access reporting and Excel. Which developer doesn’t like a tool that lets the end users create those boring reports themselves, so he can focus on writing some ‘real’ code?!
Find all the information on the Report Builder portal. One demo will be enough to convince your end users.
By the way, did I mention that it’s a free tool, available for download here?

Monday, October 4, 2010

How to explain the benefits of Agile to a CFO?

As a strong believer in Agile methodologies, I have had a lot of discussions trying to convince the non-believers. When you talk to a CIO, CEO, or CFO, you of course need a different approach and different arguments than when you are talking to a developer or project manager.

I found this article on InfoQ full of tips, ideas, and arguments to convince the CFO (Chief Financial Officer, read as: the guy with the deep pockets).

A must-read if you want to discuss Agile in financial terms.


Sunday, October 3, 2010

Tracing TFS data

As I spend a lot of time customizing and extending Team Foundation Server, tracing is very useful to see what is sent between the TFS client and server.

How can you enable this?

By using the .NET diagnostics and trace listeners of course! To enable this we’ll have to tweak our Visual Studio configuration a little bit:

  1. Shut down Visual Studio
  2. Browse to your Visual Studio 2010 installation directory. Most of the time this is something like C:\Program Files\Microsoft Visual Studio 10.0
  3. Go to the \Common7\IDE folder
  4. Open devenv.exe.config and add the following diagnostics configuration:
     <system.diagnostics>
       <switches>
         <add name="TeamFoundationSoapProxy" value="4" />
         <add name="VersionControl" value="4" />
       </switches>
       <trace autoflush="true" indentsize="3">
         <listeners>
           <add name="myListener"
                type="Microsoft.TeamFoundation.TeamFoundationTextWriterTraceListener, Microsoft.TeamFoundation.Common, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
                initializeData="c:\logging\tfs.log" />
           <add name="perfListener"
                type="Microsoft.TeamFoundation.Client.PerfTraceListener, Microsoft.TeamFoundation.Client, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
         </listeners>
       </trace>
     </system.diagnostics>
  5. Save the file
  6. Start Visual Studio again


A lot of data will be logged, so you will see a decrease in performance. Therefore, restore your configuration file to its original state when you are done debugging.
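If you want to silence the tracing temporarily instead of removing the whole section, you can lower the switch values (assuming the standard .NET trace switch semantics, where 0 means Off and 4 means Verbose):

```xml
<!-- Same switches as in the configuration above, turned off -->
<switches>
  <add name="TeamFoundationSoapProxy" value="0" />
  <add name="VersionControl" value="0" />
</switches>
```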

Saturday, October 2, 2010

Icon Extractor

Seen a nice icon that you want to re-use in your own application? (Don’t forget the copyright!)

I found IcoFX, a small and easy-to-use tool that allows you to create, extract, and edit icons. It works with Windows XP, Windows Vista, Windows 7, and Macintosh, and best of all it’s free!


Download it here: http://icofx.ro/

Team Foundation Installation Guide for Visual Studio 2010 Update

Microsoft released an updated version of the Team Foundation Installation Guide. So download and use this guide when installing or upgrading Team Foundation Server or any of its components.

Download the guide here: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=2d531219-2c39-4c69-88ef-f5ae6ac18c9f

Friday, October 1, 2010

Don’t panic, ASP.NET security issue is fixed

After a stressful week of workaround deployments, daily checks of the IIS logs, and a lot of praying, the ASP.NET security issue has finally been patched. With the speed of light, Scott Guthrie and his team provided a patch on the Microsoft Download Center.

Yesterday Microsoft also made it possible to update systems through Windows Update (WU) and Windows Server Update Services (WSUS). This makes installing the patch a lot easier.

So before you start the weekend, download it, update your servers and go home. Another crisis solved…

Oh yeah, if you want to reread the whole story, check out Scott Guthrie’s blog: