As the installation of SQL Server 2008 Management Studio can be rather confusing, this Installation Walkthrough by Kevin Van Dyke will help you to get the job done.
Thursday, December 24, 2009
In Visual Studio Team Foundation Server 2010, the Team Web Access site is installed by default. At first things looked fine when I browsed to our company’s TFS website:
But after a few seconds, I realized that something was wrong. The Documents and Reports tabs are missing!
I immediately started checking all the possible causes, but everything looked fine to me. Some searching on the web brought me the answer. On the MSDN forum it is mentioned that the Documents and Reports tabs were removed by design from Web Access 2010. The project team decided that the Project Portal is the best place to work with documents and reports, as it provides better features that weren't available in Web Access.
Tuesday, December 22, 2009
After debugging some unit tests, I couldn't rerun my tests. I got a big warning icon and a very useful message telling me that the tests were not executed. After scrolling through the endless list of test run details, I found the following error message at the bottom:
“Code coverage collection error: The Visual Studio performance and coverage logging engine is already running on the computer. Therefore, the test run cannot continue. Close the pending performance or code coverage session and then rerun the tests.”
The solution is to shut down the VSPerfMon.exe process, which gets left in memory when you debug a unit test and stop the debugger before the test finishes.
More information can be found at Microsoft Connect.
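A quick way to get rid of the orphaned process (assuming it is still running under your account) is to kill it from an elevated command prompt:

```
taskkill /F /IM VSPerfMon.exe
```

After that, the tests should run again without the code coverage error.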
A ScrumMaster is one of the most important roles in the Scrum process. He is the advocate of Scrum inside the organization. He ensures smooth functioning of the team by eradicating impediments and keeping the team shielded from external distractions. But can it be that the Scrum Master becomes the biggest impediment?
On InfoQ, an interesting summary has been written collecting a lot of posts around the possible failure of the ScrumMaster role. I especially think that the multiple hats most Scrum Masters have to wear make it very difficult for them to do their job well. Can you think of some reasons why being a ScrumMaster can be hard or even impossible?
Sunday, December 20, 2009
One of the subjects that always comes up when you start discussing with your database administrator whether or not you should use stored procedures is Query Execution Plans. Damien Guard did a very good developer-focused introduction to query plans.
Go check it out!
If you want to monitor the traffic to a locally hosted web service/website, you can use Fiddler. Fiddler is a Web Debugging Proxy which logs all HTTP(S) traffic between your computer and the Internet. However, Internet Explorer and the .NET Framework are hardcoded not to send requests for localhost through any proxies, so Fiddler, acting as a proxy, will not receive such traffic.
A simple workaround is to use your machine name as the hostname instead of localhost. So, for instance, rather than hitting http://localhost/helloworldservice.svc , instead visit http://machinename/helloworldservice.svc.
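Fiddler also offers special hostnames for exactly this situation (a documented Fiddler feature, but verify it against your Fiddler version), for example:

```
http://ipv4.fiddler/helloworldservice.svc
```

Requests to ipv4.fiddler are routed to localhost through the Fiddler proxy, so the traffic shows up in the capture.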
More info on the Fiddler website.
With Visual Studio 2010, a new way of deploying your web applications becomes available. In Visual Studio you just have to create an MSDeploy package. This package can then be deployed on your IIS 6.0 and 7.0 web server using a command-line tool called msdeploy.exe.
It supports moving configuration, content, SSL certificates and other types of data associated with a web server. You can choose to sync a single site or the entire web server. Flexibility is guaranteed by using a customizable manifest file. You can also skip sites or other objects, or you can perform regular expression replacements during a sync (like changing the home directory on the destination machine).
MSDeploy will make it easy to version your web applications (including allowing you to quickly roll back to previous versions), as well as automatically provision them across multiple servers. It also enables the full automation of deployment tasks (including via both command-line and PowerShell scripting APIs).
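A sync from the command line could look like this (a sketch; the package name and server name are hypothetical, so adjust them to your environment):

```
msdeploy.exe -verb:sync -source:package=MySite.zip -dest:auto,computerName=WEBSERVER01
```

The -verb:sync option tells msdeploy to synchronize the destination with the contents of the source package.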
As MSDeploy sounds rather boring, you'll find the tool branded as the Web Deployment Tool. Next to the MSDeploy package, web deployment is also improved thanks to the following features:
Web.Config Transformation – XML Document Transform (XDT) allows you to transform your development-time web.config file into a production/deployment-time web.config file. The transformation is controlled by web.config transform files named web.debug.config, web.release.config, etc. The naming of these files is tied to the MSBuild configuration you are trying to deploy. The transform file needs only the changes that you really want to make to your deployed web.config.
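For example, a web.release.config that swaps a connection string and strips the debug attribute could look like this (the connection string name and server are made up for illustration):

```xml
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Replace the attributes of the entry named "MyDb" -->
    <add name="MyDb" connectionString="Server=PRODSQL;Database=MyDb;Integrated Security=true"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
  <system.web>
    <!-- Remove the debug attribute for release deployments -->
    <compilation xdt:Transform="RemoveAttributes(debug)" />
  </system.web>
</configuration>
```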
DB Deployment – VS 2010 allows you to deploy your application along with all of its dependencies including database dependencies on SQL Server. Just by providing the connection string of your source database VS10 will automatically script its data/schema and package it for deployment. VS will also allow you to provide custom .sql scripts and also sequence them correctly to run on the server. Once your DB is packaged along with your IIS Settings and web content you can choose to deploy it to any server by providing the connection string at the install time.
1-Click Publish - VS 2010 will allow you to not only package your web applications with all of their dependencies, but also use the IIS remote management service to publish the application to a remote server. VS 2010 will now allow you to create a publish profile of your hoster account or of various testing servers and save your credentials securely, so that going forward you can deploy to any of these publish profiles with just one click using the Web One Click toolbar. With VS 2010 you will also be able to publish using the MSBuild command line, so that you can configure your team build environment to include publishing in a continuous integration model.
Friday, December 18, 2009
Microsoft took a brave and (if you ask me) wise decision. They postponed the launch of Visual Studio 2010 and .NET Framework 4.0 for a few extra weeks based on some negative customer feedback about performance.
As we are using VSTS 2010 Beta 2 and TFS 2010 Beta 2 today, I’ll have to agree that performance is still an issue. Especially our TFS environment gets unpredictable drops in performance. Not the best thing that can happen when you’re trying to get some code released…
I hope that this will buy Microsoft some extra time to fix all these issues, because I'm really loving all the great new features that come with the 2010 wave!
Thursday, December 17, 2009
If you are using Team Foundation Server 2010 today and you’re looking for the very useful TFSAdminUtil command line tool, search no longer, because it’s no longer there. Instead there is a new tool available, the TFSConfig tool. It offers the same features as the TFSAdminUtil and many more…
The ADO.NET Data Services team announced that the “Data Services Update for .NET Framework 3.5 SP1” (formerly known as “ADO.NET Data Services v1.5”) has been released and is available for download. If your target is Windows 7 or Windows Server 2008 R2, you can pick it up here. For all other OS versions, you can get the release from here.
The list of new features is great: projections, data binding, row count, enhanced blob support, and especially the new "Data Service Provider" interfaces look promising!
As you probably know, there is already a LINQ to NHibernate provider available, but its functionality is rather limited and far from complete.
In the meantime, a separate project is going on in which Steve Strong is building a new provider from scratch based on an AST (abstract syntax tree). This provider will offer a lot of extra functionality and features.
Today Steve announced the latest updates on his work, and it looks very promising. I can't wait to start replacing the current provider. Of course, this is all in the trunk, so anyone wanting to play either needs to get the trunk source and build it, or take the much easier option of having Horn do the work. Horn builds the trunk on a daily basis, so look for a package built after around 23:00 GMT on 16/12/2009 (the package URL on Horn has the datetime stamp in it, so it's pretty easy to spot).
Wednesday, December 16, 2009
To get Intellisense to work again, reset Visual Studio settings (Tools | Import and Export Settings):
then go ahead and reset to your preferred default profile.
One of the nice features of the Performance Monitor tool is that you can read performance counters on remote machines. That is of course when it works!
When I tried to monitor the performance counters on a remote computer in Perfmon, I received an error message that resembles the following:
Unable to connect to machine
To resolve this issue, try the following actions.
Action 1: Verify that the Remote Registry service is running on the remote computer
- Click Start, click Run, type cmd, and then click OK.
- At the command prompt, type tasklist /svc, and then press ENTER.
Verify that a Svchost.exe host process is running that has RemoteRegistry in the Services column. If the service is not running, start it as described in the next step.
- At the command prompt, type net start RemoteRegistry, and then press ENTER.
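The check and the start command above can be run together at an elevated command prompt on the remote machine:

```
tasklist /svc | findstr RemoteRegistry
net start RemoteRegistry
```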
Action 2: Verify that you have the required permissions
- You must be a member of the Administrators group on the remote computer.
- If the remote computer is running Windows Server 2003, you must be a member of the Performance Monitor Users group to monitor performance counters.
- If the remote computer is running Windows Server 2003, you must be a member of the Performance Log Users group to log the performance counters. Additionally, you must use the Run As command to configure the logging process to run under the Administrators group or under the Performance Log Users group.
Monday, December 14, 2009
If you like nasty errors, this is one! It means more or less that the .NET Framework itself has crashed. I have not seen this kind of error a lot (luckily). Every time we called a specific service, we succeeded in crashing the .NET Framework. After some investigation, we started to suspect the usage of IEnumerable&lt;T&gt; in one of our services. And finding this MS Connect call confirmed this suspicion.
The fault could be found in the DataContractSerializer class. Microsoft indicated that the DataContractSerializer issue is fixed in .NET 4.0, but there is no hotfix available for .NET 3.5 SP1 at this time.
The workaround they proposed was to place all assemblies that contain types "T" used in contracts that have IEnumerable<T> into the GAC. (In other words, if your contract has IEnumerable<T> elements, then all types T have to be strong-named & in the GAC.)
Why does this work? The bug with DataContractSerializer apparently does not manifest itself when assemblies are loaded as "domain neutral" (shared across all appdomains.) You can force strong-named/GAC'd assemblies to be loaded as "domain neutral" by using the LoaderOptimization attribute. But if you're hosting in IIS, you are automatically getting LoaderOptimization(LoaderOptimization.MultiDomainHost) behavior for your application. If you're not hosting in IIS, this bug doesn't seem to appear at all.
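For a self-hosted (non-IIS) scenario, you could hint the loader yourself; a minimal sketch, with the actual hosting code omitted:

```csharp
using System;

class Program
{
    // Ask the CLR to load assemblies domain-neutral, which is the
    // condition under which the DataContractSerializer bug does not manifest.
    [LoaderOptimization(LoaderOptimization.MultiDomain)]
    static void Main(string[] args)
    {
        // Host your WCF service here...
    }
}
```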
Today I arrived at work noticing that ALL our IIS applications were failing. As this influences not only our test and production environments but also our TFS server, nobody was able to do any kind of work.
After checking the system logs, I saw that a lot of updates were installed last weekend. It seems that the application pools were unable to start after applying KB 973917 on Windows Server 2003 to add support for Extended Protection in Windows Authentication. The root cause of this issue is machines being in an unsupported state where the SP1 version of the IIS binaries exists on an SP2 installation. Product support has released KB 2009746 on how to resolve this issue. The summary of the resolution is to reinstall SP2 on such machines to update all IIS binaries to the SP2 version.
You can get SP2 for Windows Server 2003 from the appropriate link in the article here:
Saturday, December 12, 2009
Together with our test team we’re trying the new Visual Studio 2010 Test and Lab Manager. One of the questions of our lead test manager was how he could remove some useless test cases. As test cases are represented by work items, the answer is not so easy. Work Items by nature can’t be removed and this remains the same in Visual Studio 2010.
Although it’s a tad inconvenient, you can delete work items from TFS by installing the Team Foundation Server Power Tools. Of the many features available as part of the power tools, there is a command called destroywi that can be used to delete work items. For example, to delete work item 1234, use the command:
tfpt destroywi /server:tfs-dev /workitemid:1234
Use this feature at your own risk. Of course, from a user perspective this is not the best solution. So it's better to just delete the test case from the suite by selecting it and pressing the Delete key, or the delete button on the toolbar. You can then edit the test case and move it to 'Closed'.
Visual Studio 2010 supports more platforms and languages out of the box than any previous version of Visual Studio. However, one of Visual Studio’s greatest strengths isn’t in what ships with it, but how it can be extended to meet your individual development needs. Visual Studio 2010 exposes new APIs for building your extension and provides an ecosystem for publishing, sharing, and finding new extensions.
Quan To, a program manager for the Visual Studio Platform Team shows how easy it is to build and publish an extension in VS2010.
NHibernate is a very powerful Object Relational Mapper (ORM). But for a beginning NHibernate user, things seem very complex. One of the difficult parts is the creation of XML mapping definitions by hand, an error-prone and time-consuming task. Visual NHibernate makes it easier and quicker to create and maintain NHibernate projects - even very complex ones. Point it at your existing projects and start modelling them right away - visually.
The current feature list (it’s still in beta) supports:
Visually design and inspect all of NHibernate's mapping scenarios:
- Single entity to single database table
- Single entity to multiple database tables
- Single table to multiple entities
Supports all NHibernate collection types:
Supported mapping types:
Mappings can be:
Visual NHibernate also supports Components.
Download the free trial and start experimenting.
If you want to experiment with the new extensibility options (MEF integration, WiX deployment, WPF support, …) in VS 2010, download the Visual Studio 2010 SDK. Thanks to the removal of all the documentation from the binaries, it’s a much smaller download.
The Visual Studio 2010 SDK includes project templates that help developers create tool windows, menu commands, isolated shell projects, and editor extensions. The editor extensions include text adornments, colorizers, and margins. It also includes build tasks that help developers build and debug extensions. Building and debugging are managed in a second instance of Visual Studio named the experimental instance. The experimental instance provides a test bed for extensions without changing the primary installation of Visual Studio.
At one of my clients, I have the ‘luck’ to work with a DB2 database environment. After upgrading my system to Vista, I saw this heuristic processing error below:
[IBM][CLI Driver][DB2] SQL0998N Error occurred during transaction or
heuristic processing. Reason Code = "16". Subcode = "3-8004D00E".
From previous experiences, I learned it is caused by the distributed transaction coordinator that is always used when you’re opening a DB2 transaction inside a transactionscope. By default in Vista, MSDTC settings are all locked down. A blog post here describes how to use the dcomcnfg command to enable Inbound, Outbound and enable XA Transactions.
Enabling XA transactions and both inbound and outbound connections solved the problem.
Software developers don’t practice enough. Most of our learning takes place on the job, which means that most of our mistakes get made there as well. Other creative professions practice: artists carry a sketchpad, musicians play technical pieces, poets constantly rewrite works. In karate, where the aim is to learn to spar or fight, most of a student’s time is spent learning and refining basic moves. The more formal of these exercises are called kata.
Dave Thomas took the idea of coding practice and made a series of what he calls Code Kata, which are small, thought-provoking exercises that programmers can do in the language of their choice. Each kata emphasizes a specific technique or thought process, providing a concrete flexing of one’s mental muscles.
All kata are available for free on his weblog (http://codekata.pragprog.com/). On the weblog, you’ll also find links to a mailing list and to others’ solutions to the exercises along with discussion about how the problems were solved.
Thursday, December 10, 2009
By default, if documentation is enabled, Visual Studio will give you warnings for each public class, method, field and so on that’s not documented. This is certainly useful for places where you’re writing some complex logic, but for some other places in your code, and for generated code, documentation isn’t always necessary. You can let the compiler ignore certain warnings for specific parts of your code by adding a pragma to your code file.
The code below has a pragma added that tells the compiler not to raise warnings with id 1591 (missing XML comments):
#pragma warning disable 1591
To re-enable this warning after a code-block, you can add following pragma:
#pragma warning restore 1591
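Put together, wrapping a block of generated code might look like this (the class itself is a made-up example):

```csharp
#pragma warning disable 1591
// Generated code: no "missing XML comment" warnings will be raised here.
public class GeneratedDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}
#pragma warning restore 1591
```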
One of the great new features in Visual Studio 2010 is "Lab Management". It enables you to automate the setup and configuration of test environments, saving you a bunch of time doing it every time you have a new build you want to test.
Installing and configuring it is not very easy. Microsoft did a lot of work to make it much easier to get up and going, but you will still have to invest time and hardware to get everything up and running. The Lab Management team did a blog series to help you:
Part 3 – Configuration continued: http://blogs.msdn.com/lab_management/archive/2009/11/20/getting-started-with-lab-management-part-3.aspx
Part 4 - End to end workflow: http://blogs.msdn.com/lab_management/archive/2009/11/23/getting-started-with-lab-management-part-4.aspx
If you care about testing and you try to use practices like TDD, tooling support can help a lot. One of the tools that’s out there is FitNesse.
FitNesse is a tool built on top of the Framework for Integrated Testing (FIT), an acceptance testing framework originally developed for Java by Ward Cunningham. One of the central ideas of FIT was to promote collaboration and allow customers and business analysts to write and verify tests. FIT makes it easy to run tests, but does not provide a way to create them. The original idea was to write tests in Word, Excel, or any tool that can output HTML.
FitNesse is a web wiki front-end to FIT developed by Robert Martin and Micah Martin from ObjectMentor. If you are interested in starting to use it, certainly check out the free FitNesse ebook.
Saturday, December 5, 2009
ClickOnce deployment enables you to deploy self-updating Windows and console applications that can be installed, updated, and run from a Web site. For more information, see ClickOnce Security and Deployment. In Visual Studio 2010, there are some small but useful improvements.
Starting in Visual Studio 2010, you can target .NET Framework 4 or multiple versions of the .NET Framework in your ClickOnce deployment. You can also troubleshoot installation issues by using enhanced logging and you can create a custom installer.
For Office solution developers, there are additional ClickOnce enhancements, such as deploying multiple Office solutions in a single ClickOnce installer and performing additional actions after the ClickOnce installer is finished.
For WPF XAML browser applications (XBAPs), you can request elevation of privileges with ClickOnce. For more information, see WPF XAML Browser Applications Overview.
I especially like the custom installer option. Now you can implement custom user experience during installation, including custom dialog boxes for security and maintenance operations.
When you try to call a web service that uses a self-signed certificate from a client application you get the following error:
The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.
This is because your system marks the certificate as invalid. There are three tests that must pass before a certificate is marked as valid:
- The certificate must be issued by a trusted certification authority.
- The certificate must not be expired.
- The hostname must match the certificate subject.
If one of those three tests fails, the certificate is marked as invalid. When you are just testing your application, you can make your client proxy ignore these tests and just call the service.
To do so, create a class file that contains the following code:
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;

class Certificates
{
    public static bool ValidateRemoteCertificate(object sender, X509Certificate certificate, X509Chain chain, SslPolicyErrors policyErrors)
    {
        // Return true to force the certificate to be accepted.
        // Needed so that calling web services with self-signed certs will work.
        return true;
    }
}
Afterwards, add the following lines to the Application_BeginRequest event in the global.asax file, or the OnInit event of any page that needs to call a web service that uses a self-signed certificate.
using System;
using System.Net;
using System.Net.Security;

protected void Application_BeginRequest(Object sender, EventArgs e)
{
    ServicePointManager.ServerCertificateValidationCallback = new RemoteCertificateValidationCallback(Certificates.ValidateRemoteCertificate);
}
Of course, ignoring the certificate errors within the code could open up a security risk depending on what it is being used for.
Last week, I was installing a web application on our 64-bit web server. After deployment, I got the following error when trying to load the website:
Could not load file or assembly 'Name' or one of its dependencies. An attempt was made to load a program with an incorrect format.
This error occurs when you have the following settings:
- IIS running on a 64-Bit Operating System
- The assembly DLL in question has been compiled for 32-bit - check the project properties under the Build tab; if under Platform target you have x86, then it is 32-bit only.
By default, a 64-bit IIS machine uses an application pool that does not allow a 32-bit DLL to be processed. You have to enable 32-bit support for the application pool yourself. To do so, open IIS Manager, find the application pool for the site and select the Advanced Settings.
Notice that there is an option called "Enable 32-Bit Applications". Enable this and you should be good to go.
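On IIS 7 you can also flip this setting from the command line with appcmd (the application pool name here is just an example):

```
%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /enable32BitAppOnWin64:true
```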
Wednesday, December 2, 2009
The TFS Power Tools finally made it into a 2010 version. Brian Harry wrote about them a couple of weeks ago: http://blogs.msdn.com/bharry/archive/2009/11/18/tfs-2010-power-tools-coming-soon.aspx. As most of the power tool features moved into the final product, there aren’t a lot of feature enhancements.
Here are the links:
1. TFS MSSCCI Provider: http://visualstudiogallery.msdn.microsoft.com/en-us/f959ea32-5ac3-424a-a709-5001a158ebe8
If your SQL Server transaction logs are exploding, you can shrink them using the following commands.
First check if there are any pending transactions:
DBCC OPENTRAN(&lt;DatabaseName&gt;)
If there are no pending transactions, you can safely execute the following commands:
USE &lt;DatabaseName&gt;
GO
DBCC SHRINKFILE(&lt;TransactionLogName&gt;, 1)
GO
BACKUP LOG &lt;DatabaseName&gt; WITH TRUNCATE_ONLY
GO
DBCC SHRINKFILE(&lt;TransactionLogName&gt;, 1)
Configuring a new laptop comes with a lot of work. You have to install all the software you are used to, apply all the updates, change the configuration and so on. So after installing Visual Studio 2008, I continued with the SP1 upgrade. But after the installation was done, I got the following error when I tried to open a work item in Team Explorer:
Could not load type 'Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemTypeDeniedOrNotExistException' from assembly 'Microsoft.TeamFoundation.WorkItemTracking.Client, Version=9.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'.
As the TFS administrator inside my organization, I had seen this error mentioned before, so I knew how to solve it.
Running the VS2008 SP1 install for a second time will patch Team Explorer up to the required level.
I think we all have written bad code before. And of course, we all want to rewrite this code if we could have a second chance. But most of the time, this is not possible. So you have to deal with the fact that this bad code is still there (and hunting you down!). A horrible idea that keeps you awake at night.
But from today on, you no longer need to have sleepless nights, as you can do something simple and easy: buy off your guilt!
Go to codeoffsets.com, buy Bad Code Offsets and free your mind from this trouble. Your money will be well spent on supporting open-source projects, to the greater benefit of everyone. This is not a joke!
In IIS 6.0, creating a self-signed certificate was not available out of the box. You had to use tools like SelfSSL.exe to get the job done. In IIS 7.0, this feature is available by default and very easy to use, as you can see in the Tip/Trick: Enabling SSL on IIS 7.0 Using Self-Signed Certificates blog post by Scott Guthrie.
After configuring bindings in IIS 7.0, I had to do the same thing in IIS 6.0. In IIS 6.0 it’s a little funkier than just clicking on the right action link.
To configure SSL host headers in IIS 6.0, click Start, click Run, type cmd in the Open box, and then click OK.
Type the following command at the command prompt:
cscript.exe %SystemDrive%\inetpub\AdminScripts\adsutil.vbs set /w3svc/&lt;site identifier&gt;/SecureBindings ":443:&lt;host header&gt;"
where site identifier is the unique id that every website gets in IIS (if you have only one site, it’s 1) and host header is the host header for the Web site, for example, www.microsoft.com.
Tuesday, December 1, 2009
Yesterday, version 3 of the Scrum for Team System template was released. This version works with Visual Studio 2010 Beta 2 and brings a lot of new features and possibilities to the Scrum template.
The linked zip files contain the latest version of the Scrum for Team System process template and client side tools.
I especially like the Scrum Master’s workbench.
Sunday, November 29, 2009
My new laptop is a Dell Latitude E6500. As I was installing some software, I noticed that the touchpad was very slow. Even after changing the mouse speed to the maximum, I still had to do multiple sweeps to get the cursor from one side of the screen to the other. It was really annoying me, so I started to look for a solution.
On the Dell website under the Input Device Drivers section, I found a driver update for the Touchpad. This replaces the default mouse driver with a touchpad specific driver. It makes working with the touchpad a lot smoother and now it’s also possible to disable the touchpad when an external pointer device is attached.
Thank you, Dell!
Friday, November 27, 2009
After a first small update some weeks ago, the NHibernate team released a second update for NHibernate 2.1 last weekend.
NHibernate 2.1.2 can be downloaded here. The list of changes is rather small:
* [NH-2011] - Many-to-many inside a component will not be saved when using SaveOrUpdateCopy or Merge
* [NH-2022] - Allow overriding in Query By Example
* [NH-2007] - SesssionIdLoggingContext patch for big resultsets
* [NH-2019] - Clarification about the use of <import> for polymorphic queries
By default, the TFS build server will execute a full build. This means that the existing workspace is cleaned up and all the source code is loaded from the TFS server. When your network connection is slow or when you have a lot of code on your build server, this can take a lot of time. To optimize this experience you can choose to execute an incremental build instead of a full build.
To specify an incremental build, you must set the SkipClean, SkipInitializeWorkspace, and ForceGet properties in the TFSBuild.proj file.
Each of these properties configures a step to enable the incremental build experience:
- When Team Foundation Build performs a full build, it cleans the files in the intermediate build folder and the sources folder in the Clean target. In an incremental build this target must be skipped, because these files must be intact for the next build. To achieve this, set the property SkipClean to true.
- As part of a full build, Team Foundation Build deletes the old workspace and recreates a new one to clean and synchronize the source files. In an incremental build, this target must be skipped as well. To achieve this, set the property SkipInitializeWorkspace to true.
- In the Get task, Team Foundation Build by default retrieves new source files and overwrites the existing files in the build workspace. To retrieve only the changed files, set the property ForceGet to false.
To set all of these properties, add a PropertyGroup definition to the end of the TFSBuild.proj file, before the closing &lt;/Project&gt; tag.
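A minimal sketch of that PropertyGroup, using exactly the three properties described above:

```xml
<PropertyGroup>
  <!-- Keep the intermediate build and sources folders intact -->
  <SkipClean>true</SkipClean>
  <!-- Reuse the existing build workspace instead of recreating it -->
  <SkipInitializeWorkspace>true</SkipInitializeWorkspace>
  <!-- Only retrieve changed files on Get -->
  <ForceGet>false</ForceGet>
</PropertyGroup>
```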
Jens Andexer, Executive IT Architect @ IBM, and Willem Bekker, Head of Architecture Initiatives @ Standard Bank, talk about Service Oriented Architectures and their characteristics. They focus not only on the good qualities of this style of architecture, but also on the bad and even the ugly parts. If you want to look beyond the hype and make an informed decision, then this article is a must-read!
A final version is now available of the Microsoft Patterns & Practices Application Architecture Guide (second edition). It contains the architectural recommendations of the P&P team for the Microsoft application platform.
Here are the relevant links:
- HTML version of Microsoft Application Architecture Guide, second edition.
- PDF version of the Microsoft Application Architecture Guide, second edition.
- Printed version of the Microsoft Application Architecture Guide, second edition.
- Knowledge Base for the Microsoft Application Architecture Guide, second edition.
Today I was configuring my new laptop. It was filled with a lot of software out of the box, but like almost every IT guy, I’m not that happy with the default configuration. So I’m installing a dual boot with Windows Server 2008 R2 and Hyper-V. Hosting my VPCs using full PC power is a tempting idea.
The first thing I had to do was repartition my hard disk. Being used to working with Windows XP, I followed this Repartition your hard disk on-the-fly with Windows Vista article to guide me through the different steps.
Wednesday, November 25, 2009
If your config files start growing too big, you can split them up into smaller pieces by using the configSource attribute.
For example, if you want to extract your WCF services configuration to a dedicated configuration file, place the following in your web.config:
<system.serviceModel configSource="services.config" />
Then create a separate services.config file with the following structure:
<system.serviceModel>
  <!-- Place your service settings here -->
</system.serviceModel>
Make sure you use .config as the extension of your files so they cannot be served to the browser. Avoid .xml, for example, or your files could be available to prying eyes.
Although you can connect to a TFS 2010 server with your existing Visual Studio 2008 (SP1) environment, most features are not available to you. To improve your experience, you can install the Visual Studio Team System 2008 Service Pack 1 Forward Compatibility Update for Team Foundation Server 2010. This is an update for the 2008 SP1 version of Visual Studio Team System Team Explorer that allows the 2008 SP1 version to work with Team Foundation Server 2010. This update will allow teams to move forward and use the Team Foundation Server 2010 server even if part of the team continues to use the Team Explorer 2008 SP1 client.
Before you start installing big (and expensive) tools to trace performance issues, it's a good idea to first have a look at Performance Monitor, a.k.a. PerfMon. It's an amazing tool that is freely available on every Windows machine yet goes unused far too often. Through a lot of different performance counters you can monitor various aspects of your system. To open PerfMon, just go to the Start Menu, choose Run and type perfmon.
If you want to find out how the system is performing, the following counters give you a good overview of the general activity of the system.
- Processor utilization
- Processor\% Processor Time\_Total - how 'loaded' the CPU is at any given time. Don't confuse 100% processor utilization with a slow system though - the System\Processor Queue Length counter is much better at determining this.
- Memory utilization
- Process\Working Set\_Total (or per specific process) - this basically shows how much memory is in the working set, or currently allocated RAM.
- Memory\Available MBytes - amount of free RAM available to be used by new processes.
- Disk Utilization
- PhysicalDisk\Bytes/sec\_Total (or per disk instance) - shows the number of bytes per second being written to or read from the disk.
- Network Utilization
- Network Interface\Bytes Total/Sec\nic name - Measures the number of bytes sent or received.
You can find more info about the usage of the performance monitor and some other useful performance counters in this Windows perfmon- top 10 counters post.
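If you prefer the command line over the PerfMon UI, the typeperf tool that ships with Windows can sample the same counters. A minimal sketch (counter paths use the \Object(Instance)\Counter syntax; the sample count of 10 is arbitrary):

```
typeperf "\Processor(_Total)\% Processor Time" "\Memory\Available MBytes" "\PhysicalDisk(_Total)\Disk Bytes/sec" -sc 10
```

This prints one comma-separated sample per second, which makes it easy to redirect the output to a CSV file for later analysis.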
Monday, November 23, 2009
Today traffic was killing me. So I tried to get the best out of it and started to listen to some podcasts. One very interesting one was a panel discussion on DotNetRocks about “Is software development too complex?”
I was wondering if the question shouldn’t be asked the other way around. Aren’t we making software development too complex?
I cannot count the times I have seen a complex SOA architecture where a simple 2 tier application would suffice. Aren’t we all developers who like to try out the latest fun stuff even if it is not appropriate or a real business requirement for the current project? As most IT people have a technical background, we seem to focus mostly on technical details, sometimes forgetting what the customer actually needs.
As I have fallen in this trap before, I’m always careful and thinking twice before I introduce a new tool/technology on my projects. (Speaking about new tools, I’m loving the Spark view engine ;-))
If you want to know the latest buzzwords in town as an architect, check out the following article: 10 Must-Know Topics For Software Architects In 2009
After the installation of TFS 2010, I was looking through all the parts to check if the installation was successful.
When I browsed to the report manager website, I saw something strange: only the header was visible. As I had this problem before I knew it had something to do with the security settings. So I opened up SQL Server Management Studio, connected to the reporting services server and checked the security settings. But everything seemed OK.
After some research I found out that even if you are logged on with administrative credentials, you might have trouble accessing Report Manager or the http://localhost/Reports site on a computer that is running Windows Server 2008 or Windows Vista. You might need to add those sites as Trusted Sites in Internet Explorer or start Internet Explorer as an administrator. To start Internet Explorer as an administrator, click Start, click All Programs, right-click Internet Explorer, and then click Run as administrator.
More info here.
If you have created a simple WCF service “HelloWorld.svc” and you load it up inside Visual Studio, you get a result like this if metadata publishing is enabled.
The host name “mycomputer.private.mydomain.com” is automatically picked up by WCF. Of course, if this service is hosted on a public server, you don’t want to expose the server name to the consumers of your service. In a real production environment, you want to use a public host name or even an IP address in the address.
Therefore open up the IIS manager, select the website of your choice and click on the Bindings link in the Actions part.
In the window that shows up, add the hostname you want to use (for example www.microsoft.com). The first time I did this, I got the following exception:
This collection already contains an address with scheme http. There can be at most one address per scheme in this collection.
Parameter name: item
This error was caused by the fact that I now had two addresses pointing to the same scheme. So instead of adding a second hostname, change the hostname on the existing http binding. This should do the trick…
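If you'd rather script this change than click through the IIS Manager UI, the same edit can be made with appcmd. A sketch, assuming the default site name (adjust the site name and hostname for your environment):

```
%windir%\system32\inetsrv\appcmd set site /site.name:"Default Web Site" /bindings:http/*:80:www.microsoft.com
```

Note that /bindings replaces the site's complete bindings collection, which conveniently matches the advice above: you end up with a single http binding carrying the new hostname.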
Sunday, November 22, 2009
Since I spend a lot of time inside MSBuild scripts optimizing our continuous integration processes, custom build tasks are very handy to simplify common actions.
Creating your own build task is easy, if you’re interested take a look at How To: Write a Task. Of course, you’re probably not the first one who creates a specific task. Before you start creating your own, check out the MSBuild Extension Pack and the MSBuild Community Tasks.
After completing the Certified Scrum Master training by Mitch Lacey at Ordina, I received an email last week from the Scrum Alliance confirming my certification. I strongly believe that Scrum principles like openness, respect and commitment can benefit any (software) project. Now to convince the rest of the world…
Thursday, November 19, 2009
More info about all the new features here.
Tuesday, November 17, 2009
As we were deploying a WCF service via web setup projects on Windows Server 2008 with IIS 7.0, the setup failed with the installer saying
"The installer was interrupted before [my app] could be installed. You need to restart the installer to try again."
Of course after restarting the installer, the exact same error occurred again. :-)
It seems that the MSI package is using the Metabase feature of IIS6. As this is not installed by default on an IIS7 machine, you get an error. Installing the IIS 6 Metabase Compatibility Component solves the issue. (it's situated under Role Services in the Server Manager)
Today I was configuring a clean Windows Server 2008 system. When I deployed a WCF Service on IIS 7, I got the following message:
Server Error in Application “Default Web Site/...”
HTTP Error 404.3 - Not Found
The page you are requesting cannot be served because of the extension configuration. If the page is a script, add a handler. If the file should be downloaded, add a MIME map.
The exception explains itself. IIS couldn’t find a correct handler for the *.svc extension for our WCF service.
Execute the following steps to make it work:
- Start the command window (cmd) as an Administrator.
- Navigate to:
C:\windows\Microsoft.Net\Framework\v3.0\Windows Communication Foundation\
- Run the following command: ServiceModelReg -i
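Putting the steps above together in a single command prompt session (the path assumes a default install; on a 64-bit machine the folder is Framework64 instead of Framework):

```
cd "C:\Windows\Microsoft.NET\Framework\v3.0\Windows Communication Foundation"
ServiceModelReg -i
```

This registers the *.svc handler with IIS so that WCF services can be served.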
Today, I was installing SQL Server 2008 on a Windows Server 2008 R2 server. The setup detected that .NET 3.5 was missing and asked me to install this prerequisite. But after clicking OK, the setup prompted the error message "You must use the Role Management Tool to install or configure Microsoft .NET Framework 3.5". As the message suggests, the fix is to add the .NET Framework 3.5.1 feature through Server Manager (Features > Add Features) instead of running the standalone installer.
Some things I noticed when installing the TFS 2010 Build environment for my company.
First of all, your existing TFS 2008 build agents will not work with a TFS 2010 server. You’ll need to install the TFS 2010 build service. It’s included in the same setup as the TFS application tier, and preferably you set it up on a dedicated build machine. But you don’t need an extra build server: you can install the TFS 2010 Build Service on your existing TFS 2008 build machine. Note that they both default to the same port (9191), so if they run side by side you’ll have to configure one of them to use a different port.
- If you install the TFS 2010 build service on a clean machine, it will install the .NET 4.0 Framework which includes MSBuild. It will also install the components necessary to support the following:
- Code Analysis
- Unit test execution with MSTest
- Architectural layer validation
- Database schema deployment and test data generation
- Although the build definitions in TFS 2010 only use MSBuild for source code compilation and use Windows Workflow to orchestrate everything else that happens during the build process, you don’t have to recreate all of your existing build definitions. TFS 2010 includes a special Upgrade build process template that mimics the behavior of TFS 2008 builds by invoking MSBuild on your existing TFSBuild.proj file.
So I have to admit, Microsoft did a great job supporting my existing build scripts.
Monday, November 16, 2009
After the Mexican flu, another international phenomenon is coming to Belgium. Visug succeeded in getting Scott Guthrie, Microsoft Corporate Vice President and .NET guru, live in Belgium! Scott will be speaking about Visual Studio 2010 and .NET 4, as well as about the latest on web development with ASP.NET 4.0 and Silverlight.
Got burning questions for Scott? This is your chance! After the talks he will be taking the time to answer questions from the audience. E-mail your questions upfront to email@example.com or ask them directly during the Q&A time.
December 4th – 13:30 to 17:00
· Visual Studio 2010
· .NET 4 and Web development
· ASP.NET 4
One of the great things that Microsoft announced during the TechEd 2009 conference in Berlin was the acquisition of the Teamprise Client Suite. With this step, heterogeneous development on the Team Foundation platform became once again easier. Free upgrades will be provided for all customers who are using Teamprise today.
Teamprise Client Suite consists of 3 components:
- Eclipse plugin - Allows developers to perform all of their source control, bug tracking, build, and reporting operations from within Eclipse and Eclipse-based IDEs, such as Rational Application Developer, JBoss, BEA Workshop, and Adobe Flex Builder. It integrates into the menu system of Eclipse as a standard Team Provider plug-in, but also provides developers with specific views and forms for interacting with the Team Foundation Server. Developers using the Teamprise Plug-in for Eclipse have the ability to take part in the entire software development process in use by their organization without leaving the comforts of their development environment.
- Stand-alone Explorer - Combines all of the functionality available to Eclipse developers using the Teamprise Plug-in into a stand-alone, cross-platform GUI application for team members working outside of a development IDE. Perform source control operations, browse the Team Foundation Server repository, edit bug reports, run work item queries, monitor builds, and view project reports all from within an application that has a native look and feel on the operating system you are using.
- Command line client - Provides a cross-platform, non-graphical interface to Microsoft's Team Foundation Server, for scripting and build scenarios or for developers who prefer a command-line interface. The command line interface is compatible with the current Microsoft supplied command line interface so scripts are interchangeable.
All of these components work on Windows, Mac, Linux and several flavors of Unix. In addition work is being done to explore providing these capabilities in mainframe environments to enable access to the Visual Studio ALM platform from there as well.
Sunday, November 15, 2009
As I was riding home last Friday from TechEd Berlin with my colleague Gill, we were wondering why some of our developers pursue excellence relentlessly while others hit the door at 5:01 and don’t think about software development until tomorrow morning.
For years I hoped that there must be some motivator, some technique, or some dynamic that would positively influence people to simply care. But even providing incentives like salary, wonderful co-workers, free pizza and so on doesn’t seem to motivate everyone to care about excellence. In contrast, I see people who don’t get these incentives but still work very hard. So there must be something else that causes these people to ponder, read, and learn.
And this ‘something else’ is Passion. Loving software development, seeing software development as a craft, continuously looking for ways to grow as a developer are indicators that this passion lives inside you.
This leads me to one conclusion: incentives will get you behavior and results, but can’t create passion. That is something that is simply innately there or not there.
Data Access is one of those problems that should have been solved in software development a long time ago. Getting data out of and putting data into a database was done 10 years ago and is still done today. Yet we keep struggling to write code that talks to our databases. Object relational mappers (ORMs) are one step closer to writing more business-oriented code and less infrastructure. But as an ORM is only a tool and not the solution, Jarod Ferguson posted the following tips to help you implement ORM right:
Jarod continued his Data Access Tips post with a more detailed explanation about DTO's, DDD and the Anemic Domain Model.
One of the dangers of mockups is that they can become too realistic. Customers tend to believe that the application is “almost done” when you’re showing a realistic image of it. The same problem applies to developers, who think that requirements are set in stone when you give them this kind of image. Therefore, don’t make your mockups too realistic. It must be clear that it’s only a temporary design and not a final version.
Two tools that can help you create such an “unrealistic” look:
Saturday, November 14, 2009
As an architect and developer, I'm always looking for better ways to communicate with the business. Discussing specifications can be a challenging task, especially if you end up with hundreds of pages of use cases, user stories and so on. It made me wonder if text is always the best medium to capture specs.
How do I think about it today? As a rule of thumb, I think that things that don't get communicated well in text shouldn't be forced into a text medium. Therefore I'm advocating alternatives like whiteboard designs, mockups,... Why write a 10-page summary describing all the fields on a screen if one image gets you to the same conclusions? After all, it's about using the right medium in the right context.
Two weeks ago the NHibernate team released NHibernate 2.1.1, an update to NHibernate 2.1. As there wasn't much noise about it, I only saw it last week.
One very interesting feature has been added: DetachedCriteria is finally supported on the IStatelessSession. Simple CRUD was never so easy...
If you're interested in the list with all changes, check out the release notes.
As the release of Entity Framework 4.0 is getting closer, it's time to have a look at the competition. After the release of NHibernate 2.1(.1), the NHibernate team is working on the next big release based on the .NET 3.5 framework.
A selection of some of the upcoming features in NHibernate 3.0:
- QueryOver: Allows the usage of ICriteria in a type-safe way.
IList<Student> students = session.QueryOver<Student>()
    .Where(s => s.Name == "Fabio")
    .And(s => s.StudentNumber > 100)
    .List();
- New LINQ provider: A new LINQ provider is coming up, fully based on an AST. It will replace the current NHibernate.Linq 1.0 project.
- Strongly typed configuration for multiple parts of NHibernate (caching, session factory, ...)
- Better WCF integration for session management
I think we'll have a very interesting future in ORM land...
Monday, November 9, 2009
I started our first day at TechEd Berlin with a level 300 session about Entity Framework. A lot of great features were shown:
- N-Tier support(Self-tracking entities)
- Persistence Ignorance
- Code only
- Better design experience
- Model first
- T4 Template generation
- Lazy loading
- Better support for stored procedures
- Better support for SQL functions
Although I'm still a bit sceptical, I guess that this time it will be "usable" ;-)
As TechEd is not only a range of (hopefully) great sessions but also a place to meet people and do some networking, I'll be doing a TechEd TechTalk tomorrow together with my colleague Gill Cleeren. I will ask him very tough and tricky questions about building business applications in Silverlight.
And as everything is recorded, the video will become available on the TechEd website. I'll post the link when it becomes available.
Thursday, November 5, 2009
“D:\Builds\*\BuildType\TFSBuild.proj” (CompileSolution target) (1:5) ->
(CoreCompileSolution target) ->
C:\Program Files\MSBuild\Microsoft\VisualStudio\TeamBuild\Microsoft.TeamFoundation.Build.targets(978,5): error MSB3491: Could not write lines to file “D:\Builds\*\*.sln.Release.vsprops”. Could not find a part of the path ‘D:\Builds\*\*.sln.Release.vsprops’.
Turns out that some solutions were moved in the source control repository. After correcting the file locations, everything was up and running again.
A more descriptive error message could have been useful.
I cannot agree more with the statements mentioned in this post.
Thursday, October 29, 2009
People keep posting their best practices about ASP.NET MVC usage. Simone Chiaretta made a presentation about the stuff he finds important when using ASP.NET MVC.
Maybe it's time that I post my own list of best practices? To be continued...
As a frequent user of NHibernate, I find it a little bit disappointing that there's only one book out there, although it's a very good one.
Phil Haack wrote a blog post about quickly spawning a web server against a local folder to preview a web application. It adds a right-click menu option to start a web server pointing to any folder.
Today I had to configure an MSBuild script to read some data from a file. At first I was looking for a custom task to achieve this goal. But I found that this functionality is available out of the box in MSBuild.
You can read from a file into a variable using:
<ReadLinesFromFile File="c:\ReleaseNumber.txt">
  <Output TaskParameter="Lines" ItemName="ReleaseNumber" />
</ReadLinesFromFile>
This will store the contents of the file "ReleaseNumber.txt" in an item list "ReleaseNumber" (note the ItemName attribute: the output is an item list, not a property). You can then use this item in other places inside your build file:
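For example, a minimal sketch of a target that reads the file and uses the resulting @(ReleaseNumber) value (the target name and message text are made up for illustration):

```xml
<Target Name="ShowReleaseNumber">
  <ReadLinesFromFile File="c:\ReleaseNumber.txt">
    <Output TaskParameter="Lines" ItemName="ReleaseNumber" />
  </ReadLinesFromFile>
  <!-- @(ReleaseNumber) expands to the line(s) read from the file -->
  <Message Text="Building release @(ReleaseNumber)" />
</Target>
```

If the file contains multiple lines, @(ReleaseNumber) expands to all of them separated by semicolons, as usual for MSBuild item lists.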
You can also write to files:
<WriteLinesToFile File="Log.txt" Lines="This is a log value." Overwrite="false" />
Saturday, October 24, 2009
Type safety is important. Isn’t it one of the reasons why you use a statically typed language like C#? But if you’re using the INotifyPropertyChanged interface, you have already left that type safety behind. Why? Because it expects you to pass in the property name, and the easiest way to do this is with a plain string parameter.
public event PropertyChangedEventHandler PropertyChanged;

private void NotifyPropertyChanged(String propertyName)
{
    if (PropertyChanged != null)
    {
        PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
    }
}
But what if you refactor your code and change the name of the property? The string value will remain the same and the compiler will not warn you. But your code will no longer work.
A solution could be to use an expression to pass on the property name in a type-safe way.
protected void NotifyPropertyChanged(Expression<Func<T, object>> x)
{
    var body = x.Body as MemberExpression;
    string propertyName = body.Member.Name;

    if (this.PropertyChanged != null)
    {
        this.PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
    }
}
There is, however, an alternative that works on previous versions of .NET and doesn’t involve expression trees. It essentially involves creating a delegate of the target method and using the delegate properties to get to the corresponding MethodInfo, as mentioned on Daniel Cazzulino’s blog. I haven’t tried it out yet, but as soon as I’ve created an example, I’ll update this post.
For one of my current clients, I’ll have to start looking at the Microsoft Visual Studio Tools for the Microsoft Office System (VSTO). I used to work with macros and the COM APIs for Office before. It was not really my best experience, so I was not too eager to start using VSTO.
How wrong could I be? It’s amazing how well these tools are integrated into the Office system. I’m really starting to like it. Some invaluable resources that helped me a lot:
Microsoft plans to release significantly updated and improved versions of both the Winforms and ASP.NET Report Viewer controls in Visual Studio 2010. This is going to include the long awaited local mode support for the new report processing engine, originally released with SQL Server Reporting Services 2008. Most importantly, this provides RDL 2008 features (e.g. tablix, enhanced charts, gauge, rich text) in local mode without connection to a report server. If you wanted to use RDL 2008 features with the report viewer controls available before, server mode was your only viable option, because report processing is performed remotely on a Reporting Services 2008 server.
Below are some of the other new features and changes that are coming as well.
- Support for the 2008 RDL schema in local mode. This will give you all of the new features available in RDL in SQL Server 2008, including tablix, rich text, updated chart visualizations, gauge, and many others. The updated report design surface for local mode will also be included.
- Support for ASP.NET AJAX. The report viewer will use AJAX to update its various regions (report, toolbar, etc.). You will also be able to include the entire ReportViewer control in an UpdatePanel.
- Significantly improved browser compatibility. A huge amount of effort has been put into improving support across browsers.
- Usability and “look and feel” enhancements. The viewer gets a minor facelift.
Everything becomes obsolete sooner or later. The same goes for programming languages. The lifecycle of a programming language can be divided into 7 phases:
Scott Allen is starting an interesting discussion about which phase C# is in today.
Can things only get worse after C# 4.0? Or will there still be some interesting evolutions for the C# programming language? Maybe the dynamic keyword is just a sign for what will be next…