
Posts

Showing posts from June, 2011

Exclude files from the JSLint validation process

JSLint is a must-have tool for every web developer. It’s a JavaScript program that looks for problems in your JavaScript code. JSLint uses a professional subset of JavaScript, a stricter language than that defined by the Third Edition of the ECMAScript Programming Language Standard. The subset is related to recommendations found in Code Conventions for the JavaScript Programming Language. The first time I ran it on my brand new ASP.NET MVC application, I was confronted with a lot of errors, most of them related to external plugins and the jQuery library. As I am not planning to fix stuff in jQuery, I needed a way to tell JSLint to ignore some files. Turns out that this is really easy…
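The actual configuration isn’t shown in this excerpt, but the general idea is to keep third-party scripts out of the set of files you lint. As a rough sketch of that idea (the ignore patterns and file names below are made up for illustration, not the post’s actual settings), a Node-based lint wrapper could filter its input like this:

```javascript
// Sketch: filter out files you don't want JSLint to check.
// The ignore patterns below are examples, not the post's actual settings.
var ignorePatterns = [/jquery/i, /\.min\.js$/i, /plugins\//i];

function shouldLint(file) {
  return !ignorePatterns.some(function (pattern) {
    return pattern.test(file);
  });
}

var files = [
  'Scripts/app.js',
  'Scripts/jquery-1.6.1.js',
  'Scripts/site.min.js',
  'Scripts/plugins/carousel.js'
];

// Only the files that survive the filter are handed to JSLint.
var filesToLint = files.filter(shouldLint);
console.log(filesToLint); // ['Scripts/app.js']
```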

Free TFS 2010 and Visual Studio 2010 Webcasts

Too hot to go outside? The upcoming weeks you don’t have to. Just stay inside and spend some time watching the following webcasts:

Testing SharePoint - Functional and Performance Testing for SharePoint-based applications using Visual Studio 2010 (June 29, 2011)
During this free one-hour webcast we’ll demonstrate the SharePoint-specific testing capabilities of Microsoft Visual Studio 2010 and Team Foundation Server. More and more, SharePoint is being leveraged as an application platform within organizations for both internal and external systems. Visual Studio 2010 and Team Foundation Server 2010 include features specifically designed for testing SharePoint-based development efforts.
- Discover new unit testing features designed for use in SharePoint projects
- Witness the use of IntelliTrace for advanced debugging of SharePoint, for both developers and testers, to help find root causes and create actionable bugs
- See how Microsoft Test Manager can be used to man…

Best practices to speed up your website

As mentioned by Jeff Atwood, ‘performance is a feature’. Having a fast and responsive web application really makes the difference. But how do you get there? The best place to start is to have a look at the performance rules list created by the Exceptional Performance team at Yahoo. They have identified a number of best practices for making web pages fast. The list includes 35 best practices divided into 7 categories.

Content
- Minimize HTTP requests
- Reduce DNS lookups
- Avoid redirects
- Make Ajax cacheable
- Post-load components
- Preload components
- Reduce the number of DOM elements
- Split components across domains
- Minimize the number of iframes
- No 404s

Server
- Use a Content Delivery Network
- Add an Expires or a Cache-Control header
- Gzip components
- Configure ETags
- Flush the buffer early
- Use GET for AJAX requests
- Avoid empty image src

Cookie
- Reduce cookie size
…
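Two of the server-side rules (add an Expires or Cache-Control header, gzip components) can be switched on in IIS 7 configuration rather than in code. A minimal sketch for an ASP.NET site’s web.config, assuming IIS 7 or later (the seven-day max-age is just an example value):

```xml
<system.webServer>
  <!-- "Add an Expires or a Cache-Control header": cache static files for 7 days -->
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
  </staticContent>
  <!-- "Gzip components": compress static and dynamic responses -->
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>
```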

Windows Azure: Free Ingress for all Windows Azure Customers

Last week Microsoft announced a change in pricing for the Windows Azure platform. For billing periods that begin on or after July 1, 2011, all inbound data transfers for both peak and off-peak times will be free. This will provide significant cost savings for customers whose cloud applications experience substantial inbound traffic, as well as for customers interested in migrating large quantities of existing data to the cloud. Thank you, Microsoft.

Impress your colleagues with your knowledge about… Visual Studio wildcards support

Last week I discovered a useful feature in Visual Studio for when you have lots of resources. It’s possible to use wildcards to include files in a project. Imagine you have a lot of files you want to add to your project: you can add them all manually, or you can edit the project file once to pick up all of the files and never worry about it again. How do you do this?

1. Right-click on the project in Visual Studio and choose ‘Unload Project’.
2. Right-click on the unloaded project and choose ‘Edit Project’.
3. Add an <ItemGroup/> block with the following:

   <ItemGroup>
     <Content Include="Content\Images\*.png">
       <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
     </Content>
   </ItemGroup>

4. Save and reload the project. All matching files will be included automatically.

I know that by clicking ‘Show All Files’ and selecting the files you want to add you get the same behaviour, but it’s still neat.

TFS Build: Cannot add custom activity to build workflow

After creating a custom Workflow 4 activity, I was planning to add it to our existing build process template. However, after adding the activity to the toolbox in Visual Studio, I couldn’t drop the activity on the workflow designer. The reason is that the workflow designer is not able to find the assembly containing the custom activity. There are multiple solutions out there for this problem. Probably the easiest one is adding the assembly to the GAC. Two other alternatives are:
- Adding the process template to a project and referencing the assembly from this project.
- Adding the assembly to a location where Visual Studio will find it (e.g. ..\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PublicAssemblies).

If you want to know more about customizing the build process template, I can recommend the following blog series by Ewald Hofman:
- Part 1: Introduction
- Part 2: Add arguments and variables
- Part 3: Use more complex arguments
- Part 4: Create your…

Automating your deployment process using Team Foundation Server

You already have a build server up and running? Your code gets compiled and tested in a continuous way? You now want to take this one step further and automate your deployment? Let me introduce you to TFS Deployer:

TFS Deployer is a free tool available on CodePlex. It must be installed as an agent in your test and production environments and supports the execution of PowerShell scripts when an event happens in TFS. In particular, it listens to build quality change notifications, which occur when you change the quality of a build under the Team Builds node of Team Explorer.

How does it work? When TFS Deployer starts up, the service subscribes to the build quality change event notification (1). The release manager then updates the build quality indicator in the build store via Team Explorer (2), the build quality change event is fired (3), the TFS event service looks up who is subscribed to the event (4) and then the TFS Event Service no…

Integrating Microsoft Dynamics AX with Team Foundation Server

If you are interested in knowing how to set up version control for Microsoft Dynamics AX 2009 using Microsoft Visual Studio Team Foundation Server, I recommend having a look at the following whitepaper: http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=9915 . It contains procedures to install and configure version control for Microsoft Dynamics AX 2009 using Microsoft Visual Studio Team Foundation Server, and is designed to be used in conjunction with the Team Server (ID Server) Setup white paper for Microsoft Dynamics AX 2009 and the Team Foundation Installation Guide for Visual Studio Team System.

Silverlight vs HTML 5

For the last few months, a discussion has been going on about what’s the best technology for the future: Silverlight or HTML5. This discussion was further fueled by the recent announcements about Windows 8 focusing on HTML5 and JavaScript. I don’t want to restart the discussion, but if you have to make a decision today, this is a post that can help you compare both technologies: http://blogs.msdn.com/b/eternalcoding/archive/2011/06/13/html-5-vs-silverlight-5.aspx

Generate an MDX query from Microsoft Excel

Writing your own MDX queries can be a painful experience. So most of the time I cheat: I use Excel to connect to the Analysis Services server and use PivotTables to drag and drop the expected result together. But how can you get from this PivotTable to the corresponding MDX query that is used underneath? One way to do this is through the OLAP PivotTable Extensions:

OLAP PivotTable Extensions is an Excel 2007 and Excel 2010 add-in which extends the functionality of PivotTables on Analysis Services cubes. The Excel API has certain PivotTable functionality which is not exposed in the UI. OLAP PivotTable Extensions provides an interface for some of this functionality. It also adds some new features like searching cubes, configuring default settings, and filtering to a list in your clipboard.

Run MDX queries from .NET

SQL Server Analysis Services offers a powerful query language on top of your data warehouse called MDX. But how do you use these MDX queries inside your .NET application? You cannot use your normal ADO.NET classes; instead you have to use Microsoft.AnalysisServices.AdomdClient.dll. You can download the file from here: http://www.microsoft.com/downloads/details.aspx?FamilyId=228DE03F-3B5A-428A-923F-58A033D316E1&displaylang=en

The code itself is somewhat similar to regular ADO.NET:

using (AdomdConnection conn = new AdomdConnection("Data Source=tfsDB;Initial Catalog=Tfs_Analysis;MDX Compatibility=1;"))
{
    conn.Open();
    var mdxQuery = new StringBuilder();
    mdxQuery.Append("WITH ");
    mdxQuery.Append("SET [Last 4 weeks] as Filter([Date].[Date].[Date], [Date].[Date].CurrentMember.Member_Value < Now() AND [Date].[Date].CurrentMember.Member_Value >= DateAdd(\"d\", -28, Now())) ");
    mdxQuery.Append("SELECT NON EMPTY Hierarchize(AddCalculatedMembe…

C:\fakepath in Microsoft Test Manager

After recording a test using Microsoft Test Manager, I noticed that the test failed when I ran it again. I could trace the root cause of the issue to a file upload I did during the test. It seems that when executing a test case with Microsoft Test Runner that contains a step where you have to select a file in a web application, you get “C:\fakepath” as the location of the file you selected. In my case I had to upload a document from the local disk; when I replayed this action I got the C:\fakepath folder instead. Luckily a solution is available. This fakepath comes from Internet Explorer and is a security feature which hides the real path of the selected file. The workaround to get the action recording up and running is to add the site to the trusted sites in the security tab of the IE options.
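The workaround above fixes the recording; on the application side you sometimes also want to show just the file name rather than the browser-mangled path. A small sketch of that idea (my own illustration, not from the post):

```javascript
// Browsers that protect the real path report "C:\fakepath\<name>".
// Strip everything up to the last backslash or slash to get the file name.
function fileNameFromInputValue(value) {
  return value.replace(/^.*[\\\/]/, '');
}

console.log(fileNameFromInputValue('C:\\fakepath\\report.docx')); // "report.docx"
console.log(fileNameFromInputValue('report.docx'));               // "report.docx"
```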

Azure Throughput Analyzer

When hosting a solution on Windows Azure, you can choose between multiple data centers around the world. Do you want to get an indication of which data center gives you the highest throughput? The Microsoft Research eXtreme Computing Group cloud-research engagement team released a desktop utility that measures the upload and download throughput achievable from your on-premise client machine to Azure cloud storage (blobs, tables and queues). You simply install this tool on your on-premise machine, select a data center for the evaluation, and enter the account details of any storage service created within it. The utility will perform a series of data-upload and -download tests using sample data and collect measurements of throughput, which are displayed at the end of the test, along with other statistics. Download the tool here: http://research.microsoft.com/en-us/downloads/5c8189b9-53aa-4d6a-a086-013d927e15a7/default.aspx

ALM Practices every developer should know about

After watching one of Dennis Doomen’s sessions at NDC, I used the power of Google to search his blog. I immediately noticed the following interesting blog series: ALM Practices every developer should know about. Although he doesn’t bring something new to the table, he puts all this information nicely together. Have a look at the different parts:
- Ubiquitous Language
- User Stories
- Modeling the business domain with Domain Models
- Common Code Layout
- Coding Analysis & Guidelines
- Peer Reviews
- Checklists
- Work Item Tracking
- Automatic Builds & Continuous Integration
- Unit Testing & TDD
- Refactoring
- Reducing Technical Debt

Back from NDC 2011

I’m just back from NDC 2011 and again it was a fantastic conference. With great speakers like Scott Guthrie, Robert C. Martin, Douglas Crockford, my very own colleague Gill Cleeren and many more, the quality of the content was excellent. I learned a lot and went home with a bag of new ideas. For the people who couldn’t be there, all sessions were recorded and will be put online. I’ll post a link once they are available. A big thumbs up to the organizers. I’ve already reserved NDC 2012 in my agenda.

New Windows Azure Pricing Calculator

As you have a lot of different options, calculating the cost of your Windows Azure solution seems a complex task. To help you select the right Windows Azure platform offer and estimate your monthly costs, Microsoft launched a new pricing calculator. The pricing calculator lets you pick compute, database, storage, bandwidth, CDN and Service Bus capacity based on your needs. Along with predicting your expected monthly costs, the pricing calculator then recommends the most cost-effective offer for you to purchase Windows Azure platform services. I especially like the support for multiple currencies and the recommendation it makes from the long list of offers. You can access the pricing calculator here: http://www.microsoft.com/windowsazure/pricing-calculator/ .

Entity Framework Connection Strings

One of the annoying things in Entity Framework is that you have to pass an Entity Framework connection string instead of a normal connection string. By default such a connection string is created for you if you use the Entity Framework designer. However, if you don’t use the designer or start changing some stuff, it’s easy to get into trouble. Most of the time I end up building the connection string from code:

private static string CreateConnectionString()
{
    SqlConnectionStringBuilder sqlBuilder = new SqlConnectionStringBuilder();
    sqlBuilder.MultipleActiveResultSets = true;
    sqlBuilder.DataSource = "dbserver";
    sqlBuilder.InitialCatalog = "db";
    sqlBuilder.UserID = "dbuser";
    sqlBuilder.Password = "dbpassword";

    EntityConnectionStringBuilder entityBuilder = new EntityConnectionStringBuilder();
    entityBuilder.ProviderConnectionString = sqlBuilder.ToString();
    entityBuilder.Metadata = "res://*/";
    entityBuilder.Provider = "System.Data.SqlCl…

Test Attachment Cleaner

After we started using the Microsoft Test Manager tools in Visual Studio 2010, I noticed that our databases started growing a lot faster. The reason for this is that the execution of a test run in Team Foundation Server 2010 generates a bunch of diagnostic data, for example IntelliTrace logs (.iTrace), video log files, test results (.trx) and code coverage (.cov) files. The downside of these rich diagnostic data captures is that the volume of the diagnostic data can, over a period of time, grow at a rapid pace. The Team Foundation Server administrator has little or no control over what data gets attached as part of test runs: there are no policy settings to limit the size of the data capture and there is no retention policy to determine how long to hold this data before initiating a cleanup. A few months ago the Test Attachment Cleaner for Visual Studio Ultimate 2010 & Test Professional 2010 power tool was released, which addresses these shortcomings in a command-line tool. Using…

Sharing Windows Azure Drives

One question I get a lot from customers is how to share a Windows Azure drive with read-write access among multiple role instances. At first you might think this is possible, but it is not. A Windows Azure drive does not behave like a normal drive: it uses the concept of a lease, meaning that only one role instance at a time can have write access to a drive. Drives can only be shared between multiple role instances as a read-only snapshot. This of course limits the usability of drives in our cloud applications. However, the Windows Azure Storage team created a blog post explaining an alternative solution that still allows you to share a drive with read-write access. In this solution they use SMB (Server Message Block), the same technology used to implement the Shared Folders/Printers/… functionality in Windows.

Could not load file or assembly ‘XamlServices’

Last week I did some tests with the new Silverlight 5 functionality. Afterwards I reverted the solution back to Silverlight 4, but from that moment on compilation started to fail with ‘Could not load file or assembly XamlServices’ error messages. The strange thing was that all the necessary references were available. After trying a lot of things, I was able to solve the problem in the end by removing my user options file (*.suo) from the solution directory. Visual Studio recreated the file the next time I opened the solution and everything was working again.

Exporting Fiddler Web Recording to a Visual Studio Web Test

Fiddler is a required tool in the toolset of every web developer. It allows you to examine and work with HTTP requests. Although I have been using Fiddler for a long time, I still discover new features every day. As described here, one of the features I didn’t know about is that you can export a Fiddler session as a Visual Studio Web Test (or Web Performance Test). So how do you achieve this?
1. Open up the site you want to record and make a few requests.
2. Go to File – Export Sessions – All Sessions.
3. Select the Visual Studio WebTest export format.
4. Save the file.
5. Open a Visual Studio Test Project.
6. Choose Add Existing Item and add the file you just saved.

OData improvements for Windows Phone 7

Last week the Windows Phone team announced the beta release of Mango. It contains a long list of exciting new features, but the one I was especially interested in is the OData improvements. As we use OData for our TFS Monitor application, I’m happy to see how much easier my life will become. Kind of sad that I have to wait until the fall before I can add this functionality to my application. Oh, before I forget, here is the list with the upcoming OData improvements:

OData Client Library for Windows Phone Is Included in the Windows Phone SDK
In this release, the OData client library for Windows Phone is included in the Windows Phone Developer SDK. It no longer requires a separate download.

Add Service Reference Integration
You can now generate the client data classes used to access an OData service simply by using the Add Service Reference tool in Visual Studio. For more information, see How to: Consume an OData Service for Windows Phone.

Support for LINQ…

WebActivator: How to control the order of execution?

One of the libraries we use during web development is WebActivator. WebActivator was introduced with NuGet to solve the following problem: with the current NuGet version it is not possible to alter existing code. This means that if you have a NuGet package that requires some extra code to be executed (for example in the global.asax), you need to add this extra code yourself, breaking the nice experience that NuGet gives you. To solve this issue, David Ebbo introduced WebActivator. This library allows you to add startup code in separate files instead of having to add it to your global.asax, which overcomes the current limitation of NuGet. To use this library, create an App_Start folder, add a class file to this folder and include the following code:

using System;

[assembly: WebActivator.PreApplicationStartMethod(typeof(App_Start.MySuperPackage), "PreStart")]

namespace App_Start
{
    public static class MySuperPackage
    {
        public static void PreStart()
        {
            // Add your startup code here
        }
    }
}

Maybe it’s time to learn functional programming

After 5 years of object-oriented thinking, I’m trying to wrap my head around the functional style of programming. One of the things I had to read over and over before I started to understand it was the concept of monads. Monads are everywhere in .NET now, and the concept is foundational in many of the really useful libraries and APIs coming out of Microsoft (LINQ, the Task Parallel Library (TPL), Reactive Extensions (Rx), etc.). One of the simplest implementations is the Maybe monad. To help you understand it, I took the Maybe<T> implementation from this post by Jordan Terrell. It works with all .NET types, both value and reference types. Let’s have a look at the following simple sample:

Maybe<string> text = Maybe.Value("Hello, World!");
if (text.HasValue)
{
    Console.WriteLine(text.Value);
}
text = Maybe<string>.NoValue;

At first this seems fairly similar to Nullable<T>, with the only difference that it also works for reference types…
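The sample above relies on Jordan Terrell’s library. To show the idea without that dependency, here is a toy JavaScript translation of the same concept (my own sketch, not the library’s API): a Maybe either wraps a value or represents “no value”, and mapping a function over it only runs the function when a value is present.

```javascript
// A toy Maybe: either wraps a value or represents "no value".
function Maybe(hasValue, value) {
  this.hasValue = hasValue;
  this.value = value;
}

Maybe.value = function (v) { return new Maybe(true, v); };
Maybe.noValue = new Maybe(false, undefined);

// map only applies the function when a value is present,
// so a chain of maps never needs an explicit null check.
Maybe.prototype.map = function (fn) {
  return this.hasValue ? Maybe.value(fn(this.value)) : Maybe.noValue;
};

var text = Maybe.value('Hello, World!');
if (text.hasValue) {
  console.log(text.value); // "Hello, World!"
}

var shouted = text.map(function (s) { return s.toUpperCase(); });
console.log(shouted.value); // "HELLO, WORLD!"

var nothing = Maybe.noValue.map(function (s) { return s.toUpperCase(); });
console.log(nothing.hasValue); // false
```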