Visiting the WCF Data Services Toolkit April 2011 Update

Looking for a way to expose data from Windows Azure Table Storage via an OData web service, I quickly discovered that in its current form, WCF Data Services is primarily meant to expose relational data through the Entity Framework. Since working with Azure Storage – such as Table Storage – typically does not involve a relational data model, OData seemed to be a difficult path. Not using the Entity Framework means you have to implement your own LINQ provider to get the job done. This is not a trivial task, as it involves juggling – or rather messing – around with expression trees.

At this point I discovered the WCF Data Services Toolkit on CodePlex, which circumvents the need to write your own LINQ provider by providing a repository-pattern architecture for you to hook into. I also found documentation to get me started; however, I later discovered that the documentation does not match the current (beta) release of the toolkit.

As a prerequisite, please read the author’s blog post about updating resources with the toolkit, in which he describes the IWriteableRepository interface. Please note that this documentation is partly out of date as of the April update:


IWriteableRepository interface is gone

But fear not. It is a breaking change, but simply removing the interface from the class declaration will do the trick. This works because the methods on your repository are called by convention (i.e. via reflection). This has multiple advantages, which the author mentions at the end of the updating resources post.

CreateDefaultEntity has been renamed to CreateResource

While the previous change might have been obvious, this one is not. But again, it is easy to fix: just rename the method.


This is it for the changes. Some random things I discovered while using the Toolkit:

Make sure to implement CreateResource. In there, simply new up an instance of your entity (or do more advanced stuff if necessary). Otherwise, the toolkit will try to use Activator.CreateInstance() to create an entity instance, which did not work for me because the type to instantiate was in a different assembly. This is a known issue (#12 on CodePlex) and I sent a patch for it to the author.


Just for reference: The list of methods one can implement in the repository:

  1. public void Save(<YourType> entity)
  2. public void Remove(<YourType> entity)
  3. public void CreateRelation(<YourType> entity, <YourOtherType> relatedEntity)
  4. public object CreateResource()
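Putting the changes together, a minimal repository for the April update might look like the sketch below. Movie and Actor are hypothetical entity types of my own; the method names are the ones the toolkit discovers by convention.

```csharp
// Hypothetical repository for the April 2011 toolkit update.
// Note: no IWriteableRepository in the class declaration anymore --
// the toolkit finds these methods by convention (via reflection).
public class MovieRepository
{
    // Called whenever the toolkit needs a fresh entity instance.
    // Implementing this avoids the Activator.CreateInstance() fallback,
    // which fails when the entity type lives in another assembly.
    public object CreateResource()
    {
        return new Movie();
    }

    // Called for inserts and updates.
    public void Save(Movie entity)
    {
        // persist the entity to your backing store,
        // e.g. Azure Table Storage
    }

    // Called for deletes.
    public void Remove(Movie entity)
    {
        // delete the entity from your backing store
    }

    // Called when a relationship between two resources is created.
    public void CreateRelation(Movie entity, Actor relatedEntity)
    {
        // wire up the relationship in your backing store
    }
}
```

Because the methods are resolved by reflection, you only implement the ones your service actually needs – a read-only repository can omit all of them except the query side.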

So far, I really like the toolkit and how it helped me to work with Azure Storage. Keep up the good work!


Azure Tools and Resources

Some time ago, someone requested a sort of overview of Azure features and resources. I thought this was a rather useful idea, since I keep forgetting links to even the most commonly used Azure sites. This is not meant to be an introduction to Windows Azure or cloud computing in general, but rather an opinionated collection of otherwise scattered resources.


Whitepapers are resources too, of course, but I thought I’d give them some extra attention here instead of just adding them to the resources link list – otherwise no one, myself included, would know what’s inside them.

Implementation Whitepapers

  • Programming Blob Storage: Describes the Blob Storage hierarchy, how to work with blobs, use pagination and handle errors.
  • Programming Table Storage: Learn the basics about partition keys, CRUD operations on entities via LINQ, supported data types, best practices and known issues. This will also give you an idea of the difference between relational databases and the Table Storage model.
  • Programming Queue Storage: This is about connecting and sharing information between different application blocks in a loosely coupled way, a.k.a. “the way it’s meant to be”. This is done by pushing messages (processing orders) into a queue and letting some other part of the application deal with them.
  • SQL Azure
    The best way to get started with SQL Azure is probably to already have a basic understanding of SQL Server 2008. If you want to build your tables from scratch, you might want to know about the similarities and differences between SQL Server and SQL Azure. Due to the lack of designer support on SQL Azure (you can use SSMS to get access to your SQL Azure databases, but you can only manipulate table structures by scripts) I prefer to create and test tables on my local dev machine, then use the SQL Azure Migration Wizard to create these tables on my SQL Azure database.
  • Project Houston can do what SSMS doesn’t do for you – provide design capabilities for SQL Azure. It is implemented as a Silverlight client.
  • You know how Skype is able to communicate with other Skype instances running on different machines, all sitting behind firewalls? No? Well that’s exactly what AppFabric Service Bus does. Its primary goal is to bridge on-premise and off-premise solutions.
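To give you a taste of what the Table Storage whitepaper covers, here is a rough sketch of basic CRUD against Table Storage using the StorageClient library that ships with the Windows Azure SDK. The entity type and table name are examples of mine, not taken from the whitepaper.

```csharp
// Sketch: basic Table Storage usage with the Windows Azure SDK's
// StorageClient library (v1.x API). "Movies" and MovieEntity are
// my own example names.
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class MovieEntity : TableServiceEntity
{
    public string Title { get; set; }

    public MovieEntity(string genre, string title)
        : base(genre /* PartitionKey */, title /* RowKey */)
    {
        Title = title;
    }

    public MovieEntity() { } // required for serialization
}

class Program
{
    static void Main()
    {
        // Development storage for local testing; use
        // CloudStorageAccount.Parse(...) for a real account.
        var account = CloudStorageAccount.DevelopmentStorageAccount;
        var client = account.CreateCloudTableClient();
        client.CreateTableIfNotExist("Movies");

        // Insert an entity.
        var context = client.GetDataServiceContext();
        context.AddObject("Movies", new MovieEntity("SciFi", "Metropolis"));
        context.SaveChanges();

        // Query via LINQ. Including the PartitionKey in the filter
        // is the fast path -- one of the whitepaper's best practices.
        var scifi = context.CreateQuery<MovieEntity>("Movies")
                           .Where(m => m.PartitionKey == "SciFi")
                           .ToList();
    }
}
```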


Azure Storage Explorer Have a look at and manipulate your Azure Tables, Blobs and Queues.

SQL Azure Migration Wizard Set up a local database and load it into your SQL Azure database. Or the other way round: Back up your SQL Azure to a local machine. Or to another SQL Azure instance if you feel like it.

Azure Services Management Tools Manage Azure Services such as the .NET Access Control Service and the .NET Workflow Service. Please note that these tools are out of date as of 06.08.2010.

Windows Azure Service Management CmdLets Use PowerShell to script your deployments, upgrades, and scaling of Windows Azure Applications


Videos, Frameworks?

As of 06.08.2010, the Windows Azure Storage Videos are offline but should have been replaced by 15.02.2010. Right. Maybe it’s a typo.

English Blogs

German Blogs

Some other resources

[Update] I believe this to be a quite comprehensive list of Windows Azure resources

Azure Launch Day in Stuttgart Wrap Up

After having listened to Tim Fischer et al, the summary of this conference manifested in my mind like this:

Microsoft still hasn’t decided how to pronounce Azure. Which makes kind of sense since GB and US pronunciations seem to differ quite a bit.

Some other topics were of interest, too 😉 I’d like to give a brief summary here because the conference served as a trigger for me to revisit the latest evolution in Microsoft’s cloud computing.

Different VM Sizes

Azure now lets you choose how much power your hosting instances are sporting. There are four different sizes available:

Name     Price / hour   CPU           RAM       Instance Storage
Small    $0.12          1 x 1.6 GHz   1.75 GB   250 GB
Medium   $0.24          2 x 1.6 GHz   3.5 GB    500 GB
Large    $0.48          4 x 1.6 GHz   7 GB      1,000 GB
X Large  $0.96          8 x 1.6 GHz   14 GB     2,000 GB

Pricing and features are almost identical to Amazon Web Services Standard Instances. For details, see their instance types and pricing.

Upcoming Features

There will be blobs that can be mounted as NTFS drives, called XDrives. As Ray Ozzie said:

Perhaps most significantly is in a new storage type that we call XDrive. Azure XDrives are Azure storage blobs that are mountable as regular NTFS volumes, a drive mapped, cached, durable volume, accessible by mapping an Azure page blob into an NTFS VHD.

He also announced that the Azure portfolio will have a feature that is really more an IaaS feature than a PaaS feature:

As we move forward and think about ways we can simplify being able to take the investments that you’ve made in the Windows Server environment and move them into Windows Azure, one of the things that we’re doing is allowing you to create your own image. We will do this next year. This is another feature that’ll come in 2010. We’ll allow you to create your own image, which has all of the software configured exactly the way you want it.

As far as I know, this feature is called “Virtual Machine Role”, but no one knows for sure. Maybe even Microsoft doesn’t know. And if they do know, they won’t pronounce it. Hell no.

I also heard that in 2010, Worker Roles will be addressable directly from the web without having to route traffic through Web Roles. I didn’t quite understand why there are two different roles, then.


Blobs are really getting useful. I already knew they had the ability to be public or private, but these two new features were news to me:

  • By specifying an HTTP Cache-Control policy for each blob, Azure Blob Storage can be used as a content delivery network
  • Snapshots of blobs can be taken to create read-only backups
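Both features are exposed through the StorageClient library. Here is a small sketch of what using them could look like – the container and blob names are examples, and treat this as an illustration rather than production code:

```csharp
// Sketch using the StorageClient library (v1.x): set a Cache-Control
// header on a blob (useful when serving blobs through the CDN) and
// take a read-only snapshot. "images"/"logo.png" are example names.
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class BlobFeatures
{
    static void Main()
    {
        var account = CloudStorageAccount.DevelopmentStorageAccount;
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference("images");
        container.CreateIfNotExist();

        var blob = container.GetBlobReference("logo.png");
        blob.UploadText("(pretend this is image data)");

        // Let clients and CDN edge nodes cache the blob for one hour.
        blob.Properties.CacheControl = "public, max-age=3600";
        blob.SetProperties();

        // Snapshots are read-only, point-in-time copies of the blob.
        CloudBlob snapshot = blob.CreateSnapshot();
    }
}
```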

Pricing Options

As we were told, there will be different pricing options. One of these options is aimed at systems that already have a certain level of consumption and want better pricing than the very flexible but relatively costly “pay as you grow” strategy offers; it is more like Amazon AWS reserved instances.

And BizSpark Members will get Azure Hosting and SQL Azure for free for 8 months (don’t know details yet).



Compared to Amazon, I like the idea of PaaS (Azure being the first choice for a .NET developer like me). When I want to give one of my ideas a try and build a web application, I surely don’t want to care about all the tedious infrastructure stuff like firewalls, backups, load balancing, security updates etc.

It’s interesting to see Microsoft announce a move towards IaaS this early. This seems to be driven by early customer feedback: there must be a need for more flexible environments, and they don’t want to lose those people to Amazon.

I really dig some of the new features. Good job so far, keep it coming. Looking forward to the EU datacenters.

More links:

Windows Azure Storage at PDC 2009

PDC 2009 (German Blog)

Azure Storage Manager

Recently I’ve been playing around with Windows Azure and wanted to get the log files for my hosted app.

I tried to get the logs using PowerShell, and that worked in one case; on another box I got errors with PowerShell and couldn’t quite tell why. In any case, I found it tedious to set up. What I wanted was a point-and-click solution to get all my logs onto the hard drive. Another time I realized I had created a lot of tables in Azure Table Storage and wanted to clean them up. I was missing a simple tool that would help me with these tasks. So I sat down and fired up Visual Studio.

I gave this app the humble name “Azure Storage Manager” as it can deal with tables as well, at a later point in time possibly even with queues.
Currently it can do this to tables:

  • List
  • Delete

ahem, that’s about it. Now for blobs:

  • List + show properties (well, some)
  • Delete
  • Copy to hard drive
  • All of the above also applies to whole blob containers. This is important because it allows you to get a container full of blobs with one click. Multi selection of blobs and containers is also supported.

It can store and use different account settings, which might come in handy if you happen to have different storage projects on Windows Azure.

This app is completely standalone in that it does NOT require PowerShell or the Azure SDK to be installed. It worked for me under XP, Vista and Windows 7, which is hardly surprising, as this is what the .NET runtime is for. Just wanted to make the point 🙂

To install / download the application, please head over to its ClickOnce installer site.

The following walkthrough shows how the application works.

When you start the application for the first time, it will complain that there are no settings stored. You will be presented with a screen where you can enter the information you were given when you created your Azure Storage project. You need to fill in your account name and shared key; the actual endpoints are inferred from that information. After you press Save, your account settings will be persisted as XML files. For that to work, you should first specify where to put the files using Set Folder.
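Persisting such settings as XML is straightforward with XmlSerializer. The following sketch shows roughly how it could be done – the class and property names are my assumptions for illustration, not the app’s actual code:

```csharp
// Hypothetical sketch of persisting storage account settings as XML
// files in a user-chosen folder. StorageSettings and SettingsStore
// are illustrative names, not the app's real types.
using System.IO;
using System.Xml.Serialization;

public class StorageSettings
{
    public string AccountName { get; set; }
    public string SharedKey { get; set; }
}

public static class SettingsStore
{
    static readonly XmlSerializer Serializer =
        new XmlSerializer(typeof(StorageSettings));

    // One XML file per account, named after the account.
    public static void Save(StorageSettings settings, string folder)
    {
        var path = Path.Combine(folder, settings.AccountName + ".xml");
        using (var stream = File.Create(path))
            Serializer.Serialize(stream, settings);
    }

    public static StorageSettings Load(string path)
    {
        using (var stream = File.OpenRead(path))
            return (StorageSettings)Serializer.Deserialize(stream);
    }
}
```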

Set Settings Folder

In case you already have a folder with settings files in it, simply set the folder and it will show all the settings in the list on the left. Double-clicking on an item in the list will apply these settings to the application, just like pressing Use Setting does.

Settings loaded

After that initial setup, you can switch to the Blob tab, where you will find a listing of your blob containers and blobs. You will have to set a path in order to be able to copy blobs or containers. In this screenshot a container is selected, so any delete or copy operation will apply to the whole container including all contained blobs. Copying at container level appends the container name to the path you specified; in this example it would create a folder C:\AzureBlobs\production to put all the listed blobs in.
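A container-level copy like the one described boils down to a loop over the container’s blobs with the StorageClient library. This is a sketch of the idea, not the app’s actual implementation:

```csharp
// Sketch: download every blob in a container into
// <targetFolder>\<containerName>, mirroring the app's behaviour.
// Uses the StorageClient library (v1.x API).
using System.IO;
using Microsoft.WindowsAzure.StorageClient;

public static class BlobCopy
{
    public static void CopyContainer(CloudBlobContainer container,
                                     string targetFolder)
    {
        // Container-level copy adds the container name to the path.
        var folder = Path.Combine(targetFolder, container.Name);
        Directory.CreateDirectory(folder);

        // Note: ListBlobs() is hierarchical by default; a real
        // implementation would recurse into blob directories.
        foreach (var item in container.ListBlobs())
        {
            var blob = container.GetBlobReference(item.Uri.ToString());
            var fileName = Path.GetFileName(blob.Uri.LocalPath);
            blob.DownloadToFile(Path.Combine(folder, fileName));
        }
    }
}
```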

Container scope

Here, we selected multiple blobs. As you can see, copy and delete now work at blob level. In this case, the container name is excluded from the save path.

Blob Scope

I might post the code for this app in a more technical follow-up, however I have to warn you that according to Phil Haack’s Sample Code Taxonomy, this is still prototype code. It works for me and does its job if you treat it well, but the amount of error handling that is not included in this code is tremendous 😉

What do you think about it?