Windows Azure 101 – My first Windows Azure app

I recently got a beta invite for the Windows Azure hosting and storage services and immediately set about writing a simple hello-world hosted app. I wanted to test both the Web role and the Worker role, so I was looking for a simple scenario that exercised both. I liked the new Windows Live feature of displaying a new background image on its home page. They started doing this for the recently concluded Olympics but have since continued to display different images from around the world. I had written a simple Windows service that checks the page every day to download any new images, so I decided to port this app over to Windows Azure. I could use the Worker role to host my Live image "poller" and store any new images in Windows Azure Blob storage. Then I could write a simple web app to display all the images stored in my blob storage.

Step 1: Download Windows Azure SDK and Visual Studio tools

I downloaded and installed the Windows Azure SDK and the Windows Azure Tools for Visual Studio.

Even if you don't have a Windows Azure invite, you can download the SDK and try it out by developing against the development fabric. The SDK contains all the runtime components needed to host a local development fabric and local storage services.

Step 2: Create a new Windows Azure Cloud Service

I launched VS 2008 and started a new Cloud Service project. From the predefined templates, I chose the Web and Worker Cloud Service template. This gave me a cloud service project with two roles in it (a Web role and a Worker role).


Step 3: Code the Live Search image poller worker role.

The worker role is intended for background processing jobs. Worker roles cannot accept incoming requests but can make unlimited outgoing requests. The roles run in a sandboxed domain, which means they don't have access to the local file system. All storage requirements must be met by using one of the three Windows Azure storage services (Blob, Queue and Table). For this sample, all I needed to do was store the images in the Blob service and add some metadata so that the blobs can be rendered by a browser.

Before downloading any images, I needed to create a unique container in which to store these image blobs. The Windows Azure SDK ships with a very useful REST-based API for programmatically accessing the storage service. Here is the code to do that.


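The original code was an image in the post; what follows is my reconstruction, assuming the StorageClient sample library that shipped with the SDK (exact signatures varied between CTPs). The container name livesearchimages matches the one used later in the post.

```csharp
// Reconstruction, assuming the StorageClient sample library from the SDK.
// BlobStorage/BlobContainer come from the sample library; RoleManager is
// the Windows Azure runtime's logging utility.
private BlobContainer EnsureContainer()
{
    BlobStorage storage = BlobStorage.Create(
        StorageAccountInfo.GetDefaultBlobStorageAccountFromConfiguration());

    BlobContainer container = storage.GetBlobContainer("livesearchimages");
    if (!container.DoesContainerExist())
    {
        // Public access control so browsers can GET the blobs anonymously.
        container.CreateContainer(new NameValueCollection(),
                                  ContainerAccessControl.Public);
        RoleManager.WriteToLog("Information", "Created livesearchimages container");
    }
    return container;
}
```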
BlobStorage and BlobContainer types are part of the REST storage access API, and RoleManager is the Windows Azure logging utility. I set the container visibility to public for ease of access. (Don't bother trying to delete content: only GET requests are anonymous; all other operations require my unique access key.)

The primary job of the poller is to download the page and look for the background image. It then inspects the blob storage to see if the image already exists and, if not, adds it to the container. We also set the blob's content type to image/jpeg so browsers render the blob link as an image.


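Again the original code was lost with the post's images; here is a sketch of what the worker role's Start() loop might have looked like. The page URL, the regex, and the EnsureContainer() helper are my own guesses; only the store-if-new logic and the image/jpeg content type are stated in the post.

```csharp
// Reconstruction of the worker role's Start() loop (2008 CTP worker roles
// derived from RoleEntryPoint). Usings elided into the enclosing file.
public override void Start()
{
    BlobContainer container = EnsureContainer(); // hypothetical helper
    while (true)
    {
        try
        {
            string html;
            using (var client = new WebClient())
                html = client.DownloadString("http://www.live.com");

            // Hypothetical parse: pull the background image URL out of the markup.
            Match m = Regex.Match(html, @"url\(([^)]+\.jpg)\)");
            if (m.Success)
            {
                string imageUrl = m.Groups[1].Value;
                string blobName = Path.GetFileName(new Uri(imageUrl).AbsolutePath);
                if (!container.DoesBlobExist(blobName))
                {
                    byte[] bytes;
                    using (var client = new WebClient())
                        bytes = client.DownloadData(imageUrl);

                    // image/jpeg lets browsers render the blob link directly.
                    var props = new BlobProperties(blobName) { ContentType = "image/jpeg" };
                    container.CreateBlob(props, new BlobContents(bytes), false);
                    RoleManager.WriteToLog("Information", "Stored " + blobName);
                }
            }
        }
        catch (Exception e)
        {
            RoleManager.WriteToLog("Error", e.Message);
        }
        Thread.Sleep(TimeSpan.FromHours(24)); // poll once a day
    }
}
```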
Done. It's that simple. Now details about the blobs in the Live Search container can be accessed via a simple GET request to the container's URL. Here is the list as of this writing.


Step 4: Code the Web app to display the images.

I just wanted a simple ASPX page that lists all the images and their download timestamps. I once again used the REST APIs to download all the blobs from my livesearchimages container and data-bound the results to a simple DataGrid.

Here is the code to retrieve all the blobs. To make data binding easier, I wrapped the contents of each blob into a LiveImageMetadata class.


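This code was also an image in the original post; a plausible reconstruction, again assuming the SDK's StorageClient sample library (ContainerUri and LastModifiedTime are properties I believe the sample library exposed, but I may be off on the exact names):

```csharp
// Reconstruction: list the container's blobs and wrap each one in a
// LiveImageMetadata instance for data binding.
private List<LiveImageMetadata> GetAllImages()
{
    BlobStorage storage = BlobStorage.Create(
        StorageAccountInfo.GetDefaultBlobStorageAccountFromConfiguration());
    BlobContainer container = storage.GetBlobContainer("livesearchimages");

    var images = new List<LiveImageMetadata>();
    foreach (object entry in container.ListBlobs(string.Empty, false))
    {
        var props = entry as BlobProperties;
        if (props == null) continue;
        images.Add(new LiveImageMetadata
        {
            ImageUrl = container.ContainerUri + "/" + props.Name,
            DownloadedOn = props.LastModifiedTime
        });
    }
    return images;
}
```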
The LiveImageMetadata type.


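My sketch of the wrapper; the property names are guesses chosen to line up with the DataGrid columns.

```csharp
// Hypothetical wrapper class for data binding; the original definition
// was not preserved in the post.
public class LiveImageMetadata
{
    public string ImageUrl { get; set; }
    public DateTime DownloadedOn { get; set; }
}
```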
And finally the DataGrid definition.


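The markup was likewise an image in the post; here is a plausible DataGrid definition, assuming the hypothetical LiveImageMetadata property names ImageUrl and DownloadedOn:

```aspx
<asp:DataGrid ID="imagesGrid" runat="server" AutoGenerateColumns="false">
  <Columns>
    <asp:TemplateColumn HeaderText="Image">
      <ItemTemplate>
        <asp:Image runat="server" ImageUrl='<%# Eval("ImageUrl") %>' />
      </ItemTemplate>
    </asp:TemplateColumn>
    <asp:BoundColumn DataField="DownloadedOn" HeaderText="Downloaded On" />
  </Columns>
</asp:DataGrid>
```

In the code-behind, binding would then just be a matter of setting imagesGrid.DataSource to the list of LiveImageMetadata objects and calling imagesGrid.DataBind().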
After testing locally, I updated all references to my cloud service's URL and published the service.




You can view all the downloaded images by visiting the link.


That looks simple, but moving it from my development fabric to the cloud fabric was not such a straightforward experience. I have some feedback for the Windows Azure team on how to make the developer experience simpler.

Some feedback:

  1. What's up with the dependency on SQL Server Express edition? I had full SQL Server on my dev box and had to go through manual steps (neither straightforward nor documented) to get it to work with SQL Server.

  2. The app settings live in two different places when developing for the dev fabric and the cloud fabric. In dev mode, app settings go in web.config and app.config, while for production they have to be entered in the service configuration file. It would be nice if the “Publish” button did that on behalf of the users.

  3. The configuration settings have to be defined in two parts: all keys need to be declared in the ServiceDefinition.csdef file, and the actual name-value pairs defined in the ServiceConfiguration.cscfg file. Seems redundant.

  4. Once deployed, there is no UI indication of the initialization process. It would be good to have a UI (like the one they demonstrated at PDC that showed what state the VMs are in).

  5. There is no online mechanism to view logs. You have to click “Copy Logs” in the configuration UI, wait until the logs are dumped to the blob container, and then use the CloudDrive sample (the sample rocks, btw) to copy the logs locally and inspect them.

  6. There is no log filter. By default it logs everything, so the log XML files get very large, especially when every other second the “<EventProperty Name="Message">Entered GetHealthStatus()</EventProperty>” message is spamming the logs.

  7. It would be nice to have a TDS proxy to connect to the storage services so we can inspect our data via SQL Server Management Studio.
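The redundancy in item 3 can be sketched like this (AccountName is just an example setting name, not necessarily one from my service):

```xml
<!-- ServiceDefinition.csdef: every setting key must be declared here... -->
<ConfigurationSettings>
  <Setting name="AccountName" />
</ConfigurationSettings>

<!-- ...and ServiceConfiguration.cscfg then repeats the key with its value. -->
<ConfigurationSettings>
  <Setting name="AccountName" value="myaccount" />
</ConfigurationSettings>
```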

Having said that, I see the huge potential Windows Azure has. Since the hosting services have .NET 3.5 installed, you can host any (WCF/Silverlight/ASPX/ASMX) services in the cloud. I am going to move my Silverlight projects over to Azure.


Maheshwar Jayaraman
