When you create a Web API service that needs to store large amounts of unstructured data (pictures, videos, documents, etc.), one of the options you can consider is Windows Azure Blob Storage. It provides a fairly straightforward way of storing unstructured data in the cloud. In this post, I’ll show you how to create a simple file service using ASP.NET Web API backed by Azure Blob Storage.
Step 1: Install Azure SDK
First, you need to download and install the Azure SDK for .NET.
Next, you might want to create an Azure Storage account (don’t worry if you don’t have one yet; you can still try out the scenario locally using the Azure Storage Emulator that comes with the Azure SDK. Later, when the app is ready to deploy, you can easily switch it to use a real Azure Storage account).
Step 2: Create an ASP.NET Web API project
To start from scratch, go to File/New/Project in Visual Studio and select “ASP.NET MVC 4 Web Application” with “Web API” as the project template.
* Note that you don’t need to create a “Windows Azure Cloud Service Project” because you can deploy the ASP.NET Web API service as a Web Site on Azure.
First, you need to set the connection string to connect to the Azure blobs. You can just add the following setting to your Web.config – it will use the Azure Storage Emulator.
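A minimal version of that setting might look like this (the connection string name `StorageConnectionString` is my choice; use whatever name your code reads):

```xml
<connectionStrings>
  <!-- UseDevelopmentStorage=true points the client at the local Azure Storage Emulator -->
  <add name="StorageConnectionString"
       connectionString="UseDevelopmentStorage=true" />
</connectionStrings>
```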
Now in order to interact with the Azure Blobs, you can use the CloudBlobClient provided by the Azure SDK. But first you need to add the reference to the following assemblies:
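With the 2012-era Azure SDK for .NET, the references you need would most likely be:

```text
Microsoft.WindowsAzure.StorageClient.dll   (CloudStorageAccount, CloudBlobClient, CloudBlobContainer, ...)
System.Configuration.dll                   (to read the connection string from Web.config)
```

(Exact assembly names depend on the SDK version installed, so treat this list as an assumption.)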
Next, you can create a helper like below to read the connection string from Web.config, new up a CloudBlobClient, and return a container (CloudBlobContainer). The concept of containers is very similar to that of directories in a file system. In this case, the helper is going to create a directory/container called “webapicontainer” to store all the files. Note that container names cannot contain uppercase characters. See this article to learn more about container naming.
The helper below is also giving everyone read access to the blobs in “webapicontainer” so that the files can be downloaded directly using the blob URI. Of course, you can set different permissions on the container depending on your scenarios.
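That helper might look roughly like the following sketch (the class and method names are my placeholders, and the code assumes the 1.x StorageClient API, where the method is `CreateIfNotExist` rather than the later `CreateIfNotExists`):

```csharp
public static class AzureBlobHelper
{
    public static CloudBlobContainer GetWebApiContainer()
    {
        // Read the connection string from Web.config
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString);

        var client = account.CreateCloudBlobClient();
        var container = client.GetContainerReference("webapicontainer");

        // Create the container if it doesn't exist yet
        container.CreateIfNotExist();

        // Give everyone read access to the blobs so that they can be
        // downloaded directly via their URIs
        container.SetPermissions(new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });

        return container;
    }
}
```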
Now, let’s use this helper in the Web API actions. Below, I’ve created a simple FilesController that will support the following actions:
- POST: Uploads files; only the multipart/form-data format is supported
- GET: Lists the files that have been uploaded
Note that I created a custom MultipartFileStreamProvider to actually upload the multipart contents to Azure blobs.
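Put together, the controller and the custom provider might look something like this sketch (the names `AzureBlobMultipartProvider` and `AzureBlobHelper.GetWebApiContainer` are my placeholders; the blob calls assume the 1.x StorageClient API):

```csharp
public class FilesController : ApiController
{
    // POST api/files – streams each multipart part into a block blob
    public async Task<HttpResponseMessage> Post()
    {
        if (!Request.Content.IsMimeMultipartContent("form-data"))
        {
            return Request.CreateResponse(HttpStatusCode.UnsupportedMediaType);
        }

        var provider = new AzureBlobMultipartProvider(AzureBlobHelper.GetWebApiContainer());
        await Request.Content.ReadAsMultipartAsync(provider);

        // Return the URIs of the uploaded blobs
        return Request.CreateResponse(HttpStatusCode.OK, provider.BlobUris);
    }

    // GET api/files – lists the blobs already in the container
    public IEnumerable<string> Get()
    {
        return AzureBlobHelper.GetWebApiContainer()
                              .ListBlobs()
                              .Select(b => b.Uri.ToString());
    }
}

// A custom MultipartStreamProvider that writes each uploaded part
// directly to a blob instead of a local file
public class AzureBlobMultipartProvider : MultipartStreamProvider
{
    private readonly CloudBlobContainer _container;
    public List<string> BlobUris = new List<string>();

    public AzureBlobMultipartProvider(CloudBlobContainer container)
    {
        _container = container;
    }

    public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
    {
        // Use the file name from the Content-Disposition header as the blob name
        string fileName = headers.ContentDisposition.FileName.Trim('"');
        var blob = _container.GetBlockBlobReference(fileName);
        BlobUris.Add(blob.Uri.ToString());

        // The framework copies the part's body into this stream
        return blob.OpenWrite();
    }
}
```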
* Here I only implemented two actions to keep the sample clear and simple; you can implement more actions, such as Delete, in a similar fashion – get a blob reference from the container and call Delete on it.
Step 3: Trying it out
First, you need to start the Azure Storage Emulator. You can do this from Server Explorer: under the “Windows Azure Storage” node, just right-click and refresh the “(Development)” node, which will start the Azure Storage Emulator.
Once the Azure Storage Emulator and the Web API service are up and running, you can start uploading files. Here I used Fiddler to do that.
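The raw request you compose in Fiddler would look roughly like this (host, port, boundary, and file name are placeholders):

```http
POST http://localhost:50231/api/files HTTP/1.1
Content-Type: multipart/form-data; boundary=----SampleBoundary

------SampleBoundary
Content-Disposition: form-data; name="file"; filename="samplePresentation.pptx"
Content-Type: application/vnd.openxmlformats-officedocument.presentationml.presentation

<binary file content>
------SampleBoundary--
```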
After the upload is complete, we can issue a GET request to the FilesController to get a list of files that have been uploaded. From the result below we can see there’s one file uploaded so far. And the file can be downloaded at Location: http://127.0.0.1:10000/devstoreaccount1/webapicontainer/samplePresentation.pptx.
Alternatively, we can look through the Server Explorer to see that the file has in fact been uploaded to blob storage.
Switching to a real Azure Blob Storage
When everything is ready to be deployed, you can simply update the connection string in Web.config to use the real Azure Storage account. Here is an example of what the connection string would look like:
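Assuming the connection string is named `StorageConnectionString` as in the emulator setup, it would be along these lines (with `[AccountName]` and `[AccountKey]` replaced by your own values):

```xml
<connectionStrings>
  <add name="StorageConnectionString"
       connectionString="DefaultEndpointsProtocol=http;AccountName=[AccountName];AccountKey=[AccountKey]" />
</connectionStrings>
```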
The DefaultEndpointsProtocol=http setting tells the CloudBlobClient to use the default HTTP endpoint for blobs, which is http://[AccountName].blob.core.windows.net/.
The AccountName is simply the name of the storage account.
The AccountKey can be obtained from the Azure portal – just browse to the Storage section and click on “MANAGE KEYS”.
Increasing the maxRequestLength and maxAllowedContentLength
If you’re uploading large files, consider increasing the maxRequestLength setting in ASP.NET, which defaults to 4MB. The following setting in Web.config should do the trick to increase it to 2GB.
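For example:

```xml
<system.web>
  <!-- maxRequestLength is in kilobytes: 2097152 KB = 2 GB -->
  <httpRuntime maxRequestLength="2097152" />
</system.web>
```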
If you’re on IIS, you might also want to increase the maxAllowedContentLength. Note that maxAllowedContentLength is in bytes whereas maxRequestLength is in kilobytes.
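The corresponding IIS setting would be:

```xml
<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is in bytes: 2147483648 bytes = 2 GB -->
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
</system.webServer>
```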
7/3/12: The code in this post can be downloaded from: http://code.msdn.microsoft.com/Uploading-large-files-386ec0af. It is using ASP.NET Web API nightly build packages.
8/29/12: I’ve updated the sample solution to use the released version (RTM) of ASP.NET Web API. I’ve also reduced the project size by removing the packages, so please make sure you enable NuGet Package Restore (Inside VS, go to Tools -> Options... -> Package Manager -> check "Allow NuGet to download missing packages during build" option).
Hope you find this helpful,