Kirk Evans Blog

.NET From a Markup Perspective

Securely Upload to Azure Storage with Angular

This post will show you how to securely upload blob content to Azure Storage from an Angular app. 


The source code for this solution is available at https://github.com/kaevans/globalscaledemo

Background

Our team has been busy the past few months traveling the globe and hosting readiness workshops for our top global system integrator partners.  One of the sessions that I wrote for the workshop, Architecting Global Scale Solutions, includes a demonstration of an application that operates on a global scale… you can deploy it to as many Azure regions as you want and it will scale horizontally across all of them.  One of the themes in the talk is about performance and the things that limit the ability to scale linearly as the amount of work increases.  The scenario is a web application that lets authenticated users upload a photograph.

Think about the problem for a moment and it will be evident.  If a user uploads a photo to our web server, then this can have drastic repercussions on our web application.  Our web application would need to read the HTTP request, de-serialize the byte stream, and then move that stream to some type of storage.  If we stream to the local disk, then we are potentially creating a bottleneck for IO on the local disk.  If we load the image into memory, then this will cause performance issues as the amount of work increases.  If we save to some external store, we likely first have to de-serialize into memory and then call to an external network resource… we now have a potentially unavailable service along with a network bottleneck.  Ideally, we would love to avoid dealing with the problem at all and enable users to upload directly to storage.  But how do we do this securely?

Our solution will authenticate our users with Azure AD.  Azure Storage does not provide per-user authentication and does not integrate with Azure AD for authentication.  Rather than put our storage key in JavaScript (where anyone could easily obtain it and disclose it to non-authenticated users), we want to use an ad-hoc Shared Access Signature.  This allows us to grant a limited permission set to a limited set of resources for a limited amount of time.  In our case, we will only allow writing to a single blob container for 2 minutes.  In order to obtain that SAS URL, the user has to be authenticated, which we will handle using the Active Directory Authentication Library (ADAL) for JavaScript.
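To make that concrete: the SAS token is just a query string appended to the blob URL, carrying the permissions, expiry, and signature.  A minimal sketch (the URL and token values here are made up for illustration):

```javascript
// Sketch: combining the blob URL with the SAS token returned by our
// Web API.  The query parameters are the ones Azure Storage uses:
// sv (service version), se (expiry), sp (permissions), sig (the HMAC
// signature computed from the storage account key on the server).
function buildSasUrl(blobUrl, sasToken) {
    // The token generated by the storage SDK already begins with '?'
    return blobUrl + sasToken;
}

// Hypothetical example values
var url = buildSasUrl(
    'https://myaccount.blob.core.windows.net/uploads/photo.jpg',
    '?sv=2015-04-05&se=2015-11-24T20%3A22%3A00Z&sp=w&sig=abc123');
```

The client never sees the storage key itself, only this short-lived signed URL.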

Patterns and Practices

If you are working with distributed systems, you should be aware of the work that the Patterns and Practices team has done to identify common patterns and prescriptive architecture guidance for cloud applications.  This post will leverage two of those patterns: Federated Identity Pattern and Valet Key Pattern.  I highly suggest that you spend time reading through these patterns, if for no other reason than to become familiar with problems you weren’t aware existed and to help you design and build more reliable systems.

Authenticating Users

Rather than start from scratch, I started from an existing repository (https://github.com/Azure-Samples/active-directory-angularjs-singlepageapp) that shows how to integrate Azure AD into an AngularJS single page app.  That repository has a complete walkthrough of how to create the single page app and how to register the app in Azure AD.  For brevity’s sake, refer to that repo to understand how to create an Angular app that authenticates to Azure AD.  I have also written an example, The API Economy: Consuming Our Web API from a Single Page App, which shows the value of using Azure AD to authenticate to a custom Web API as well as downstream services such as Office 365.


In the active-directory-angularjs-singlepageapp repo you will see that you have to hard code the client ID into the app.js file.  My app is automatically deployed from GitHub, so I don’t want to create a process that updates the app.js file.  Instead, I’d rather just update the appSettings for my Azure web app.  To enable this, I created a model class that contains the information needed in the app.js file for ADAL.js.

ADALConfigResponse.cs
  1. using GlobalDemo.DAL;
  2. using System;
  3. using System.Collections.Generic;
  4. using System.Linq;
  5. using System.Web;
  6.  
  7. namespace GlobalDemo.Web.Models
  8. {
  9.     public class ADALConfigResponse
  10.     {
  11.         public string ClientId { get { return SettingsHelper.Audience; }  }
  12.         public string Tenant { get { return SettingsHelper.Tenant; } }
  13.         public string Instance { get { return "https://login.microsoftonline.com/"; } }
  14.     }
  15. }

Next, I created a Web API in my project that will read the appSettings and return the client ID.

ADALConfigController.cs
  1. using GlobalDemo.Web.Models;
  2. using System;
  3. using System.Collections.Generic;
  4. using System.Linq;
  5. using System.Net;
  6. using System.Net.Http;
  7. using System.Threading.Tasks;
  8. using System.Web.Http;
  9. using System.Web.Http.Description;
  10.  
  11. namespace GlobalDemo.Web.Controllers
  12. {
  13.     public class ADALConfigController : ApiController
  14.     {
  15.         [ResponseType(typeof(ADALConfigResponse))]
  16.         public IHttpActionResult Get()
  17.         {
  18.             return Ok(new ADALConfigResponse());
  19.         }
  20.     }
  21. }

I suck at Angular (and JavaScript in general), so I modified the adal-angular.js file, creating a new file, “adal-angular-modified.js”, that calls the Web API endpoint (ADALConfigController.cs above) in my web application to retrieve the configuration information.  Only the relevant portions are shown here.

adal-angular-modified
  1. /*
  2.     Modified 11/24/2015 by @kaevans
  3.     Call a service to configure ADAL instead of
  4.     providing a hard coded value
  5. */
  6. if (configOptions.clientId) {
  7.     //Use the value provided by the user
  8. }
  9. else {
  10.     //HACK.  Call a service using jQuery.  Got sick of fighting
  11.     //angular injector.  
  12.     var myConfig = { instance: "", clientId: "", tenant: "" };
  13.     var resultText = $.ajax(
  14.         {
  15.             url: '/api/adalconfig',
  16.             dataType: 'json',
  17.             async: false
  18.         }).responseText;
  19.  
  20.     console.log(resultText);
  21.     myConfig = $.parseJSON(resultText);
  22.     
  23.     configOptions.clientId = myConfig.ClientId;
  24.     configOptions.instance = myConfig.Instance;
  25.     configOptions.tenant = myConfig.Tenant;
  26. }
  27. /* End of modification */

There’s probably a much cleaner way to do this, a more “Angular-y” way, but this worked.  If anyone has a better approach, please submit a pull request!  Now I don’t have to embed the configuration details in the JavaScript file itself; my Web API will provide those for me.
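For reference, the merge that adal-angular-modified.js performs boils down to a few assignments; as a small standalone function (a sketch, not code from the repo):

```javascript
// Sketch of the merge logic in adal-angular-modified.js: only fall
// back to the server-provided values when no clientId was hard coded.
// The PascalCase property names match the ADALConfigResponse model.
function applyAdalConfig(configOptions, serverConfig) {
    if (!configOptions.clientId) {
        configOptions.clientId = serverConfig.ClientId;
        configOptions.instance = serverConfig.Instance;
        configOptions.tenant = serverConfig.Tenant;
    }
    return configOptions;
}
```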

app.js
  1. 'use strict';
  2. angular.module('todoApp', ['ngRoute','AdalAngular','azureBlobUpload'])
  3. .config(['$routeProvider', '$httpProvider', 'adalAuthenticationServiceProvider', function ($routeProvider, $httpProvider, adalProvider) {
  4.  
  5.     $routeProvider.when("/Home", {
  6.         controller: "homeCtrl",
  7.         templateUrl: "/App/Views/Home.html",
  8.     }).when("/MyPhotos", {
  9.         controller: "myPhotosCtrl",
  10.         templateUrl: "/App/Views/MyPhotos.html",
  11.         requireADLogin: true,
  12.     }).when("/Upload", {
  13.         controller: "uploadCtrl",
  14.         templateUrl: "/App/Views/Upload.html",
  15.         requireADLogin: true,
  16.     }).when("/UserData", {
  17.         controller: "userDataCtrl",
  18.         templateUrl: "/App/Views/UserData.html",
  19.     }).otherwise({ redirectTo: "/Home" });
  20.  
  21.     adalProvider.init(
  22.         {
  23.             instance: '',
  24.             tenant: '',
  25.             clientId: '',
  26.             extraQueryParameter: 'nux=1',
  27.             cacheLocation: 'localStorage', // enable this for IE, as sessionStorage does not work for localhost.
  28.         },
  29.         $httpProvider
  30.         );
  31.    
  32. }]);

The user is now required to authenticate with Azure AD when accessing Upload.html or MyPhotos.html.  This will obtain an OAuth token using the implicit grant flow (make sure you follow the directions in the active-directory-angularjs-singlepageapp repo to update the manifest and enable this).  This is an example of the Federated Identity Pattern: it offloads the authentication and administration functions from your app and lets your app focus on its core functions.
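As a refresher on what the implicit grant means for the client: the token comes back in the URL fragment rather than in a POST body, and ADAL.js parses window.location.hash to extract it.  A simplified sketch of that parsing (not the library's actual code):

```javascript
// Simplified sketch of implicit-grant handling: the access token is
// returned in the URL fragment, e.g. #access_token=eyJ...&token_type=Bearer
function parseHashFragment(hash) {
    var result = {};
    hash.replace(/^#/, '').split('&').forEach(function (pair) {
        var parts = pair.split('=');
        if (parts.length === 2) {
            result[decodeURIComponent(parts[0])] = decodeURIComponent(parts[1]);
        }
    });
    return result;
}
```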

Setting Up Storage and CORS

In order for our application to work, we need to make sure that the storage container exists and the storage account enables CORS (which is disabled by default).  To do this, I created a class, StorageConfig.cs, that is called when the application is started.  You might want to provide a more restricted CORS configuration.  Our app doesn’t need all of the CORS methods, so we should probably remove a few (like GET, HEAD, and POST, since the client only needs PUT to upload).  Further, we could restrict the allowed origins as well.

Setting up Storage
  1. using Microsoft.WindowsAzure.Storage;
  2. using Microsoft.WindowsAzure.Storage.Blob;
  3. using Microsoft.WindowsAzure.Storage.Shared.Protocol;
  4. using System.Collections.Generic;
  5. using System.Threading.Tasks;
  6.  
  7. namespace GlobalDemo.Web
  8. {
  9.     public static class StorageConfig
  10.     {
  11.         /// <summary>
  12.         /// Configures the storage account used by the application.
  13.         /// Configures to support CORS, and creates the blob, table,
  14.         /// and queue needed for the app if they don't already exist.
  15.         /// </summary>
  16.         /// <param name="storageConnectionString">The storage account connection string</param>
  17.         /// <returns></returns>
  18.         public static async Task Configure(string storageConnectionString)
  19.         {
  20.             
  21.             var account = CloudStorageAccount.Parse(storageConnectionString);
  22.             var client = account.CreateCloudBlobClient();
  23.             var serviceProperties = await client.GetServicePropertiesAsync();
  24.  
  25.             //Configure CORS
  26.             serviceProperties.Cors = new CorsProperties();
  27.             serviceProperties.Cors.CorsRules.Add(new CorsRule()
  28.             {
  29.                 AllowedHeaders = new List<string>() { "*" },
  30.                 AllowedMethods = CorsHttpMethods.Put | CorsHttpMethods.Get | CorsHttpMethods.Head | CorsHttpMethods.Post,
  31.                 AllowedOrigins = new List<string>() { "*" },
  32.                 ExposedHeaders = new List<string>() { "*" },
  33.                 MaxAgeInSeconds = 3600 // 60 minutes
  34.             });
  35.  
  36.             await client.SetServicePropertiesAsync(serviceProperties);
  37.             
  38.             //Create the public container if it doesn't exist as publicly readable
  39.             var container = client.GetContainerReference(GlobalDemo.DAL.Azure.StorageConfig.PhotosBlobContainerName);
  40.             await container.CreateIfNotExistsAsync(BlobContainerPublicAccessType.Container, new BlobRequestOptions(), new OperationContext { LogLevel = LogLevel.Informational });
  41.  
  42.             //Create the thumbnail container if it doesn't exist as publicly readable
  43.             container = client.GetContainerReference(GlobalDemo.DAL.Azure.StorageConfig.ThumbnailsBlobContainerName);
  44.             await container.CreateIfNotExistsAsync(BlobContainerPublicAccessType.Container, new BlobRequestOptions(), new OperationContext { LogLevel = LogLevel.Informational });
  45.  
  46.             //Create the private user uploads container if it doesn't exist
  47.             container = client.GetContainerReference(GlobalDemo.DAL.Azure.StorageConfig.UserUploadBlobContainerName);
  48.             await container.CreateIfNotExistsAsync();
  49.  
  50.             //Create the "uploadqueue" queue if it doesn't exist             
  51.             var queueClient = account.CreateCloudQueueClient();
  52.             var queue = queueClient.GetQueueReference(GlobalDemo.DAL.Azure.StorageConfig.QueueName);
  53.             await queue.CreateIfNotExistsAsync();
  54.  
  55.             //Create the "photos" table if it doesn't exist
  56.             var tableClient = account.CreateCloudTableClient();
  57.             var table = tableClient.GetTableReference(GlobalDemo.DAL.Azure.StorageConfig.TableName);
  58.             await table.CreateIfNotExistsAsync();
  59.         }
  60.     }
  61. }

Something to point out… our client will upload to the container named “uploads” (line 47 above).  The other two containers are marked as public access, but this one is not.  It is only available by going through our server-side application first.

Obtaining a SAS URL

The JavaScript client needs to obtain a SAS URL.  We only want authenticated users to have the ability to upload.  We will use the Valet Key Pattern to obtain a SAS URL and return it to clients via an authenticated Web API.


We do that through a Web API controller, UploadController.cs.

UploadController.cs
  1. using GlobalDemo.DAL;
  2. using GlobalDemo.DAL.Azure;
  3. using GlobalDemo.Web.Models;
  4. using Microsoft.WindowsAzure.Storage;
  5. using System;
  6. using System.Configuration;
  7. using System.Security.Claims;
  8. using System.Text.RegularExpressions;
  9. using System.Threading.Tasks;
  10. using System.Web.Http;
  11. using System.Web.Http.Description;
  12.  
  13. namespace GlobalDemo.Web.Controllers
  14. {
  15.     [Authorize]    
  16.     public class UploadController : ApiController
  17.     {
  18.         /// <summary>
  19.         /// Gets a SAS token to add files to blob storage.
  20.         /// The SAS token is good for 2 minutes.
  21.         /// </summary>
  22.         /// <returns>String for the SAS token</returns>
  23.         [ResponseType(typeof(StorageResponse))]
  24.         [HttpGet]
  25.         [Route("api/upload/{extension}")]
  26.         public IHttpActionResult Get(string extension)
  27.         {
  28.             Regex rg = new Regex(@"^[a-zA-Z0-9]{1,3}$");
  29.             if(!rg.IsMatch(extension))
  30.             {
  31.                 throw new HttpResponseException(System.Net.HttpStatusCode.BadRequest);
  32.             }
  33.  
  34.             string connectionString = SettingsHelper.LocalStorageConnectionString;
  35.             var account = CloudStorageAccount.Parse(connectionString);
  36.             StorageRepository repo = new StorageRepository(account);
  37.  
  38.             //Get the SAS token for the container.  Allow writes for 2 minutes
  39.             var sasToken = repo.GetBlobContainerSASToken();
  40.  
  41.             //Get the blob so we can get the full path including container name
  42.             var id = Guid.NewGuid().ToString();
  43.             var newFileName = id + "." + extension;
  44.  
  45.             string blobURL = repo.GetBlobURI(
  46.                 newFileName,
  47.                 DAL.Azure.StorageConfig.UserUploadBlobContainerName).ToString();
  48.  
  49.  
  50.             //This function determines which storage account the blob will be
  51.             //uploaded to, enabling the future possibility of sharding across
  52.             //multiple storage accounts.
  53.             var client = account.CreateCloudBlobClient();
  54.  
  55.             var response = new StorageResponse
  56.             {
  57.                 ID = id,
  58.                 StorageAccountName = client.BaseUri.Authority.Split('.')[0],
  59.                 BlobURL = blobURL,                
  60.                 BlobSASToken = sasToken,
  61.                 ServerFileName = newFileName
  62.             };
  63.  
  64.             return Ok(response);
  65.         }

Rather than put all of the logic into the controller itself, I created a repository class as a dumping ground for all operations related to storage.

GetBlobContainerSASToken
  1.         /// <summary>
  2.         /// Gets a blob container's SAS token
  3.         /// </summary>
  4.         /// <param name="containerName">The container name</param>
  5.         /// <param name="permissions">The permissions</param>
  6.         /// <param name="minutes">Number of minutes the permissions are effective</param>
  7.         /// <returns>System.String - The SAS token</returns>
  8.         public string GetBlobContainerSASToken(            
  9.             string containerName,
  10.             SharedAccessBlobPermissions permissions,
  11.             int minutes)
  12.         {
  13.  
  14.             var client = _account.CreateCloudBlobClient();
  15.  
  16.             var policy = new SharedAccessBlobPolicy();
  17.  
  18.             policy.Permissions = permissions;
  19.             policy.SharedAccessStartTime = System.DateTime.UtcNow.AddMinutes(-5); //allow for clock skew
  20.             policy.SharedAccessExpiryTime = System.DateTime.UtcNow.AddMinutes(minutes);
  21.  
  22.             var container = client.GetContainerReference(containerName);
  23.  
  24.             //Get the SAS token for the container.
  25.             var sasToken = container.GetSharedAccessSignature(policy);
  26.  
  27.             return sasToken;
  28.         }
  29.  
  30.         /// <summary>
  31.         /// Gets the blob container's SAS token without any parameters.
  32.         /// Defaults are Write permissions for 2 minutes
  33.         /// </summary>
  34.         /// <returns>System.String - the SAS token</returns>
  35.         public string GetBlobContainerSASToken()
  36.         {
  37.             return GetBlobContainerSASToken(                
  38.                 DAL.Azure.StorageConfig.UserUploadBlobContainerName,
  39.                 SharedAccessBlobPermissions.Write,
  40.                 2);
  41.         }

Notice that line 15 has the Authorize attribute applied to the controller.  This ensures that only authenticated users can access this API, so you must have an OAuth token that the app trusts in order to obtain the SAS URL.  That OAuth token is handled in the Startup.Auth.cs class, which sets up Bearer authentication.

Bearer Authentication
  1. using System;
  2. using System.Threading.Tasks;
  3. using Microsoft.Owin;
  4. using Owin;
  5. using Microsoft.Owin.Security.ActiveDirectory;
  6. using System.Configuration;
  7. using GlobalDemo.DAL;
  8.  
  9. namespace GlobalDemo.Web
  10. {
  11.     public partial class Startup
  12.     {
  13.         public void ConfigureAuth(IAppBuilder app)
  14.         {
  15.             app.UseWindowsAzureActiveDirectoryBearerAuthentication(
  16.                 new WindowsAzureActiveDirectoryBearerAuthenticationOptions
  17.                 {                    
  18.                     Tenant = SettingsHelper.Tenant,
  19.                     TokenValidationParameters = new System.IdentityModel.Tokens.TokenValidationParameters
  20.                     {
  21.                         ValidAudience = SettingsHelper.Audience
  22.                     }
  23.                     
  24.                 });
  25.         }
  26.  
  27.     }
  28. }

The Angular client can now call the service to obtain the SAS URL.

uploadSvc.js
  1. 'use strict';
  2. angular.module('todoApp')
  3. .factory('uploadSvc', ['$http', function ($http) {
  4.     return {
  5.         getSASToken: function (extension) {
  6.             return $http.get('/api/Upload/' + extension);
  7.         },
  8.         postItem: function (item) {
  9.             return $http.post('/api/Upload/', item);
  10.         }
  11.     };
  12. }]);

This is the Valet Key Pattern: the client never receives the storage key, but rather a short-lived SAS URL with limited permission to access a resource.
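Put another way, once the client holds the valet key, the upload is just an ordinary HTTPS PUT with the SAS token in the query string.  A sketch of the request shape (the x-ms-blob-type header is what the blob REST API expects for a block blob):

```javascript
// Sketch: the request the client can make with the valet key.  No
// storage account key is involved; the SAS token in the query string
// authorizes just this container, for write only, for the token's
// short lifetime.
function buildUploadRequest(blobUrl, sasToken) {
    return {
        method: 'PUT',
        url: blobUrl + sasToken,
        headers: { 'x-ms-blob-type': 'BlockBlob' }
    };
}
```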

The Angular Client

The client references a series of scripts, including app.js and uploadsvc.js that we’ve shown previously.

JavaScript references
  1. <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
  2.         <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.2.25/angular.min.js"></script>
  3.         <script src="https://code.angularjs.org/1.2.25/angular-route.js"></script>
  4.         <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.2.0/js/bootstrap.min.js"></script>
  5.         <script src="https://secure.aadcdn.microsoftonline-p.com/lib/1.0.7/js/adal.min.js"></script>
  6.         
  7.         <!-- Modified version of adal-angular -->
  8.         <script src="App/Scripts/adal-angular-modified.js"></script>
  9.  
  10.         <script src="App/Scripts/app.js"></script>
  11.         <script src="App/Scripts/azure-blob-upload.js"></script>
  12.         <script src="App/Scripts/homeCtrl.js"></script>
  13.         <script src="App/Scripts/homeSvc.js"></script>
  14.         <script src="App/Scripts/userDataCtrl.js"></script>
  15.         <script src="App/Scripts/myPhotosCtrl.js"></script>
  16.         <script src="App/Scripts/myPhotosSvc.js"></script>
  17.         <script src="App/Scripts/uploadCtrl.js"></script>
  18.         <script src="App/Scripts/uploadSvc.js"></script>

There is one script to call out, azure-blob-upload.js (line 11).  That is an Angular library by Stephen Brannan that wraps the Azure Storage blob upload operations into an Angular module.  We create an Angular controller that calls the upload service to obtain the SAS URL, and then calls the azureBlob module to actually upload to Azure Storage.

uploadCtrl.js
  1. 'use strict';
  2. angular.module('todoApp')
  3. .controller('uploadCtrl', ['$scope', '$location', 'azureBlob', 'uploadSvc', 'adalAuthenticationService', function ($scope, $location, azureBlob, uploadSvc, adalService) {
  4.     $scope.error = "";
  5.     $scope.sasToken = "";
  6.     $scope.config = null;
  7.     $scope.uploadComplete = false;
  8.     $scope.progress = 0;
  9.     
  10.     $scope.cancellationToken = null;
  11.     
  12.     $scope.fileChanged = function () {
  13.         console.log($scope.file.name);
  14.     };
  15.  
  16.     $scope.upload = function () {
  17.  
  18.         var myFileTemp = document.getElementById("myFile");
  19.         console.log("File name: " + myFileTemp.files[0].name);
  20.         var extension = myFileTemp.files[0].name.split('.').pop();
  21.  
  22.         uploadSvc.getSASToken(extension).success(function (results) {
  23.                       
  24.             console.log("SASToken: " + results.BlobSASToken);
  25.             
  26.             $scope.config =
  27.             {
  28.                 baseUrl: results.BlobURL,
  29.                 sasToken: results.BlobSASToken,
  30.                 file: myFileTemp.files[0],
  31.                 blockSize: 1024 * 32,
  32.  
  33.                 progress: function (amount) {                    
  34.                     console.log("Progress - " + amount);
  35.                     $scope.progress = amount;
  36.                     console.log(amount);
  37.                 },
  38.                 complete: function () {
  39.                     console.log("Completed!");
  40.                     $scope.progress = 99.99;
  41.                     uploadSvc.postItem(
  42.                         {
  43.                             'ID' : results.ID,
  44.                             'ServerFileName': results.ServerFileName,
  45.                             'StorageAccountName': results.StorageAccountName,
  46.                             'BlobURL': results.BlobURL
  47.                         }).success(function () {
  48.                         $scope.uploadComplete = true;
  49.                     }).error(function (err) {
  50.                         console.log("Error - " + err);
  51.                         $scope.error = err;
  52.                         $scope.uploadComplete = false;
  53.                     });                    
  54.                 },
  55.                 error: function (data, status, err, config) {
  56.                     console.log("Error - " + data);
  57.                     $scope.error = data;
  58.                 }
  59.             };
  60.             azureBlob.upload($scope.config);
  61.  
  62.         }).error(function (err) {            
  63.             $scope.error = err;
  64.         });
  65.     };
  66.  
  67. }])

The upload is performed on line 60.  Internally, the azureBlob module uses the putBlock and putBlockList APIs for Azure Storage to upload a block blob.  We can do that from a JavaScript client because we’ve enabled CORS.
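For the curious, those two operations are plain PUTs distinguished by a comp query parameter appended after the SAS token.  A sketch of how the URLs are formed (the zero-padding scheme here is illustrative; azure-blob-upload has its own):

```javascript
// Sketch of the Put Block / Put Block List URLs.  Block IDs must be
// base64 encoded, and all IDs within one blob must be the same length,
// hence the zero padding before encoding.
function blockId(index) {
    var padded = ('000000' + index).slice(-6);
    // btoa in the browser; Buffer in Node
    return (typeof btoa === 'function')
        ? btoa(padded)
        : Buffer.from(padded).toString('base64');
}

function putBlockUrl(blobUrl, sasToken, index) {
    return blobUrl + sasToken + '&comp=block&blockid=' +
        encodeURIComponent(blockId(index));
}

function putBlockListUrl(blobUrl, sasToken) {
    // The request body is an XML list of the block IDs to commit
    return blobUrl + sasToken + '&comp=blocklist';
}
```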


When the upload is complete, the Angular client sends a message back to the Web API to indicate the upload is complete and is available for further processing (line 41).  That Web API looks like this:

CompleteRequest
  1.         /// <summary>
  2.         /// Notify the backend that a new file was uploaded
  3.         /// by sending a queue message.
  4.         /// </summary>
  5.         /// <param name="item">The completed upload to be processed</param>
  6.         /// <returns>Void</returns>
  7.         public async Task Post(CompleteRequest item)
  8.         {
  9.             string owner = ClaimsPrincipal.Current.FindFirst(ClaimTypes.NameIdentifier).Value;
  10.             //Get the owner name field
  11.             string ownerName = ClaimsPrincipal.Current.FindFirst("name").Value;
  12.             //Replace any commas with periods
  13.             ownerName = ownerName.Replace(',', '.');
  14.  
  15.             string message = string.Format("{0},{1},{2},{3},{4},{5}", item.ID, item.ServerFileName, item.StorageAccountName, owner, ownerName, item.BlobURL);
  16.  
  17.             //Send a queue message to each storage account registered
  18.             //in AppSettings prefixed with "Storage"
  19.             
  20.  
  21.             foreach (string key in ConfigurationManager.AppSettings.Keys)
  22.             {
  23.                 if(key.ToLower().StartsWith("storage"))
  24.                 {
  25.                     //This is a storage configuration  
  26.                     var repo = new StorageRepository(ConfigurationManager.AppSettings[key]);
  27.                     await repo.SendQueueMessageAsync(message);                    
  28.                 }
  29.             }
  30.             
  31.             
  32.         }

All this code is doing is sending a queue message to every storage account in appSettings, which allows us to send a message to every region around the world if we so desire.
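Since the queue message is a simple comma-delimited string, the comma-to-period replacement on the owner name is what keeps the message parseable downstream.  A sketch of the round trip (field order follows the string.Format call above):

```javascript
// Sketch of the comma-delimited queue message the Web API sends.  The
// field order matches the string.Format call in the Post method; the
// owner name has its commas replaced so it cannot break the delimiting.
function buildQueueMessage(item, owner, ownerName) {
    return [item.ID, item.ServerFileName, item.StorageAccountName,
            owner, ownerName.replace(/,/g, '.'), item.BlobURL].join(',');
}

function parseQueueMessage(message) {
    var parts = message.split(',');
    return {
        id: parts[0], serverFileName: parts[1], storageAccountName: parts[2],
        owner: parts[3], ownerName: parts[4], blobUrl: parts[5]
    };
}
```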

Update Your Code

I know quite a few developers have been taking advantage of ADAL.js for some time now.  It bears explaining that this scenario didn’t work until very recently.  A recent change in ADAL.js added a check to see whether a call targets the app’s backend or an external service.  Prior to this fix, a call to Azure Storage from the Angular client would have failed, because ADAL.js would have attached the OAuth token to the request and Azure Storage doesn’t understand OAuth tokens.  If you have existing ADAL.js investments and want to call additional services, you must update your ADAL.js reference to at least version 1.0.7 for this scenario to work.
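Conceptually, the check works like this: ADAL.js keeps a map of endpoint URL prefixes to resource identifiers and only attaches a token when a request matches that map or the app's own backend.  A simplified sketch of the idea (not the library's actual implementation):

```javascript
// Simplified sketch of the endpoint check: decide which resource (if
// any) a token should be attached for.  Requests to unmapped hosts,
// such as *.blob.core.windows.net, get no Authorization header.
function getResourceForEndpoint(url, endpoints, appBaseUrl, clientId) {
    for (var prefix in endpoints) {
        if (url.indexOf(prefix) === 0) {
            return endpoints[prefix];
        }
    }
    // Calls to our own backend use the app's client ID as the resource
    if (url.indexOf(appBaseUrl) === 0 || url.indexOf('/') === 0) {
        return clientId;
    }
    return null; // no token attached
}
```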

Download the Code

You can obtain the complete solution that includes the code accompanying this post at https://github.com/kaevans/globalscaledemo

For More Information

https://github.com/kaevans/globalscaledemo – The code accompanying this post

Cloud Design Patterns: Prescriptive Architecture Guidance for Cloud Applications

Federated Identity Pattern

Valet Key Pattern

Shared Access Signatures: Understanding the SAS Model

Integrate Azure AD into an AngularJS single page app

ADAL.js

Angular Azure Blob Upload