Migrating Windows Service to Azure Worker Role: Image Conversion Example using Storage






In my work with Symon Communications, we had to move pieces of their solution from their existing Windows Services implementations to something that would work well in the cloud.  These services ran in the background to collect, transform, and prepare data, which seemed like a natural fit for a worker role.  As a simple scenario to prove out the idea, we chose to read images from one container, convert them, and save them to another storage container.  This is similar to the Thumbnails example in the Azure SDK, but in our case we wanted to simplify, and we felt the use of the queue was overkill for what we needed to accomplish.


The setup for this is to add a worker role to your cloud solution, create source and target containers in Azure Storage, and finally seed the source container with the files to be converted — PNG files in our case.  This can all be done against development storage and the development fabric, and it works the same once deployed.  I’ll be using “pictures” and “converted” as the names of the two containers.  Thus, on development storage they’ll actually be referenced as devstoreaccount1/pictures and devstoreaccount1/converted.  Let’s get started on the code by adding a new class file to the worker role project.  I named the class ImageConverter.  Keeping this as simple as possible for the purpose of demonstrating the worker role in place of a service, I use System.Drawing.Image’s built-in capabilities to do the transformation for me.  The code for the class is as follows:


using System;
using System.Drawing;          // for the Image class
using System.Drawing.Imaging;  // for the ImageFormat class
using System.IO;               // for the MemoryStream class

namespace ConversionWorker
{
    public class ImageConverter
    {
        public byte[] ConvertImage(byte[] InBytes)
        {
            // load the source bytes into a stream
            using (MemoryStream InStream = new MemoryStream(InBytes))
            using (MemoryStream OutStream = new MemoryStream())
            // read the input stream into an Image
            using (Image imgInFile = Image.FromStream(InStream))
            {
                // write the image out to the stream as the new image type
                imgInFile.Save(OutStream, ImageFormat.Jpeg);

                // return the converted bytes from the stream
                return OutStream.ToArray();
            }
        }
    }
}
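Before wiring this into the role, the class can be exercised on its own against a local file.  This is just an illustrative sketch — the file paths are placeholders, not part of the project:

            // hypothetical local test of the converter; paths are placeholders
            byte[] pngBytes = File.ReadAllBytes(@"C:\temp\sample.png");
            ImageConverter converter = new ImageConverter();
            byte[] jpgBytes = converter.ConvertImage(pngBytes);
            File.WriteAllBytes(@"C:\temp\sample.jpg", jpgBytes);

If the output file opens as a JPG in a viewer, the conversion piece is working and everything that remains is storage plumbing.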


 


As you can see, the above is a simple, straightforward, no-fuss implementation that does specifically one thing.  The next part is to modify the WorkerRole.cs file.  Open up the code file and let’s add in a little code to access Azure Storage and call our new class.  At the top of the file, be sure that these three using statements are present:


using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;


 


I can’t remember which ones are there by default and which one(s) I added.  To keep the example easy to follow, I added a single method to the class that does all of the work; I named it ConvertFiles().  At the top of the function I simply set up the containers for access and open the connection as follows:


        private void ConvertFiles()
        {
            // connect to the development storage fabric
            // (this is the well-known development storage account name and key)
            CloudStorageAccount StorageAccount = new CloudStorageAccount(
                new StorageCredentialsAccountAndKey("devstoreaccount1", "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="),
                new Uri(@"http://127.0.0.1:10000/"),
                new Uri(@"http://127.0.0.1:10001/"),
                new Uri(@"http://127.0.0.1:10002/"));

            CloudBlobClient BlobClient = StorageAccount.CreateCloudBlobClient();

            // get references to the source and target containers
            CloudBlobContainer BlobContainer = new CloudBlobContainer(StorageAccount.BlobEndpoint.ToString() + "devstoreaccount1/pictures", BlobClient);
            CloudBlobContainer ConvertedContainer = new CloudBlobContainer(StorageAccount.BlobEndpoint.ToString() + "devstoreaccount1/converted", BlobClient);

            BlobRequestOptions options = new BlobRequestOptions();
            options.AccessCondition = AccessCondition.None;
            options.BlobListingDetails = BlobListingDetails.All;
            options.UseFlatBlobListing = true;
            options.Timeout = new TimeSpan(0, 1, 0);

            // get the list of blobs in each container
            System.Collections.Generic.IEnumerable<IListBlobItem> SourceBlobs = BlobContainer.ListBlobs(options);
            System.Collections.Generic.IEnumerable<IListBlobItem> TargetBlobs = ConvertedContainer.ListBlobs(options);
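As an aside, if you only ever target the development fabric, the StorageClient library also exposes a shorthand for the account constructed above, so you don’t have to spell out the key and endpoints yourself.  A minimal equivalent sketch:

            // equivalent shorthand for the development storage account;
            // the name, key, and local endpoints are supplied for you
            CloudStorageAccount StorageAccount = CloudStorageAccount.DevelopmentStorageAccount;

I used the explicit form above because it makes it obvious what has to change when you point the role at a real storage account.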


 


In the next part of the code I do some name matching, looking for the root of the file name and deciding whether it already exists in the target list.  Since I compare only the root of the name, and not the path or extension, I need to bracket both ends of it; otherwise I might get false positives.  For example, if the target container held the file MyImage.jpg and the source file I was checking was named Image.png, then looking only for the existence of “Image” in [path]/MyImage.jpg would produce a match.  As such, I assume that there are NOT multiple periods (“.”) in the file name, and I look for the root name bracketed by the “/” and the “.”.  So, I will be searching for “/Image.”, which would not match, as the closest candidate would be “/MyImage.”.  First, I set up a loop to go through the source files and extract the file name from each source URI.


            foreach (IListBlobItem item in SourceBlobs)
            {
                // get the file name from the blob URI
                string[] SourceFileUriArray = item.Uri.ToString().Split(new char[] { '/' });
                string SourceFileName = SourceFileUriArray[SourceFileUriArray.Length - 1];

                // WARNING: this only works if the file name is [name].[ext];
                // if there are multiple '.' characters in the file name it will fail
                SourceFileName = SourceFileName.Split(new char[] { '.' })[0];

                // check to see if it is in the destination container
                IListBlobItem foundItem = null;


 


This is the point where the storage API gets a little hokey:


                try
                {
                    // This is a little hokey, but is simple and sufficient for the sample.
                    // Without the "/" and "." there is no bound for the file name, so
                    // ../[path]/FancyIcon.jpg and ../[path]/Fancy.jpg would both match
                    // when looking for "Fancy"; so change to search for "/Fancy."
                    string NameToCompare = "/" + SourceFileName + ".";

                    // note the catch block: if NO matching item is found, First() throws an
                    // InvalidOperationException; we catch it, ensure the value of the item
                    // is null, and use that to indicate the need to convert
                    foundItem = TargetBlobs.First(listitem => listitem.Uri.ToString().IndexOf(NameToCompare) > 0);
                }
                catch (InvalidOperationException InvalidOpEx)
                {
                    Console.WriteLine("Element not found in target returning: " + InvalidOpEx.Message);
                    foundItem = null;  // just making sure, in case the behavior were to return default(IListBlobItem)
                }


 


Note the line right above the catch statement; that is where the First<> method is called to find the element.  If no element is found, an InvalidOperationException is thrown, and we must catch it and continue on with the knowledge that we may convert the file since it doesn’t exist.  I’m not sure why it doesn’t just return null, but I can only work with what I’m given. :)
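If the exception-as-flow-control bothers you, LINQ’s FirstOrDefault() avoids it: when nothing matches, it returns the default for the element type — null for a reference type like IListBlobItem — instead of throwing.  A sketch of the equivalent call, assuming the same NameToCompare and TargetBlobs as above:

                // returns null instead of throwing when no match is found,
                // so no try/catch is needed
                foundItem = TargetBlobs.FirstOrDefault(listitem => listitem.Uri.ToString().IndexOf(NameToCompare) > 0);

I kept First() in the walkthrough because the exception behavior is worth knowing about, but FirstOrDefault() is the tidier choice here.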


The last bit of work is to call the previously created convert function and then write the result to the target location.


                // if it is not found, foundItem will be null and we know we have to
                // convert the file and write it to the target storage
                if (foundItem == null)
                {
                    // note: these images are block blobs, not page blobs, so get a
                    // general blob reference (with the client's credentials) rather
                    // than constructing a CloudPageBlob from the bare URI
                    CloudBlob SourceIconBlob = BlobClient.GetBlobReference(item.Uri.ToString());
                    ImageConverter Converter = new ImageConverter();

                    byte[] SourceBytes = SourceIconBlob.DownloadByteArray();

                    // call the conversion function
                    byte[] ConvertedBytes = Converter.ConvertImage(SourceBytes);

                    // write the image to the destination container
                    CloudBlob destBlob = ConvertedContainer.GetBlobReference(SourceFileName + ".jpg");
                    destBlob.UploadByteArray(ConvertedBytes);
                }


 


Finally, to get this thing running we need to make a couple more changes to what is generated from the project template.  First, we need to ensure that the work routine gets called in the timer loop set up in the Run() method of the worker role.


    public class WorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            // This is a sample worker implementation. Replace with your logic.
            Trace.WriteLine("ConversionWorker entry point called", "Information");

            while (true)
            {
                Thread.Sleep(10000);
                Trace.WriteLine("Working", "Information");

                // call the function that identifies the files to be converted
                // and then converts them
                ConvertFiles();
            }
        }


 


That is it; that’s all you need to have a worker role spin up, convert images from one Azure Storage container, and write them as JPGs to the target container.  Obviously, a bit of work could be done to make it convert between whatever file types you choose and, more importantly, you could use the ThreadPool, or one of the other threading mechanisms, to get some parallelism in the execution of the routine.  However, for this simple sample, we are done.
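As a sketch of that parallelism idea, each source blob’s conversion could be queued to the ThreadPool inside the loop in ConvertFiles().  This is an illustrative outline only, not tested code — note the local copy of the loop variable, which avoids the classic closure-over-loop-variable problem when the lambda runs later on a pool thread:

            foreach (IListBlobItem item in SourceBlobs)
            {
                IListBlobItem currentItem = item;  // capture a local copy for the closure
                ThreadPool.QueueUserWorkItem(state =>
                {
                    // download, convert, and upload currentItem here,
                    // as in the body of ConvertFiles() above
                });
            }

You would also want to track the queued work items (for example with a counter and a wait handle) so a pass of ConvertFiles() doesn’t return while conversions are still in flight.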

Comments (3)

  1. Matt says:

    What about video conversion?  Currently, 3rd party applications are needed to get this done.  Any thoughts on a fully-integrated Azure solution?

  2. jofultz says:

    Sorry about the delay in response, I found tonight that I had several waiting, but no email informing me.

    I haven't been looking at video for a while.  I was interested for a bit and heard a rumor about a future Silverlight video service or something to that effect, in which case the service would likely take care of the up- and down-conversion; I'm not sure about format conversion, though.  In the months that have passed since you posted this, have you found a 3rd party to use for video conversion?  I would love to know.
