AzureCopy to the Rescue for an S3 to Azure Blob Copy!

This week I helped a client move files from AWS S3 to Azure Storage blobs. Sounds simple, right? Here's the tricky part: while there are both Azure and AWS cmdlets for PowerShell, they don't cooperate. Neither set has a cmdlet that accepts the other's credentials, and neither accepts arbitrary URLs from outside its own cloud. AzCopy doesn't accept S3 URLs either, and none of the S3 tools seem to recognize Azure. So what's a girl to do?

The Search and the Discovery

After hours of trying to get creative with PowerShell and AzCopy, I resorted to Bing searches. When what to my wondering eyes should appear, but a miniature sleigh... uh, a full-fledged, well-written tool to move data between Azure and S3. But there's more! This tool, known as Rudolph... I mean AzureCopy, can move data between Azure, S3, OneDrive, SharePoint Online, Dropbox, and local file systems! Ken Faulkner has written a wonderful, holly jolly tool! After a few hiccups while I learned how to use the tool and how S3 URLs are (and at first mostly are not) formed, I quickly had all my data moved from S3 to Azure! Simple. Easy. It flew like the down of a thistle (whatever that means). So, what was required after installing the tool?

Open a Command Prompt and go to the directory where you installed AzureCopy. Instead of using a config file, I set the values at the command line (substitute your own real values for the directory and after each equals sign):

cd C:\installs\azurecopy\
set AzureAccountKey=MyAzureStorageAccountKey
set AWSAccessKeyID=MyS3AccessId
set AWSSecretAccessKeyID=MyS3SecretKey
set AWSRegion=us-west-2
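
If you'd rather go the config-file route I skipped, here's a rough sketch of what I believe the equivalent azurecopy.exe.config section looks like. I didn't test this myself, and I'm assuming the appSettings keys match the environment variable names above:

<configuration>
  <appSettings>
    <!-- Assumption: keys match the environment variable names used above -->
    <add key="AzureAccountKey" value="MyAzureStorageAccountKey" />
    <add key="AWSAccessKeyID" value="MyS3AccessId" />
    <add key="AWSSecretAccessKeyID" value="MyS3SecretKey" />
    <add key="AWSRegion" value="us-west-2" />
  </appSettings>
</configuration>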

Then I got a listing of my files on S3. This took longer than it should have because I had trouble getting the S3 URL correct; that was a problem with my newness to S3, not a problem with the tool. If your bucket is in the default region you use mybucket.s3.amazonaws.com. Otherwise you use mybucket.s3-region.amazonaws.com, with your bucket's region substituted for region. See Amazon's docs on S3 buckets for more details on the URL.
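
For example, here's the same bucket in each form:

https://mybucket.s3.amazonaws.com/ (bucket in the default region)
https://mybucket.s3-us-west-2.amazonaws.com/ (bucket in us-west-2)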

Also, I didn't need to pass all the keys on both commands; it was just easier to write and copy the code that way while I was getting everything working.

azurecopy -list https://mybucket.s3-us-west-2.amazonaws.com/ -azurekey %AzureAccountKey% -s3k %AWSAccessKeyID% -s3sk %AWSSecretAccessKeyID%

Next I listed the files in Azure. At this point the container was empty, but the command at least verified that my access worked. I uploaded a small test file, verified I could see it with AzureCopy, and then deleted the test file.

azurecopy -list https://mystorage.blob.core.windows.net/mycontainer -azurekey %AzureAccountKey% -s3k %AWSAccessKeyID% -s3sk %AWSSecretAccessKeyID%
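
If you want to script that test-file round trip, here's a minimal sketch using the Azure PowerShell cmdlets I mentioned earlier. This assumes the classic Azure module, and the account, container, and file names are placeholders:

# Build a storage context from the same account key used above
$ctx = New-AzureStorageContext -StorageAccountName "mystorage" -StorageAccountKey "MyAzureStorageAccountKey"
# Upload a small test file (the blob name defaults to the file name),
# then re-run the azurecopy -list command above to verify it appears
Set-AzureStorageBlobContent -File "C:\temp\test.txt" -Container "mycontainer" -Context $ctx
# Clean up the test file once you've confirmed access works
Remove-AzureStorageBlob -Blob "test.txt" -Container "mycontainer" -Context $ctx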

And now on to the secret sauce: the actual, magical file copy. (As I understand it, -blobcopy asks Azure to copy the data server-side rather than routing it through your machine, and -destblobtype block writes the destination as block blobs.)

azurecopy -i https://mybucket.s3-us-west-2.amazonaws.com/ -o https://mystorage.blob.core.windows.net/mycontainer -azurekey %AzureAccountKey% -s3k %AWSAccessKeyID% -s3sk %AWSSecretAccessKeyID% -blobcopy -destblobtype block

Success!

And just like that, within a couple of minutes, the azurecopy list command showed all the files in Azure! I double-checked with my Azure and AWS PowerShell cmdlets that yes, this was really true! This tool saved me SO MUCH TIME! And now you know: the built-in tools from the major cloud vendors lock you into their own clouds, but with AzureCopy you too can free your data!
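
For the curious, the double-check was roughly this: a sketch assuming the classic Azure and AWS PowerShell modules, with the same placeholder names as above.

# List the blobs that landed in the Azure container
$ctx = New-AzureStorageContext -StorageAccountName "mystorage" -StorageAccountKey "MyAzureStorageAccountKey"
Get-AzureStorageBlob -Container "mycontainer" -Context $ctx | Select-Object Name, Length
# Compare against the source objects still sitting in S3
Get-S3Object -BucketName "mybucket" -Region us-west-2 -AccessKey "MyS3AccessId" -SecretKey "MyS3SecretKey" | Select-Object Key, Size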
