Load Data from the Azure DataMarket to Hadoop on Azure
Small Bites of Big Data
Cindy Gross, SQLCAT PM
UPDATED Jun 2013: HadoopOnAzure CTP has been replaced by HDInsight Preview which has a different interface and different functionality. Getting Started with Windows Azure HDInsight Service http://www.windowsazure.com/en-us/manage/services/hdinsight/get-started-hdinsight/
The first step in analyzing data is acquiring that data. You may have data available from logs, databases, or other internal sources, or you may need to import data from an external source. Hadoop on Azure makes it easy to load data from data sources such as the Azure DataMarket, Amazon S3, and Windows Azure Blob Storage (ASV).
Preparing Hadoop On Azure for a data load
- Create a Hadoop cluster on http://HadoopOnAzure.com.
- Make sure you have available disk space on the Hadoop cluster equal to at least three times the size of the data you plan to import. This is because by default Hadoop has a replication factor of three, meaning it keeps three copies of each slice of data.
- Create an account on the Windows Azure Marketplace https://datamarket.azure.com/
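The sizing rule above can be sketched as simple arithmetic. A minimal Python sketch, assuming the default replication factor of three (adjust if your cluster is configured differently):

```python
# Rough estimate of the HDFS space an import will consume.
# Assumes Hadoop's default replication factor of 3, per the text above.

def required_hdfs_bytes(raw_bytes, replication_factor=3):
    """Minimum free cluster space needed to hold raw_bytes of new data."""
    return raw_bytes * replication_factor

# Example: a 2 GB data set needs at least 6 GB of free space.
raw = 2 * 1024**3
print(required_hdfs_bytes(raw) / 1024**3, "GB")
```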
Find your data source
- Find data on the Marketplace that meets your needs. For example, I looked for free demographic data.
- Some of the data marked as free has limits on how much is available without payment. I subscribed to the “Gender Info 2007 – United Nations Statistics Division” because it was free, had a significant amount of data, had unlimited transactions/month, and is easily understandable.
- You can click through on the data link for more information on the source, columns, and other details.
- When you click on the “use” button you get a screen that lets you query the data. You will use information from this screen to load the data into Hadoop.
- Leave this browser window open and stay logged in to the DataMarket.
Load the Data
- Log in to your cluster on HadoopOnAzure.com and click on the Manage Cluster tile.
- Click on the DataMarket tile.
- Go to the DataMarket page for your data.
- For the gender data, you need to change the “Query” value under “BUILD QUERY” from “DataSeries” to “Values”.
- Next to “Primary Account Key” click on “Show”. This will expose a key value. Cut/paste that value into the “passkey” field of “Import from Data Market” on HadoopOnAzure.
- Copy the “URL for current expressed query” into the “Query” field on HadoopOnAzure, then erase the trailing $top=100 parameter, which limits the results to the first 100 rows. For the gender data the query is https://api.datamarket.azure.com/Data.ashx/UnitedNations/GenderInfo2007/Values?$top=100; change it to https://api.datamarket.azure.com/Data.ashx/UnitedNations/GenderInfo2007/Values in the query field.
- In HadoopOnAzure enter your DataMarket user name in the “User name” field and a unique new table name in the “Hive table name” field. If you use the name of a table that already exists in this Hadoop cluster you will get an error.
- Click on the “Import Data” button to start the data load.
- You can now leave the screen and let the data load complete in the background.
- You can check on the status of the data load by clicking on the “Job History” tile.
- You can click through on the job to get more details.
- Once the data load is complete, you can query the table from the Interactive Console.
- Enter a HiveQL query in the white box at the bottom and then hit “Evaluate”. The HiveQL or HQL language is based on standard SQL and is closer to the MySQL implementation than to the SQL Server TSQL implementation. Most of the syntax is the same as TSQL but there are some differences. For example, execute: select * from GenderInfoValuesAll where year = 2000 order by countryname limit 200;. This looks like TSQL except that you use limit instead of top.
Now your data is loaded and you’ve done a basic test to verify the data looks correct. You’re ready to start using Hive on Hadoop!
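To try the limit-instead-of-top pattern without a Hive cluster, you can use any SQL dialect that shares HiveQL's LIMIT syntax. Here is a small Python sketch using the built-in sqlite3 module as a stand-in (the table and rows are made up for the example; the real query runs against your imported Hive table):

```python
import sqlite3

# In-memory stand-in for the imported Hive table. SQLite, like HiveQL
# and MySQL, uses LIMIT n rather than TSQL's SELECT TOP n.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE genderinfo (countryname TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO genderinfo VALUES (?, ?)",
    [("Zimbabwe", 2000), ("Albania", 2000), ("Brazil", 2000),
     ("Canada", 1999)],
)

# Equivalent in spirit to the HiveQL query in the article:
#   select * from GenderInfoValuesAll where year = 2000
#   order by countryname limit 200;
rows = conn.execute(
    "SELECT countryname FROM genderinfo "
    "WHERE year = 2000 ORDER BY countryname LIMIT 2"
).fetchall()
print(rows)  # [('Albania',), ('Brazil',)]
```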
Other Microsoft Hadoop data loading links:
- How To: Upload Data and Use the WordCount Sample with Hadoop Services for Windows Azure (video) http://social.technet.microsoft.com/wiki/contents/articles/6985.how-to-upload-data-and-use-the-wordcount-sample-with-hadoop-services-for-windows-azure-video.aspx
- How to SFTP Data to Hadoop-based services on Windows Azure http://social.technet.microsoft.com/wiki/contents/articles/6779.how-to-sftp-data-to-hadoop-based-services-on-windows-azure.aspx
I hope you’ve enjoyed this small bite of big data! Look for more blog posts soon on the samples and other activities.
Note: the CTP and TAP programs are available for a limited time. Details of the usage and the availability of the CTP may change rapidly. Screenshots are from the May 2012 version of http://HadoopOnAzure.com.