Mahout with HDInsight

My name is Sudhir and I work with the Microsoft HDInsight support team. The other day my colleague Dan and I were discussing Mahout, and I started thinking about how it can be used with HDInsight.

[* Note: If you are using HDInsight 3.1, the Mahout package is already installed on the cluster, so you can skip parts of this blog post such as uploading the Mahout jar file. Please have a look here to find out how to run a Mahout job on HDInsight 3.1.]

I investigated further to see how Mahout can be used with HDInsight, and the information seems worth sharing. First I tried from the master (head) node over RDP, and then with PowerShell. Running Mahout from the head node is not a recommended approach, because if the cluster gets reimaged, all changes to the configuration are lost. So I skipped that approach and focused on PowerShell.

Before I start, I want to mention that Mahout is not supported by Microsoft.

In case you want to read more about Mahout, click here. I'll be using the RecommenderJob class for this example. More information about the class can be found here.

Here are the step-by-step instructions to use Mahout on HDInsight through PowerShell.

Copy the following sample files to a folder on the local machine: an input file that contains userid, itemid and value, and a users file that contains userid.
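For illustration, here is a hypothetical sample of what the two files could contain. The input file holds one userid,itemid,value triple per line, and the users file lists the userids to generate recommendations for, one per line:

```
input file:
1,101,5.0
1,102,3.0
2,101,2.0
2,103,5.0
3,102,4.5

users file:
1
3
```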

The next step is to upload the sample files above. Open a PowerShell window and use the script below to upload each file.

$subscriptionName = "<subscription name>"

$storageAccountName = "<storage account>"

$containerName = "<container>"

$fileName ="<Location\FileName>"

# Uploading file under the folder mahout

$blobName = "<mahout/FileName>"  

# Get the storage account key

Select-AzureSubscription $subscriptionName

$storageaccountkey = get-azurestoragekey $storageAccountName | %{$_.Primary}

 # Create the storage context object

$destContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageaccountkey

 # Copy the file from local workstation to the Blob container       

Set-AzureStorageBlobContent -File $filename -Container $containerName -Blob $blobName -context $destContext

Copy the script below (or get it from here).

# Cluster Name
$clusterName = "<cluster name>"

# Subscription name
$subscriptionName = "<subscription name>"

$containerName = "<containerName>"

$storageAccountName = "<StorageAccountName>"

Select-AzureSubscription -SubscriptionName $subscriptionName

# Assuming mahout-core-0.8-job.jar has been copied to the mahout folder.
$mahoutJob = New-AzureHDInsightMapReduceJobDefinition -JarFile "wasb://$containerName@$storageAccountName.blob.core.windows.net/mahout/mahout-core-0.8-job.jar" -ClassName "org.apache.mahout.cf.taste.hadoop.item.RecommenderJob"

# Adding the similarityclassname argument
$mahoutJob.Arguments.Add("-s")

# Adding the name of the similarityclassname. Other similarityclassname values can be used as well.
$mahoutJob.Arguments.Add("SIMILARITY_COOCCURRENCE")

# Adding the input file argument
$mahoutJob.Arguments.Add("--input")

# Adding the location of the input file. The file is stored on Windows Azure Storage Blob.
$mahoutJob.Arguments.Add("wasb://$containerName@$storageAccountName.blob.core.windows.net/mahout/<input file name>")

# Adding usersFile as an argument
$mahoutJob.Arguments.Add("--usersFile")

# Adding the usersFile location
$mahoutJob.Arguments.Add("wasb://$containerName@$storageAccountName.blob.core.windows.net/mahout/<users file name>")

# Adding output as an argument
$mahoutJob.Arguments.Add("--output")

# Adding the output location. This is where the result will be generated.
$mahoutJob.Arguments.Add("wasb://$containerName@$storageAccountName.blob.core.windows.net/mahout/output")


 # Starting job

 $MahoutJobProcessing = Start-AzureHDInsightJob -Cluster $clusterName  -JobDefinition $mahoutJob

# Waiting for the job to complete

Wait-AzureHDInsightJob  -Job $MahoutJobProcessing -WaitTimeoutInSeconds 3600

# Getting errors, if any

 Get-AzureHDInsightJobOutput -Cluster $clusterName -JobId $MahoutJobProcessing.JobId -StandardError


Run the script above and wait for the job to complete.

Once the job is done, the result can be found in the output directory.

The job outputs userIDs with their associated recommended itemIDs and scores.
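For reference, each line of the RecommenderJob output is a userID followed by a bracketed, comma-separated list of itemID:score pairs. With hypothetical IDs, a line looks something like this:

```
3	[104:3.5,102:3.3]
```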

Clean-up process:

RDP into the head node and run hadoop fs -rmr -skipTrash /user/hdp/temp to delete the temp folder, or use any tool you are familiar with to delete it.
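Alternatively, since the cluster's default file system is backed by blob storage, the temp folder can be removed from PowerShell without RDP. This is a sketch, assuming the same $containerName and $destContext used in the upload script, and that the temp files sit under the user/hdp/temp prefix in the default container (verify the prefix before deleting):

```powershell
# List every blob under the temp prefix and remove it
Get-AzureStorageBlob -Container $containerName -Context $destContext -Prefix "user/hdp/temp" |
    Remove-AzureStorageBlob
```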



A few things to keep in mind:

  • Make sure you understand the algorithm you are going to use.
  • Check what input the algorithm expects.
  • You may need to prepare your data to match the input required by the algorithm.
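As an example of the last point, if your ratings export has a header row and extra columns, a short PowerShell snippet can reshape it into the header-less userid,itemid,value triples that RecommenderJob expects. This is a hypothetical sketch; the file and column names (ratings.csv, User, Item, Rating) are assumptions:

```powershell
# Keep only the three columns RecommenderJob needs, dropping the CSV header
Import-Csv "ratings.csv" |
    ForEach-Object { "$($_.User),$($_.Item),$($_.Rating)" } |
    Set-Content "input.txt"
```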


Thanks to Bill and Sunil for reviewing this blog post.


Happy Learning!

Sudhir Rawat


Comments (2)

  1. Rob Deary says:

    I was able to get mahout running by using the approach in this video:

    They remoted into the head node, installed mahout, and ran it from the node.  I like your approach better.

    Is that all you really need to do?  Upload the mahout jar to the cluster and execute it?  It all just magically works?

    Is there any reason you used version 0.8 of mahout, instead of 0.9?  

  2. sudhirblog says:

    Hi Rob,

Thanks for the note. When I started working on this, mahout-core-0.8-job was the available version, so I used it.

I was busy and didn't notice the 0.9 release. Your note brought it to my attention; I tried mahout-core-0.9-job with the same scenario described above, and everything worked as expected.

    I am planning to explore more and will share my findings.

    Thanks and Regards,

