How to export your Azure DevTest Labs usage data


Acknowledgement: This post was contributed by Roger Best, Senior Software Engineer on the Customer Success Team. He has been working on solutions to help our customers adopt Azure DevTest Labs.

DevTest Labs recently released a feature that allows lab administrators to programmatically export usage data from their lab(s) into a secondary Azure Storage account. That data can then be manipulated and visualized with other tools, such as SQL Server, Data Lake, or Power BI. If you want to report your labs' usage externally, for example to a management team that doesn't use your Azure subscription, this is a very neat feature for you. The charts below show examples of what I was able to produce by querying the exported data:

Figure 1: Sample usage chart - average VM lifespan

Figure 2: Sample usage chart - VMs per month

In this post, I will explain how this is done so that you can start visualizing your own labs' usage data with this new capability.

The data is exported as two CSV files: disks.csv, which contains information about the disks used by the different VMs, and virtualmachines.csv, which contains information about the virtual machines in the lab. Before getting into the details of the data, let's take a look at a sample PowerShell script that exports it.

Param (
    [Parameter (Mandatory=$true, HelpMessage="The storage account name where to store usage data")]
    [string] $storageAccountName,
    [Parameter (Mandatory=$true, HelpMessage="The storage account key")]
    [string] $storageKey,
    [Parameter (Mandatory=$true, HelpMessage="The DevTest Lab name to get usage data from")]
    [string] $labName,
    [Parameter (Mandatory=$true, HelpMessage="The DevTest Lab subscription")]
    [string] $labSubscription
)

#Login
Login-AzureRmAccount

# Set the subscription for the lab
Get-AzureRmSubscription -SubscriptionId $labSubscription  | Select-AzureRmSubscription

# DevTest Labs creates this container in the storage account when the action is invoked; the container name cannot be changed currently
$containerName = "labresourceusage"

# Get the storage context
$Ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey
$SasToken = New-AzureStorageAccountSASToken -Service Blob, File -ResourceType Container, Service, Object -Permission rwdlacup -Protocol HttpsOnly -Context $Ctx

# Generate the storage blob uri
$blobUri = $Ctx.BlobEndPoint + $SasToken

# blobStorageAbsoluteSasUri and usageStartDate are required
$actionParameters = @{
'blobStorageAbsoluteSasUri' = $blobUri
}

# Pull the last seven days of usage data (granularity is one day; any time portion is ignored)
$startdate = (Get-Date).AddDays(-7)
$actionParameters.Add('usageStartDate', $startdate.Date.ToString())

# Get the lab resource group
$resourceGroupName = (Find-AzureRmResource -ResourceType 'Microsoft.DevTestLab/labs' | Where-Object { $_.Name -eq $labName}).ResourceGroupName

# Create the lab resource id
$resourceId = "/subscriptions/" + $labSubscription + "/resourceGroups/" + $resourceGroupName + "/providers/Microsoft.DevTestLab/labs/" + $labName + "/"

# This is the new resource action that exports the usage data
$result = Invoke-AzureRmResourceAction -Action 'exportResourceUsage' -ResourceId $resourceId -Parameters $actionParameters -Force

# Finish up cleanly
if ($result.Status -eq "Succeeded") {
    Write-Output "Telemetry successfully downloaded for $labName"
    return 0
}
else {
    Write-Output "Failed to download usage data for lab: $labName"
    Write-Error $result.ToString()
    return -1
}

The key components in the sample above are the resource action call:

Invoke-AzureRmResourceAction -Action 'exportResourceUsage' -ResourceId $resourceId -Parameters $actionParameters -Force

And the two action parameters:

  • blobStorageAbsoluteSasUri
    • The storage account URI with a SAS token. In the PowerShell script above it is generated from the storage key, but a pre-issued SAS URI could be passed in instead (see the sketch after this list).
  • usageStartDate
    • The start date for the data to pull; the end date is the date on which the action is executed. The granularity is at the day level, so any time-of-day information you add will be ignored.
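
As a minimal sketch of that variation: assume $preIssuedSasUri is a hypothetical, pre-generated account SAS URI for the destination blob storage (with write access), and $resourceId is the lab resource id built as in the script above. The action can then be invoked without ever touching the storage key:

# Hypothetical pre-issued SAS URI for the destination blob storage (must allow writes)
$preIssuedSasUri = "https://mystorageaccount.blob.core.windows.net/?sv=..."

# Both required action parameters; the time portion of the date is ignored
$actionParameters = @{
    'blobStorageAbsoluteSasUri' = $preIssuedSasUri
    'usageStartDate'            = (Get-Date).AddDays(-30).Date.ToString()
}

# Same export action as in the full script above
Invoke-AzureRmResourceAction -Action 'exportResourceUsage' -ResourceId $resourceId -Parameters $actionParameters -Force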

Exported data - a closer look

Now let's take a closer look at the exported data. As I mentioned earlier, once the data is successfully exported there will be two CSV files. The virtualmachines.csv file contains the following data columns (a short query sketch follows the list):

  • SubscriptionId
    • The subscription identifier that the DevTest lab exists in.
  • LabUId
    • Unique DevTest lab GUID identifier.
  • LabName
    • DevTest Lab Name.
  • LabResourceId
    • Fully qualified Lab Resource.
  • ResourceGroupName
    • Resource group name that the Virtual machine exists in.
  • ResourceId
    • Fully qualified virtual machine resource.
  • ResourceUId
    • Unique virtual machine GUID identifier.
  • Name
    • Virtual machine name.
  • CreatedTime
    • Date time VM was created.
  • DeletedDate
    • Date time VM was deleted.
    • If empty, the VM has not been deleted yet.
  • ResourceOwner
    • Owner of the VM.
    • If empty, the VM is either claimable or was created by a service principal.
  • PricingTier
    • Virtual machine pricing tier.
  • ResourceStatus
    • Availability state.
    • Active if the VM still exists, Inactive if it has been deleted.
  • ComputeResourceId
    • Fully qualified virtual machine compute resource identifier.
  • Claimable
    • True when the machine is claimable.
  • EnvironmentId
    • The environment resource identifier in which the virtual machine was created.
    • Empty when the virtual machine was not created as part of an environment resource.
  • ExpirationDate
    • Expiration date for the VM.
    • Empty if an expiration date has not been set.
  • GalleryImageReferenceVersion
    • VM base image information version.
  • GalleryImageReferenceOffer
    • VM base image information reference offer.
  • GalleryImageReferencePublisher
    • VM base image information publisher.
  • GalleryImageReferenceSku
    • VM base image information Sku.
  • GalleryImageReferenceOsType
    • VM base image information OS type.
  • CustomImageId
    • Fully qualified VM base custom image.
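
As a quick example of working with these columns, here is a minimal sketch that computes the average VM lifespan (the kind of number behind Figure 1). It assumes virtualmachines.csv has already been downloaded from the labresourceusage container to a hypothetical local folder, C:\LabUsage:

# Load the exported VM data (hypothetical local path)
$vms = Import-Csv -Path "C:\LabUsage\virtualmachines.csv"

# Lifespan in days per VM: deleted VMs use DeletedDate, VMs that still exist
# (empty DeletedDate) are measured up to the current date
$lifespans = $vms | ForEach-Object {
    $created = [datetime]$_.CreatedTime
    $deleted = if ([string]::IsNullOrEmpty($_.DeletedDate)) { Get-Date } else { [datetime]$_.DeletedDate }
    ($deleted - $created).TotalDays
}

# Average VM lifespan across the lab
($lifespans | Measure-Object -Average).Average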

The data columns contained in disks.csv are listed below (another small query sketch follows the list):

  • SubscriptionId
    • The subscription identifier that the DevTest lab exists in.
  • LabUId
    • Unique DevTest lab GUID identifier.
  • LabName
    • DevTest Lab name.
  • LabResourceId
    • Fully qualified Lab Resource.
  • ResourceGroupName
    • Resource group name that the DevTest lab exists in.
  • ResourceId
    • Fully qualified virtual machine resource.
  • ResourceUId
    • Unique virtual machine GUID identifier.
  • Name
    • The name of the attached disk.
  • CreatedTime
    • The date and time on which the data disk was created.
  • DeletedDate
    • Date and time on which the data disk was deleted.
  • ResourceStatus
    • Resource status.
    • Active if the resource exists, Inactive when deleted.
  • DiskBlobName
    • Blob name for the data disk.
  • DiskSizeGB
    • The size of the data disk, in GB.
  • DiskType
    • Type of disk.
    • 0 for Standard, 1 for Premium.
  • LeasedByVmId
    • The resource identifier of the virtual machine to which the data disk has been attached.
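
For instance, here is a small sketch that totals the attached data-disk size per VM using the DiskSizeGB and LeasedByVmId columns. It again assumes the file has been downloaded to the hypothetical C:\LabUsage folder:

# Load the exported disk data (hypothetical local path)
$disks = Import-Csv -Path "C:\LabUsage\disks.csv"

# Total data-disk size (GB) per VM, keyed on the LeasedByVmId column
$disks |
    Where-Object { -not [string]::IsNullOrEmpty($_.LeasedByVmId) } |
    Group-Object -Property LeasedByVmId |
    ForEach-Object {
        [pscustomobject]@{
            LeasedByVmId = $_.Name
            TotalDiskGB  = ($_.Group | ForEach-Object { [double]$_.DiskSizeGB } | Measure-Object -Sum).Sum
        }
    }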

Additional information

If you are dealing with multiple labs and want to aggregate information across them, the two key columns are LabUId and ResourceUId; these identifiers are unique across subscriptions and can be used as join keys, as in the sketch below.
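
As a minimal sketch of that idea, assuming the virtualmachines.csv exports from several labs have been downloaded under one hypothetical folder (C:\LabUsage), the snippet below merges them and counts the distinct VMs per lab by keying on ResourceUId and LabUId:

# Merge the virtualmachines.csv exports from several labs (hypothetical folder)
$allVms = Get-ChildItem -Path "C:\LabUsage" -Filter "virtualmachines*.csv" -Recurse |
    ForEach-Object { Import-Csv -Path $_.FullName }

# Count distinct VMs per lab: de-duplicate on ResourceUId, then group on LabUId
$allVms |
    Sort-Object -Property ResourceUId -Unique |
    Group-Object -Property LabUId |
    Select-Object @{ Name = 'LabUId'; Expression = { $_.Name } },
                  @{ Name = 'VmCount'; Expression = { $_.Count } }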

I'm going to follow up with a few more posts that go beyond this and get into the details of moving this data into either a SQL database or Data Lake storage, and of visualizing the information with Power BI.

 

Hope this helps.

Roger Best

 

Roger Best, Senior Software Engineer

Roger is part of the Visual Studio and .NET engineering team focused on Visual Studio and Azure customers.  He has been at Microsoft for 19 years, focusing on developer technologies for the past decade or so.  In his spare time, he watches too many movies, and tries to survive triathlons.

 

 

