SQL Azure Blob Auditing Basic Power BI Dashboard


We have created a simple Power BI template for viewing Blob Auditing data.

You can download the template here.

This template is based on the query:

SELECT * FROM sys.fn_get_audit_file('https://<storageaccountname>.blob.core.windows.net/sqldbauditlogs/<servername>', default, default);

This query lets you process your blob auditing files directly from T-SQL.
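
Because sys.fn_get_audit_file takes a file pattern as its first argument, you can also point it at a narrower part of the container to cut down how much it scans. The following is a minimal sketch, assuming the audit files are organized into per-database subfolders (the <databasename> segment is a placeholder; check your storage container for the actual layout):

-- Read only the .xel files under a single database's folder,
-- rather than scanning the whole server folder.
SELECT event_time, server_principal_name, action_id, succeeded, statement
FROM sys.fn_get_audit_file(
    'https://<storageaccountname>.blob.core.windows.net/sqldbauditlogs/<servername>/<databasename>/',
    default, default);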

To use the Power BI report, first download Power BI Desktop: https://powerbi.microsoft.com/en-us/desktop/

Once installed, open the template file from above. You will first be prompted to fill in the parameters: enter your SQL Azure server name (without .database.windows.net) and your auditing storage account name.

Next, you will need to grant permission to run the query against your database; click Run to continue.

Finally, enter your database admin credentials (Azure Active Directory authentication is not currently supported). Make sure to select Database on the left-hand side before entering your credentials and clicking Connect.

Now you can use the report through Power BI Desktop. If you would like to use it via the Power BI service and enable automatic refresh, there are a few more steps. First, click the Publish button under the Home tab and save the report; the name you use is the name under which it will be published to Power BI.

Select the workspace you would like to publish to and click Select.

You can now click the link provided to open the report online.

To enable scheduled refreshing of the data, go to Datasets and click the Scheduled Refresh button for the dataset.

You will see an error that your credentials are invalid; click the Edit credentials link.

Change the authentication method to Basic, enter your credentials, and then click the Sign in button.

Expand the Scheduled refresh section, toggle Keep your data up to date to On, click the Add another time link and select a time for the refresh, and finally click the Apply button.

The report is now set up to refresh its data automatically.

Comments (5)

  1. Hanno says:

    What can I do if the connection keeps timing out? I’m not able to get past a screen that says “Refresh (title) All audit data – Evaluating …”.
    The amount of audit data is huge (about 1TB per day; we have a retention policy of 14 days), so without some tooling it’s impossible to do anything meaningful with the audit data.

    1. Hanno says:

      My mistake: it’s 1TB of data for the main database in total, not per day.

    2. The primary way to resolve this would be to filter based on event_time so that you query a smaller window of time, which should allow you to load the data (see the example query at the end of this thread).

      1. Darren says:

        Hi Cameron, filtering after the fact won’t stop the 1TB of log downloads though, right? Is there not a way to alter the query to make it more selective? Specifically, when you’re dealing with elastic pool databases, not being able to do this is a problem.

        1. You are correct; currently, the only way to filter it specifically would be to point to individual files or to push down further into the directory structure.
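
           For example, a rough sketch (the <auditname> segment, the date folder, and the timestamps below are placeholders; match them to your container’s actual layout):

           -- Point at a single day's folder so only those files are read;
           -- the WHERE clause then trims rows after the files are loaded.
           SELECT event_time, server_principal_name, statement
           FROM sys.fn_get_audit_file(
               'https://<storageaccountname>.blob.core.windows.net/sqldbauditlogs/<servername>/<databasename>/<auditname>/2017-06-01/',
               default, default)
           WHERE event_time >= '2017-06-01 00:00'
             AND event_time < '2017-06-01 06:00';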
