I recently received the same request from two customers, so I thought it would make a good topic to cover here so others can take advantage of it as well. The request was as follows: the customers wanted a way to track active usage of the views in their system to find out which ones actually get used. With this information they can deactivate unused views and consolidate the list of views for each entity down to only the ones their users need.
To accomplish this goal, I'm going to use an asynchronous Service Bus plugin step registered on the Retrieve message of the SavedQuery entity. This fires every time a view definition is retrieved, which should only happen when a user clicks a view in a view picker or through Advanced Find. Because a view definition that has already been retrieved is cached locally, we'll essentially be tracking "cold loads" of views: the first time each view is retrieved in a browser session, per user.
This article has a very similar alternative that I created for customers who prefer Log Analytics to Application Insights. The alternative uses a Logic App in Azure to grab the message from the Service Bus queue and push the data to Log Analytics.
The goal is to identify the views with the most traffic/requests, so that unused views can be deleted and heavily used ones can be optimized.
- Register a service endpoint step on the Retrieve message of the SavedQuery entity in CRM. This will asynchronously post the execution context containing the view data to a Service Bus queue/topic, where it can be retrieved by a Logic App.
- The Logic App will parse the relevant data (entity name, view name) out of the execution context and pass it to an Azure Function, which will insert it into an Application Insights instance where it is logged and can be reported on.
- Prerequisite: a Service Bus queue created in an Azure subscription. You'll need its connection string when registering the service endpoint below.
- Create Service Bus Queue or Topic
Register Service Endpoint in the CRM Plugin Registration Tool
- Register->New Service Endpoint
- Paste in a Connection string retrieved from the Azure Portal
- On the next screen, change the Message Format from .NET Binary to JSON, and enter the queue or topic name
- Click OK
Attach a message processing step to the new service endpoint in the Plugin Registration Tool
- Register->New Step
- In Message, enter Retrieve
- In Primary Entity, enter savedquery
- Change Execution Mode to Asynchronous
- Click Register
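Once the step is registered, each view retrieval posts the plugin execution context to the queue as JSON. A trimmed illustration of what the Logic App will find there (the field names come from the RemoteExecutionContext contract; the real payload contains many more properties, and the GUIDs here are placeholders):

```json
{
  "CorrelationId": "00000000-0000-0000-0000-000000000001",
  "UserId": "00000000-0000-0000-0000-000000000002",
  "MessageName": "Retrieve",
  "PrimaryEntityName": "savedquery",
  "OutputParameters": [
    {
      "key": "BusinessEntity",
      "value": {
        "Attributes": [
          { "key": "name", "value": "Active Accounts" },
          { "key": "returnedtypecode", "value": "account" }
        ]
      }
    }
  ]
}
```

The Logic App's job is just to pull the interesting pieces out of this structure and forward them to the function.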
Create an Azure Function App to help translate the JSON from the plugin
- In the Azure Portal, click New->Serverless Function App
- Give the App a unique name, Resource Group, Storage Account
- Click Create
- Click the +/Add button, add a new HTTPTrigger function
Use this code for your function:

```csharp
#r "Newtonsoft.Json"

using System.Net;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ApplicationInsights;
using Newtonsoft.Json;

private static TelemetryClient telemetry = new TelemetryClient();

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    dynamic data = await req.Content.ReadAsAsync<object>();

    // Log as much additional information from CRM as we can for auditing.
    // CorrelationId ties directly back to the plugin execution and is also
    // useful for Microsoft support to have. UserId lets you tie a view
    // retrieve back to a user, in case you want to find out why they use that
    // particular view. A static Operation Name string lets you quickly filter
    // down to this type of operation if your Application Insights instance is
    // heavily used.
    telemetry.Context.Operation.Id = data.CorrelationId.ToString();
    telemetry.Context.User.Id = data.UserId.ToString();
    telemetry.Context.Operation.Name = "View Accessed";

    // Target arrives as an array of key/value pairs; flatten it into a
    // dictionary so individual attributes are easy to read.
    string target = data.Target.ToString();
    List<KeyValuePair<string, object>> entList =
        JsonConvert.DeserializeObject<List<KeyValuePair<string, object>>>(target);
    Dictionary<string, object> entDict = entList.ToDictionary(k => k.Key, v => v.Value);
    string newJson = JsonConvert.SerializeObject(entDict);

    // Track one event per retrieve, named "<entity> - <view name>".
    telemetry.TrackEvent(entDict["returnedtypecode"].ToString() + " - " + entDict["name"].ToString());

    return req.CreateResponse(HttpStatusCode.OK, newJson);
}
```
- Create a new file in your project by expanding "View files" on the right, clicking Add, and naming the file project.json
Open project.json and add this code:
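The listing for project.json is missing here; a minimal version for the v1 Functions runtime would look like the following (the package version is illustrative, so use whatever current stable version restores for you):

```json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.ApplicationInsights": "2.4.0"
      }
    }
  }
}
```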
- The above code will tell the Azure function to download a nuget package for Application Insights.
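Before testing from CRM, you can smoke-test the function directly from the portal's Test pane (or curl) by POSTing a body shaped like what the Logic App will send. This shape is an assumption based on the function code above: CorrelationId, UserId, and a Target array of key/value attribute pairs, with placeholder GUIDs:

```json
{
  "CorrelationId": "00000000-0000-0000-0000-000000000001",
  "UserId": "00000000-0000-0000-0000-000000000002",
  "Target": [
    { "key": "returnedtypecode", "value": "account" },
    { "key": "name", "value": "Active Accounts" }
  ]
}
```

A 200 response containing the flattened Target JSON means the function parsed the payload and sent the event to Application Insights.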
Now we can test the functionality. To start, log in to CRM, navigate to an entity, and change the view
- You can monitor the console in your Function App to see if any errors occur
Start reviewing results in Application Insights
- In the Azure portal, find Azure Functions and choose the Function App you created for this exercise.
- Click Application Insights
- From here you can click Analytics (small button in the ribbon), then click the + new tab button
- IntelliSense is very good, so as you type you can press Tab to complete your entries
- Here is a sample query to display the top views, in order, in a bar graph format:

```
customEvents
| where timestamp >= ago(30d)
| project name
| summarize count() by name
| order by count_ desc nulls last
| where count_ > 2
| render barchart
```
- The first line is the "table" name, if you were comparing this query to a SQL query
- The following lines each begin with a pipe (|) operator, after which further query keywords are specified. "where" works just like SQL's, specifying a filter clause
- "| project col1, col2, col3" specifies the columns to retrieve, like "select" in SQL. Omitting the project line retrieves all columns
- Comment lines out with // to try omitting various lines
- Functions help with dynamic time operations, like ago(30d), which only looks back over the last 30 days of logs; you can also use "m" for minutes, "h" for hours, and "d" for days
- "| where count_ > 2" filters out the views that were only retrieved once or twice
- "| summarize" is the equivalent of the group by operator. In summarize you can use aggregates like count(), max(), and avg(), followed by the keyword "by", which specifies the columns to group on
- "| render barchart" makes the output graphical; omitting it produces a table
- Here is a sample output:
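Once you're comfortable with the basic query, it's easy to slice the same events other ways. For example, a sketch of a follow-up query that trends total view retrievals per day over the last 30 days (again assuming the events land in customEvents):

```
customEvents
| where timestamp >= ago(30d)
| summarize count() by bin(timestamp, 1d)
| render timechart
```

Swapping bin(timestamp, 1d) for 1h gives an hourly trend, which can help spot whether view "cold loads" cluster at the start of the workday.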