Monitoring the NuGet feed using Microsoft Flow


As the author of a few packages on nuget.org, I often find myself needing to know if/when any of my packages' dependencies are updated. NuGet doesn't (readily) have an RSS feed - and even if it did, it would alert me to ALL NuGet package changes - so I started thinking about a way to do this during a recent hackathon with a partner who expressed a similar need.

Could we find a way to monitor the nuget feed for updates to a certain package and then send an alert somewhere when one is encountered?

Let's find out.

Step 1 - decipher the NuGet API

Using Fiddler while running Visual Studio and searching for packages in the NuGet Package Manager, I found that the API call used by the package manager hits this endpoint:
https://api-v2v3search-0.nuget.org/query?q=<search criteria>
and if you specify that you want to include pre-release packages (we would in our case), the endpoint changes to:
https://api-v2v3search-0.nuget.org/query?q=<search criteria>&prerelease=true
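If you want to experiment with this endpoint outside Visual Studio, a quick script makes it easy. Here's a minimal Python sketch; the helper names are mine, not part of any NuGet client library:

```python
import json
import urllib.parse
import urllib.request

def build_search_url(package_id, prerelease=True):
    # The same search endpoint the NuGet Package Manager uses (observed via Fiddler).
    base = "https://api-v2v3search-0.nuget.org/query"
    params = {"q": package_id}
    if prerelease:
        params["prerelease"] = "true"
    return base + "?" + urllib.parse.urlencode(params)

def search_packages(package_id):
    # Issue the query and return the parsed JSON response.
    with urllib.request.urlopen(build_search_url(package_id)) as response:
        return json.load(response)

url = build_search_url("Microsoft.Bot.Builder")
# url -> https://api-v2v3search-0.nuget.org/query?q=Microsoft.Bot.Builder&prerelease=true
```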

The tricky part was getting the search to return only one result when given a specific package ID. The NuGet Package Manager Console will do this if you specify -ExactMatch when using the Find-Package cmdlet. Running that with Fiddler active, though, showed that it must be client code narrowing the results down to the exact match, not any part of the API query. So we'll have to do similar work on our end to make this happen.
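To see what that client-side narrowing looks like, here's a Python sketch of the filtering logic (exact_match is my own helper name, and the case-insensitive comparison is my assumption about how -ExactMatch behaves, since NuGet package IDs are case-insensitive):

```python
def exact_match(search_response, package_id):
    # Mimic -ExactMatch: scan the 'data' array of the search response and
    # keep only the entry whose 'id' equals the package ID we asked for.
    # NuGet IDs are case-insensitive, so compare lowercased.
    for package in search_response.get("data", []):
        if package.get("id", "").lower() == package_id.lower():
            return package
    return None

response = {"data": [
    {"id": "Microsoft.Bot.Builder", "version": "3.8.5"},
    {"id": "Microsoft.Bot.Builder.Azure", "version": "3.1.0"},
]}
print(exact_match(response, "microsoft.bot.builder"))  # the 3.8.5 entry
print(exact_match(response, "Microsoft.Bot"))          # None
```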

Step 2 - poll the NuGet API with our package's ID as the search criteria

But how should we poll it? Shall we write code? I decided to see what I could accomplish with my old friend, Microsoft Flow, and, as it turns out, you can set up Flows to recur at any interval down to 1 minute; perfect. Let's check it out.

Start a blank Flow template

Set up polling interval

In this first step, we'll specify how often to poll the NuGet API. Search for 'recurrence' and choose Schedule - Recurrence:

Give it a reasonable time, say, 6 hours

Specify the endpoint to poll

Flow has a number of cool features when it comes to interacting with HTTP endpoints, not the least of which is the ability to simply hit any HTTP endpoint with a verb and get the response back. This is exactly what we want to do for our NuGet polling, so click 'New step', choose 'Add an action', and search for HTTP

choose HTTP - HTTP

and fill it out based on what we learned about NuGet's web API:

Parse the results of the NuGet query

Now's where we get into the really powerful part of Microsoft Flow. Since it's built on top of Logic Apps, there are some very code-like things we can do. One of these is parsing JSON into an object you can use throughout subsequent steps. Let's set our flow up to do this so we can iterate through the results of our query.
Again, click 'New step' and choose 'Add an action'. This time, type JSON in the search box and choose Data Operations - Parse JSON

For the content of this action, we give it the Body of the result that came back from our HTTP call by choosing it from the dynamic content window:

To get dynamic content options for the response later on, you have to give it a schema.

The super cool thing is that Flow lets you give it a sample response and it'll infer & create the schema for you. Since we've already hit the NuGet API outside Flow to test things out, we happen to have just such a sample response. Click the 'Use sample payload to generate schema' link, copy in a response from the NuGet API, and click 'Done'. You'll see the schema populated based on your sample.
For ease of following along, here's a sample JSON you can use for this step, based on our Bot Builder NuGet package:

{
    "@context": {
        "@vocab": "http://schema.nuget.org/schema#",
        "@base": "https://api.nuget.org/v3/registration2/"
    },
    "totalHits": 19,
    "lastReopen": "2017-07-13T03:36:24.4160460Z",
    "index": "v3-lucene2-v2v3-20170524",
    "data": [
        {
            "@id": "https://api.nuget.org/v3/registration2/microsoft.bot.builder/index.json",
            "@type": "Package",
            "registration": "https://api.nuget.org/v3/registration2/microsoft.bot.builder/index.json",
            "id": "Microsoft.Bot.Builder",
            "version": "3.8.5",
            "description": "there will be a description here",
            "summary": "there will be a summary here",
            "title": "Microsoft.Bot.Builder",
            "iconUrl": "http://docs.botframework.com/images/bot_icon.png",
            "projectUrl": "https://github.com/Microsoft/BotBuilder",
            "tags": [],
            "authors": [
                "Microsoft"
            ],
            "totalDownloads": 156038,
            "versions": [
                {
                    "version": "1.0.0",
                    "downloads": 949,
                    "@id": "https://api.nuget.org/v3/registration2/microsoft.bot.builder/1.0.0.json"
                },
                {
                    "version": "1.0.1",
                    "downloads": 1295,
                    "@id": "https://api.nuget.org/v3/registration2/microsoft.bot.builder/1.0.1.json"
                }
            ]
        }
    ]
}

Now we'll have the ability to target any property of the response from Nuget in our later steps.

Target only the package we're after

As you've likely seen if you ran a couple of the API queries on your own, even if you type in a full package ID you still get back results that aren't that exact package. So we need to act only on that package (much like -ExactMatch does for the Package Manager Console, as you'll recall). To do this, we'll use a conditional block in Flow, but let's watch what happens when we try to target the id property of one of the packages in the data array of the response.
Below our new Parse JSON action, add a new conditional step:

In the 'Choose a value' box, select the dynamic content object id (corresponding to the id property of the JSON parsed in our Parse JSON step):

Watch how Flow recognizes you chose a property that's on an object within an array of the response and immediately places an Apply to each | data block around your conditional!

Now, pick your jaw up off the floor & finish filling out the condition:

Compare the package's value with the last one we saw

Here's where things get tricky. Flow executions don't maintain state between runs. So how can we know whether the package version we just pulled is the same as the one we saw last time? Enter Azure Storage. We'll use Blob Storage to store exactly this information and use it within our flow.

Set up some Azure blob storage

// todo: insert setup here

Read a blob from storage as part of our flow

After we put in the Condition, we now have two branches under it. IF YES and IF NO. Since we don't care about packages whose id doesn't exactly match ours, the remainder of the work will be done in the IF YES area.
Click the Add an action button and type 'blob'. Select Get blob content from the resulting list

When you first add this action, you'll fill it out so that it sets up a new connection to Azure Blob Storage:

then fill it out so that it pulls from a blob whose name makes sense. I named mine <package id>.lastversion:

Parse the contents in to something actionable

Once we have the blob content, we need to read its value. Things get a little strange here: Flow requests the value from blob storage, and it comes back as a base64-encoded JSON payload that looks something like this:

{
  "$content-type": "application/octet-stream",
  "$content": "My44LjU="
}

So we need to base64 decode the $content property of this payload. Again, since Flow is built on Logic Apps, we have the power.
Add a new Compose action to our flow like so:

and configure it with the value "@base64toString(body('Get_blob_content').$content)". Be sure to include the double quotes in this value. Note that if you changed the name of the 'Get blob content' action above, you'll need to make sure the name matches in the value for the Compose action.
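Outside Flow, the same decoding is a one-liner. Here's a quick Python illustration using the sample payload shown above:

```python
import base64

# The payload 'Get blob content' hands back (sample from above).
blob_body = {
    "$content-type": "application/octet-stream",
    "$content": "My44LjU=",
}

# base64ToString in the Flow expression does exactly this:
last_version = base64.b64decode(blob_body["$content"]).decode("utf-8")
print(last_version)  # -> 3.8.5
```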

Do the comparison

Next, we compare this value to the current version that came back from the NuGet feed by adding a condition to the flow:

where Output is the output of the Compose action and version is the version property of the data item we're processing in the Apply to each iterator.
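Putting the two conditions together, the logic inside the Apply to each loop amounts to something like this (a Python sketch with hypothetical names, not anything Flow generates):

```python
def versions_to_notify(parsed_response, package_id, last_seen_version):
    # Mirror the flow: iterate the 'data' array ('Apply to each'),
    # keep only the exact package ID (first condition), then flag any
    # version that differs from the stored value (second condition).
    updates = []
    for package in parsed_response.get("data", []):
        if package.get("id") != package_id:
            continue
        if package.get("version") != last_seen_version:
            updates.append(package["version"])
    return updates

response = {"data": [{"id": "Microsoft.Bot.Builder", "version": "3.8.5"}]}
print(versions_to_notify(response, "Microsoft.Bot.Builder", "3.8.4"))  # ['3.8.5']
print(versions_to_notify(response, "Microsoft.Bot.Builder", "3.8.5"))  # []
```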

Send notification of an update

Now all that's left is to send a notification and update the blob (so we don't get another notification the next time the check runs) when this condition is true (i.e. we have an updated NuGet package). Both of these steps go in the IF YES branch under our new version-comparison condition.
Sending a notification is pretty simple. Click 'Add an action', search for 'notification', and choose 'Send me a mobile notification' (or you could choose the e-mail one if you like). This will send a push notification to your mobile device via the Flow app (so you must have it installed and be logged in first).

Configure the notification however you like. Remember you can include values from the package information in your notification!

Next, let's update the blob in blob storage with the new version number so we don't get alerted again for the same version. Add another action, search for 'blob', and choose 'Update blob':

Configure it to update the blob you read earlier with the version value from the package:

Make it robust

First-run considerations

On the first run of your flow, the blob won't exist in storage, so you'll get an error back from Azure Blob Storage saying it doesn't exist. Of course, Flow has you covered.
Click the ... in the top right of the Compose action you created to parse the output from the blob. Choose 'Configure run after', check the box 'is successful', and click 'Done'.


Now our parsing of blob content will only run if we're able to get a blob from storage. Perfect. But what about the other case? To handle it, we add a "parallel branch" under our Get blob content task like this:

For that action, we'll just create the blob in blob storage with the value of the version that came back from NuGet:

and, of course, set it to run only if the fetch from blob storage fails:
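Pulling the whole flow together, including the first-run fallback, the control flow looks roughly like this. All the callables here are hypothetical stand-ins for the Flow connectors, not real APIs:

```python
def run_check(fetch_feed_version, read_blob, write_blob, notify):
    # fetch_feed_version: query the NuGet search API and extract the version
    # read_blob / write_blob: the 'Get blob content' / 'Update blob' (or
    #   'Create blob') actions; read_blob raises if the blob doesn't exist
    # notify: 'Send me a mobile notification'
    current = fetch_feed_version()
    try:
        last_seen = read_blob()
    except FileNotFoundError:
        # First run: the blob doesn't exist yet, so create it and stay quiet.
        write_blob(current)
        return False
    if current != last_seen:
        notify(current)
        write_blob(current)
        return True
    return False

# In-memory stand-in for blob storage, to show the behavior:
store = {}
def read_blob():
    if "v" not in store:
        raise FileNotFoundError
    return store["v"]

sent = []
ran = run_check(lambda: "3.8.5", read_blob, lambda v: store.update(v=v), sent.append)
print(ran, store, sent)   # first run: False, blob created, no notification
ran = run_check(lambda: "3.8.6", read_blob, lambda v: store.update(v=v), sent.append)
print(ran, store, sent)   # update detected: True, blob updated, one notification
```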

What have we learned

Microsoft Flow provides you with a ton of great building blocks for things you might want to automate quickly & easily with no need for code, servers, cloud computing, or even a credit card!
The next time you're thinking about automating a particular set of actions, check to see if Flow's serverless offering fits your bill; you might be pleasantly surprised.

Up next

Could we do it all in a simple Azure Function? Stay tuned!
