IOT for mere mortals, Part II

Part I of this article describes IOT in general and the Microsoft PaaS offering around it in broader terms.
Take a look if You feel the need for some theory: Part I

Sorry it took me so long to get back to this, but time flies when you're having fun ...
Anyways, we are here now.

On the menu today :

  • Simulating events with a nice GUI and generating credible traffic
  • Designing your data so that it plays nicely with the storage infrastructure
  • Storing data in several places and forms in one go
  • Retrieving data from a relational database (Azure SQL) to a web app and graphing it
  • Retrieving data from Azure Table Storage to the web app

Setup

The basic structure of an IOT solution in "Azure PaaS" land usually consists of the following:

  1. Something that knows how to send events using either AMQP or REST over HTTPS
    1. Our simulator uses AMQP and threading for efficiency
  2. Provisioned PaaS infrastructure to catch the traffic
    1. An Event Hub service in our Azure subscription
  3. Optional real-time analytics and processing
    1. Averaging the events in 30-second windows
  4. Permanent storage in one or more places
    1. Storing averaged events to Azure SQL, and all raw (unaveraged) events to Azure Blob Storage and Azure Table Storage
  5. Some sort of client to see the data
    1. Our web app, which does rudimentary visualization of the collected data

We are intentionally not doing any higher-level analytics or Machine Learning here. We are leaving that for a possible sequel: Part III.

Sending events in - the simulator

 

I programmed the simulator as a WebApi project in Visual Studio: a bunch of REST services with an HTML5/jQuery interface on top.
You might prefer some other technologies for the client, but I do mine like this since I can whip them up pretty fast and they behave quite OK (and I know HTML and jQuery very well).

Basically the whole program is a no-brainer, but I'll go over some of the most interesting bits here.
First of all we need some structure that we are sending back and forth; here's mine:

using System;
using System.Runtime.Serialization;
using Microsoft.WindowsAzure.Storage.Table;

// One class carries every event type in the demo. It derives from TableEntity
// so the same class can later be reused when querying Table Storage.
public class MeasurementEvent : TableEntity
{
    [DataMember]
    public long deviceid { get; set; }

    [DataMember]
    public DateTime arrivaltime { get; set; }

    [DataMember]
    public long temperature { get; set; }

    [DataMember]
    public long pressure { get; set; }

    [DataMember]
    public long vibration { get; set; }

    [DataMember]
    public long value { get; set; }

    [DataMember]
    public string type { get; set; }

    [DataMember]
    public long serial { get; set; }

    public MeasurementEvent()
    {
        // Running serial number handed out by the application.
        this.serial = AppModel.getSerial();
    }
}

As You can see, I'm using the same structure to carry all the event types in my demo. In real life You might be using a few separate types instead.
The other notable thing is that the class descends from TableEntity. It doesn't have to, but I'm doing it because I'm later reusing this same class when querying Table Storage for events, and LINQ wants it like this.

Here is my sender function, which is called once a second when the simulator is active:


(sorry for including this as an image, but the editing system screws up my VS-pasted code blocks pretty badly; I'll put the code in the resources at the tail of this article)
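Since the image doesn't reproduce well here, below is a minimal sketch of what a sender along those lines can look like, using the EventHubClient from the Service Bus SDK (which talks AMQP). The connection string, hub name and class name are placeholders of mine; the actual sender is in the resources.

using System;
using System.Text;
using Microsoft.ServiceBus.Messaging;
using Newtonsoft.Json;

public static class EventSender
{
    // Placeholder connection string and Event Hub name - use Your own values here.
    private static readonly EventHubClient client =
        EventHubClient.CreateFromConnectionString(
            "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=send;SharedAccessKey=<key>",
            "<eventhubname>");

    public static void Send(MeasurementEvent ev)
    {
        // Serialize the event to JSON and push it to the Event Hub over AMQP.
        string payload = JsonConvert.SerializeObject(ev);
        client.Send(new EventData(Encoding.UTF8.GetBytes(payload)));
    }
}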

The SensorData is just a structure to convey the HTML fields from the screen to the backend, so nothing special there. The randomizer is a simple block generating randomly drifting values: the dice generates (-1, 0, 1) and the values drift accordingly.
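The randomizer itself isn't shown here either; a sketch of that kind of drifting-value generator could look like the following (the class and member names are mine, not from the original code).

using System;

public class DriftingValue
{
    private static readonly Random dice = new Random();
    private double current;
    private readonly double step;

    public DriftingValue(double start, double step)
    {
        this.current = start;
        this.step = step;
    }

    public double Next()
    {
        // Roll the dice: -1, 0 or +1 (the upper bound of Random.Next is exclusive),
        // then let the value drift one step in that direction.
        int direction = dice.Next(-1, 2);
        current += direction * step;
        return current;
    }
}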

The timing is implemented on the client using good old window.setInterval, since it doesn't have to be rock solid. Add some basic field checks and the simulator is pretty much a done deal.

Catching the events

For instructions on how to set up the infrastructure for Azure Stream Analytics, please see Part I of this article. I'll only show the most interesting bits here.

Stream Analytics dashboard:


Averaging query:
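(The query was a screenshot in the original. Roughly, a 30-second tumbling-window average looks like this in the Stream Analytics query language; the input/output aliases and the exact column list are my assumptions, not necessarily what runs in the demo.)

SELECT
    deviceid,
    type,
    AVG([value]) AS [value],
    System.Timestamp AS arrivaltime
INTO sqloutput
FROM eventhubinput TIMESTAMP BY arrivaltime
GROUP BY deviceid, type, TumblingWindow(second, 30)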

Basic query for storing into Blob Storage:
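(Again a screenshot originally. The raw pass-through jobs for Blob Storage and Table Storage are essentially just a select-into; the aliases below are my assumptions.)

SELECT *
INTO blobstorageoutput
FROM eventhubinput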

So at runtime we have three active jobs consuming the same input. Pretty cool; it just works that way.
One thing worth noting in the Table Storage output definition is that the fields "Partition Key" and "Row Key" refer to column names in Your input, not values.

Here's a piece of data from Blob Storage:

After adding a terminating "]" this is basically a valid JSON file that can be used by tools like HDInsight (Hadoop) or Azure Machine Learning (we might look into that in Part III).
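Just to make that concrete, here's one way to read such a blob back into the same MeasurementEvent class. This is my own sketch, not part of the demo, and the connection string, container and blob names are placeholders.

using System.Collections.Generic;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json;

public static class BlobReader
{
    public static List<MeasurementEvent> ReadEvents()
    {
        // Placeholder connection string, container and blob name.
        var account = CloudStorageAccount.Parse("<storage connection string>");
        var container = account.CreateCloudBlobClient().GetContainerReference("<container>");
        var blob = container.GetBlockBlobReference("<blobname>.json");

        // The array output is missing its closing "]" while the job is running,
        // so append it before deserializing into the same MeasurementEvent class.
        string json = blob.DownloadText() + "]";
        return JsonConvert.DeserializeObject<List<MeasurementEvent>>(json);
    }
}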

Table Storage looks like this:

This is a more traditional view of the same data.
Actually this is a newer picture, and there are a couple of fields worth mentioning. I added a column called avain (key in Finnish), which is a descending number, because there is no "order by" when selecting rows from Table Storage; it just gives You rows starting from the first one. Making the key descending turns the default sorting (PartitionKey, RowKey) upside down, so when I take only 40 lines I get the latest ones.
The other thing to note is that the PartitionKey is used as the data type label, very much like a table name would be in the relational world.
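The article doesn't show how avain is generated, but a common trick for a key like this is to subtract the current tick count from DateTime.MaxValue.Ticks when the event is created; the formula below is my assumption, not necessarily the one used in the demo.

// A descending key: newer events get a smaller number, so the default
// PartitionKey/RowKey ordering effectively returns the latest rows first.
long avain = DateTime.MaxValue.Ticks - DateTime.UtcNow.Ticks;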

Here's looking at You, kid

I wanted to see how my pumps and EKG monitors are doing, so I needed to build a little dashboard to show the latest values coming from the relational db.

Remember that the db holds averaged data, so we are looking at trends here, not actual line data.

One can go totally crazy and build wild visualizations of this data using BI tools or whatever. I'm using just HTML and flot here. There's a little REST function on the backend that spits out the latest monitoring data from the db, and jQuery feeds it to the flot diagrams. Easy as pie.
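That backend function isn't shown here (the full code is in the resources), but the shape of it is roughly the following. The table and column names, the route and the little TrendPoint class are all my assumptions for the sketch.

using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Web.Http;

// Hypothetical DTO for one averaged data point.
public class TrendPoint
{
    public long deviceid { get; set; }
    public DateTime arrivaltime { get; set; }
    public double value { get; set; }
}

public class MonitoringController : ApiController
{
    // GET api/monitoring?type=pump - returns the latest averaged rows for one event type.
    public IEnumerable<TrendPoint> Get(string type)
    {
        var result = new List<TrendPoint>();
        using (var conn = new SqlConnection("<Azure SQL connection string>"))
        using (var cmd = new SqlCommand(
            "SELECT TOP 40 deviceid, arrivaltime, value FROM AveragedEvents " +
            "WHERE type = @type ORDER BY arrivaltime DESC", conn))
        {
            cmd.Parameters.AddWithValue("@type", type);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    result.Add(new TrendPoint
                    {
                        deviceid = Convert.ToInt64(reader["deviceid"]),
                        arrivaltime = Convert.ToDateTime(reader["arrivaltime"]),
                        value = Convert.ToDouble(reader["value"])
                    });
                }
            }
        }
        return result;
    }
}

On the client, jQuery just reshapes the result into the [timestamp, value] pairs that flot expects.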

Another view of the same data comes from Table Storage:

Here we are presenting the latest rows unaggregated, which is a good starting point for later analysis.

Table Storage is approached by this little code snippet:
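The snippet is again an image in the original post; a minimal version using the storage SDK's fluent TableQuery would be along these lines (the connection string and table name are placeholders of mine; the real code, which uses LINQ against the same class, is in the resources).

using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public static class TableReader
{
    public static List<MeasurementEvent> GetLatest(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var table = account.CreateCloudTableClient().GetTableReference("measurements"); // placeholder table name

        // Take the first 40 rows; thanks to the descending avain key these are the latest events.
        var query = new TableQuery<MeasurementEvent>().Take(40);
        return table.ExecuteQuery(query).ToList();
    }
}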

Note the usage of the same MeasurementEvent class we used when sending data to the Event Hub. (The full code is in the resources.)

So there You have it!

This is how You programmatically create events, send them, capture them, analyze them and finally show them in various forms.
What's missing here is the business case and the real value that comes from applying analysis to this data.
We'll probably look into those tools in the next part of this article.

Get the code resources to save Your typing.

 


