As a technology and data lover, I felt it was a bit strange that keeping up to date with the electricity, gas and water consumption in my house was an after-the-fact exercise. There is certainly a lot of talk about Smart Meters, but they haven't been rolled out here yet, and when they do arrive the question is how fine-grained the data will be for us as consumers. Hence, I started looking for a way to get more real-time insight into this.
There are several solutions on the market, and in the end I decided to buy a Flukso. Flukso is an open-source monitoring solution developed in Belgium. It comes with a base module to which the different sensors are connected. A nice bonus is that they support MQTT in addition to their REST API.
It also comes with a simple web UI, but of course I wanted to build something myself so I would have more freedom to explore the data.
Before I started building the solution I set out a few goals for myself:
- It needed to be portable enough to run on my Raspberry Pi
- Since Flukso is open source, I wanted to make my code publicly available too
- To make sure other people can benefit from it as easily as possible, I wanted to use Infrastructure as Code
Based on this I arrived at the following toolset: .NET Core 2.1, VS Code (yes, I'm a fan), Visual Studio Team Services, Docker, GitHub and Terraform.
I own a Flukso, just tell me how to deploy to my Raspberry Pi
If you don't care about all the code and just want to start capturing data, you can deploy the backend using the Terraform script (described below). Then download the docker-compose.yml and fluksocore.service files to your Raspberry Pi and run them. The readme on GitHub has more details.
The architecture is not too complex: on my Raspberry Pi I pick up the data from the Flukso over MQTT and push it to Event Hubs. From Event Hubs there are two streams. One uses Event Hubs Capture, which automatically writes the data to a storage account (in Avro format); the other uses Stream Analytics to aggregate the data per minute and push it to an Azure SQL Database for reporting with Power BI.
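Before wiring anything up, it helps to see what the Flukso actually publishes by subscribing to its local broker. The host name and topic pattern below are my understanding of the Flukso firmware, not taken from the repository, so verify them against the Flukso documentation:

```shell
# Subscribe to all sensor gauge readings on the Flukso's local MQTT broker.
# Replace flukso.local with the IP address of your device; the topic
# pattern '/sensor/+/gauge' is an assumption about the firmware.
mosquitto_sub -h flukso.local -t '/sensor/+/gauge' -v
```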
I had heard a lot of good things about Terraform for IaC, so I decided this would be a great moment to learn it. I was very pleasantly surprised by the VS Code extensions (Syntax - Azure) and the out-of-the-box integration of Terraform in the Azure Cloud Shell. The Terraform script is available on GitHub and takes care of the deployment; it also deploys an Azure Container Registry in case you want to keep your version of the container private.
It all starts with writing your Terraform file(s) (*.tf); you can define variables in a variables.tf file and set their values in a .tfvars file. The next step is to initialize the environment with "terraform init", which downloads the right provider. Then run "terraform plan -out tfplan.out" to see what changes will be made, and finally run "terraform apply tfplan.out" to do the actual deployment. One very nice extra is the "terraform destroy" command, which cleans up all the resources. Probably not the best thing to do in production, but very useful during development. Since the Azure Cloud Shell is also integrated in VS Code, I can do all of this from within my development environment.
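The full cycle, run from the folder that contains the .tf files, looks like this (just the commands from the steps above, collected as a sketch):

```shell
# Initialize the working directory and download the Azure provider
terraform init

# Preview the changes and save the plan to a file
terraform plan -out tfplan.out

# Apply exactly the plan that was reviewed
terraform apply tfplan.out

# When you are done experimenting, tear everything down again
terraform destroy
```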
To do all of this from VS Code you will need to install a few extensions; here is the list of the extensions I installed. If you are using Windows and want to use the Azure Cloud Shell from VS Code, you will also need to install Node.js 6 or later (https://nodejs.org).
Use CTRL-SHIFT-P to launch Azure: Sign In, then select the right subscription with Azure: Select Subscriptions, and finally use the Azure Terraform: Init command. Besides the Terraform files, you will also have to copy the SQL folder to the Cloud Shell, or to wherever else you want to deploy from.
After you have executed the "terraform apply" command, it should have created a resource group (the default name is RG-Flukso) with the necessary services.
Now that we have the supporting services running on Azure, we can start looking at the code. As mentioned, I used .NET Core 2.1 to keep the application portable and Docker to run it on my Raspberry Pi 3.
Since Flukso has an MQTT endpoint, I added the MQTTnet NuGet package to my project from the integrated terminal of VS Code by executing "dotnet add package MQTTnet". I also added packages for configuration, Event Hubs and Newtonsoft.Json. If you download the code from GitHub this has, of course, all been done already; you just need to run "dotnet restore" in the folder where you cloned the code.
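Adding the dependencies from the integrated terminal looks like this. Only MQTTnet is named above; the configuration and Event Hubs package names are my best guess at what the project uses, so check the .csproj in the repository for the authoritative list:

```shell
# MQTT client used to subscribe to the Flukso readings
dotnet add package MQTTnet

# JSON serialization and the Event Hubs client (assumed package names)
dotnet add package Newtonsoft.Json
dotnet add package Microsoft.Azure.EventHubs

# Configuration support (assumed package name)
dotnet add package Microsoft.Extensions.Configuration

# After cloning the repository, restoring is all that is needed
dotnet restore
```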
Next, I needed to Dockerize the application for easy deployment across different devices. Pay attention to the runtime image you select, to make sure it can run on ARM. In the backend I'm using Visual Studio Team Services to build the container image and push it to Docker Hub whenever I push updates to Git.
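A Dockerfile for an ARM32 device could look something like the sketch below. The base image tag and DLL name are assumptions on my part (for .NET Core 2.1 there were ARM32-specific runtime tags), so double-check them against the repository before building:

```shell
# Write a minimal Dockerfile targeting the ARM32 .NET Core 2.1 runtime.
# The image tag and the DLL name are assumptions, not taken from the repo.
cat > Dockerfile.arm <<'EOF'
FROM microsoft/dotnet:2.1-runtime-stretch-slim-arm32v7
WORKDIR /app
COPY ./publish .
ENTRYPOINT ["dotnet", "FluksoCore.dll"]
EOF

# On a machine with the .NET SDK, publish and build the image:
#   dotnet publish -c Release -o publish
#   docker build -f Dockerfile.arm -t <your-registry>/fluksocore .
```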
The code is available on GitHub at https://github.com/wesback/fluksocore. I still need to add my Azure Stream Analytics code; I will update the post as soon as I finish that part. You can write a pretty straightforward query to output the data to Azure SQL Database, and visualizing the data in Power BI is a logical next step. You can already find an example of how to write such a query at https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-build-an-iot-solution-using-stream-analytics.
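To give an idea of what that Stream Analytics job could look like, here is a sketch of a per-minute aggregation query, saved to a file you can paste into the portal. The input/output aliases and the payload field names (sensor, value) are assumptions about the Event Hubs messages, not the final query:

```shell
# Save a sketch of the Stream Analytics query: average each sensor's
# readings over one-minute tumbling windows and write them to SQL.
# Aliases and field names are assumptions.
cat > aggregate.asaql <<'EOF'
SELECT
    System.Timestamp AS WindowEnd,
    sensor,
    AVG(value) AS AvgValue
INTO
    SqlOutput
FROM
    EventHubInput
GROUP BY
    sensor,
    TumblingWindow(minute, 1)
EOF
```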
As always, there are still a few things I want or need to do when I find some time. Currently I'm not doing any unit testing (shame on me), so I definitely need to add that. Another improvement I would like to make is switching to IoT Edge and IoT Hub, which would allow me to do some preprocessing on the Raspberry Pi before sending the data to the cloud.
Once I have captured enough historical data, I can start using it to predict my consumption. Hopefully it can also teach me something about (bad) habits that will help me lower my energy consumption!