
Alert engine using Azure Stream Analytics and SQL Azure as Reference data

This article is an extension of some great work: link and link.

As part of any IoT solution or workflow, you would need an alerting engine. Luckily, Azure provides Azure Stream Analytics, which can be used to create an alert engine.

There are quite a few articles available on how to set up an Azure Stream Analytics job and configure its input and output. I am assuming readers know how to set the input and output in Azure Stream Analytics, so we won't go into detail on that. The area we will look into is how to set an alert when the telemetry or metrics which devices transmit are not constant and can change. Say "device1" transmits temperature and humidity while "device2" sends us details about voltage and current. We can't have a separate job for each of these devices with separate rules defined.

The best way to handle these kinds of cases in Azure Stream Analytics is to define reference data. In this article we will see how we can use SQL Azure as our reference data and create an alert engine for devices.

We will start by creating reference data in a SQL Azure table. Let's say we have a rule defined in JSON format like this:

[
  { "PropertyName": "Temperature", "threshold": 40.0, "comparisonoperator": ">" },
  { "PropertyName": "Temperature", "threshold": 20.0, "comparisonoperator": "<" }
]

Here we have defined two conditions for temperature. The threshold is set to 40 and 20 respectively for the operators ">" and "<". It is pretty clear that we want an alert to fire once the temperature crosses 40 or falls below 20.

Now let's see how we are going to achieve it.

CREATE TABLE [dbo].[RulesContainer](
	[RuleId] [bigint] IDENTITY(1,1) NOT NULL,
	[DeviceId] [nvarchar](125) NULL,
	[SensorRules] [nvarchar](1000) NOT NULL,
	[ValidFrom] [datetime2] GENERATED ALWAYS AS ROW START NOT NULL,
	[ValidTo] [datetime2] GENERATED ALWAYS AS ROW END NOT NULL,
	CONSTRAINT [PK_RulesContainer] PRIMARY KEY CLUSTERED ([RuleId] ASC),
	PERIOD FOR SYSTEM_TIME ([ValidFrom], [ValidTo])
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = [dbo].[RulesContainerHistory]))

ALTER TABLE [dbo].[RulesContainer] WITH CHECK ADD CONSTRAINT [SensorRules_JSON] CHECK ((isjson([SensorRules])>(0)))

ALTER TABLE [dbo].[RulesContainer] CHECK CONSTRAINT [SensorRules_JSON]

We first create a temporal table in the SQL Azure database. Note that the mandatory elements of every temporal table are the PERIOD definition and the SYSTEM_VERSIONING clause with a reference to another user table that will store historical row versions.

We then go to the Azure portal and create a Stream Analytics job. Under input, please add SQL Database as a reference input. While adding SQL Database as reference data, please enter the query that will be used to create the snapshot. In our case it would be:

SELECT RuleId, DeviceId, SensorRules
FROM dbo.RulesContainer

In any IoT solution the rules will need to be updated. For example, we might begin by setting a temperature threshold of 40 for some devices. Later we may add more devices in our factory or site and need to update the rules for those devices. There is a setting which refreshes the updated rules in the ASA job periodically: please switch the refresh option to "On" and enter the duration.

Now for the most important part of our solution: the query.

CREATE TABLE [sqlref] (
    RuleId bigint,
    DeviceId nvarchar(max),
    SensorRules array
);

WITH TelemetryAndRules AS
(
    SELECT
        ref.RuleId AS RuleId,
        T.DeviceId AS DeviceId,
        sensorRules.ArrayValue.PropertyName AS MeasurementName,
        sensorRules.ArrayValue.threshold AS Threshold,
        sensorRules.ArrayValue.comparisonoperator AS ComparisonOperator,
        GetRecordPropertyValue(T, sensorRules.ArrayValue.PropertyName) AS MeasurementValue,
        System.Timestamp() AS ProcessedTime
    FROM [zainhubinpout] T PARTITION BY PartitionId
    TIMESTAMP BY T.EventEnqueuedUtcTime
    JOIN [sqlref] ref ON T.DeviceId = ref.DeviceId
    CROSS APPLY GetArrayElements(ref.SensorRules) AS sensorRules
),
AggregateWindow AS
(
    SELECT
        TR.RuleId AS RuleId,
        TR.DeviceId AS DeviceId,
        TR.MeasurementName AS MeasurementName,
        TR.Threshold AS Threshold,
        TR.ComparisonOperator AS ComparisonOperator,
        AVG(TR.MeasurementValue) AS AvgMeasurementValue
    FROM TelemetryAndRules TR PARTITION BY PartitionId
    GROUP BY TR.RuleId, TR.DeviceId, TR.MeasurementName, TR.Threshold, TR.ComparisonOperator,
        PartitionId, TumblingWindow(minute, 15)
)
SELECT
    RuleId,
    DeviceId,
    MeasurementName AS PropertyName,
    Threshold,
    ComparisonOperator,
    AvgMeasurementValue AS MeasurementValue,
    System.Timestamp() AS ProcessedTime
INTO sqloutput
FROM AggregateWindow PARTITION BY PartitionId
WHERE Threshold IS NOT NULL
    AND ((ComparisonOperator = '>' AND AvgMeasurementValue > Threshold)
      OR (ComparisonOperator = '<' AND AvgMeasurementValue < Threshold)
      OR (ComparisonOperator = '=' AND AvgMeasurementValue = Threshold))

What is this query all about? First we declare a table which tells ASA the datatypes of the columns used in our reference table. One thing to note is that SensorRules is defined with the array datatype, because we use the GetArrayElements function, which returns a dataset with the array values. The usage of this function is documented here.

Once we have the values, the next part of the query deals with the windowing function. We set the tumbling window to 15 minutes; you can choose the window as per your convenience. At last we compare the value against our ruleset.

So how does the table in which we save our alerts look?

CREATE TABLE [dbo].[TelemetryAlerts](
	[AlertId] [bigint] IDENTITY(1,1) NOT NULL,
	[RuleId] [bigint] NOT NULL,
	[DeviceId] [nvarchar](125) NOT NULL,
	[PropertyName] [nvarchar](200) NOT NULL,
	[Threshold] [float] NOT NULL,
	[ComparisonOperator] [varchar](10) NOT NULL,
	[MeasurementValue] [float] NOT NULL,
	[ProcessedTime] [datetime] NOT NULL,
	CONSTRAINT [PK_TelemetryAlerts] PRIMARY KEY CLUSTERED ([AlertId] ASC)
)

A better option would be to store these alerts in Cosmos DB or Table storage, but that depends on the design of the application. We selected a SQL table for the sake of simplicity.

Near Real Time IoT data exploration using Azure Time Series Insights

Azure Time Series Insights (TSI) is a service in Azure which can be used to store, query and visualize time series data from IoT sensors. There are implementations where an Azure Stream Analytics job processes and then ports telemetry values of one or more sensors from Azure IoT Hub to a document store (such as Azure Cosmos DB) for further visualization. With Azure Time Series Insights the architectural implementation is simplified, as Azure TSI does most of this heavy lifting for us. An Azure Stream Analytics job can still be integrated with IoT Hub in case the application demands any other analysis or offline reporting. You can read more details about Azure TSI here.

"Azure TSI has both a GA and a preview version. The one we will be testing is the preview version."

This article is divided into three sections:

  • Creating resources for Azure Time Series Insights.
  • Ingest telemetry from NodeMCU to Azure IoT hub as needed by
    Azure Time Series Insights.
  • Reading telemetry from Azure Time Series Insights using C#.

Creating resources for Azure Time Series Insights

We will first start by creating an Azure IoT Hub, which will be our data source, and place it in a new resource group for better maintainability.

Once our IoT Hub is created we can proceed with creating devices in it. Let's keep things simple for testing and select the default options.

Next, we proceed with creating a consumer group. Consumer groups are used by applications to pull data from Azure IoT Hub. They enable multiple consuming applications to have a separate view of the event stream and to read the stream independently. Let's create an additional consumer group called "tsilearningcosumergroup" so that TSI can pull data using this consumer group.

Now, let's start creating the Azure TSI resources. Searching for time series in All services shows these options.

Once we have our data source set, we can proceed with creating the Azure Time Series Insights resources. The first thing we create is the Time Series Insights environment. Please select this option and provide the required parameters. For the preview model let's select the PAYG pricing plan.

One of the important fields to choose is the Time Series ID, which acts as the partition key for your data. As with any database, choosing the partition key is central to application design. I would suggest picking a field which can be used to quickly identify a record; in my case deviceid suited well.

Azure Time Series Insights needs a storage account. You will see later that it saves the data in files with the .parquet extension in the blob container created under this storage account.

Next is to create the Azure TSI event source. Technically speaking, it has more to do with mapping our environment. Here we give a name for our event source and select the source type, which is IoT Hub (this can be an Event Hub as well). Further, we specify that we are going with the existing IoT hub which was created before, select the hub name, and set the required access policy as well.

We discussed consumer groups briefly at the start of this article. While creating the event source we need to select our consumer group.

Next is to set access permissions on our Azure TSI environment. By default the creator of the environment has Reader and Contributor access. By setting data access policies we give an external application access to read data from our Azure TSI environment. We will see that in detail later while we work on reading Azure TSI data from C# code.

A quick recap of all resources

Ingest telemetry from NodeMCU to Azure IoT for Azure Time Series Insights

Azure TSI has two requirements regarding format and required fields. The data shape has to be JSON, since Time Series Insights supports only JSON data, and messages should carry the Time Series ID property specified at provisioning time, which in our case was deviceid. It is good to pass a timestamp in the message, but that isn't mandatory.

A template script which can be used with the ESP8266 is uploaded here. The only change in this script compared to the previous one is to use hardcoded values for temperature and humidity instead of reading from the sensor, since we are more interested in checking how to send data from the ESP8266 so that it can be consumed by Azure TSI. Let's go ahead and flash our MCU.

Assuming that the Arduino IDE has all the libraries needed (which are included in the header) and the correct board is selected, we will see messages being sent to Azure IoT Hub.

Please note the format of the message, which is JSON. I normally use JSONLint to validate the JSON message being sent, so that I don't spend time later troubleshooting why data is missing.

At this point we can go to the Azure portal and select the TSI environment that was created. Azure TSI comes with an explorer tool which can be used to explore our time series data.

The screenshot above shows the URL of the Time Series Insights explorer. If we browse this URL we can see the time series data. Please note our DeviceId is mapped to instances here.

So we have seen how to set up the Azure TSI environment and how we can ingest and view data in Azure TSI. Next we will see how we can expose time series data using the API which TSI provides, consume it with a C# application, and define aggregations on it.

Reading telemetry from Azure Time Series Insights using C#.

To consume telemetry data from Azure TSI we must first set up an Azure Active Directory application and assign the data access policies in the Time Series Insights environment. The steps are well documented here.

We will be designing an ASP.NET Core Web API which makes calls to Azure TSI and gets the latest data about our sensors. Let's begin by installing the "Microsoft.IdentityModel.Clients.ActiveDirectory" NuGet package in our project. We also assign a few values to the variables which hold our secrets and configuration. This is not the best way to maintain your secrets and configuration values, but for a demo it should be fine. We need the EnvironmentFQDN, which we can get for our newly created environment from the Azure portal; please look for the Data Access FQDN in the Azure TSI environment. We also need our Active Directory tenant along with the ApplicationId and ClientSecret of the application which was registered in Azure Active Directory.

The code is uploaded here. The controller class (ValuesController) does most of the work for us. Again, this doesn't follow any strict coding guidelines and practices; it just explains the basics of TSI. We pass the telemetry details, or the metrics for which we need data, in an array. In a real-world scenario device twins can be used for this purpose. Then we make a call to fetch the data with a few parameters. Please pay attention to the format of the DateTime variable and the timezone: it should be in UTC, in the format "yyyy-MM-ddTHH:mm:ssZ". The main part here is PrepareInput, the method where we prepare the request body of our call. We build a JSON object and return it as a JObject, which is passed as the request body in our POST. Yes, it's a POST to get the results.

Time Series Insights provides multiple APIs; you can refer to the documentation here. In this particular example we are using the Aggregate Series API, which enables querying data captured from events by aggregating records.

So we have seen how we can query the captured data using the programming language of our choice. These are plain HTTPS calls, so we don't have to do anything fancy here. Azure TSI provides some additional functionality via its APIs, where we can provide reference data to be used with our dataset and create some interesting data models. We will definitely cover those in future. Keep experimenting!

Direct Methods in Azure IoT Hub

Direct methods give you an option of calling methods on your device from Azure IoT Hub. This feature can be used to control your devices, from turning on an LED to switching on a fan. Of course you can work on home automation in the future as well (we will see an example of that later).

For this tutorial we will be using an ESP32 microcontroller (instead of the ESP8266 which comes with our NodeMCU board). The ESP32 is the successor of the ESP8266 and adds one more core, faster WiFi, Bluetooth, and more GPIO pins. How to configure and use this board is nicely documented (here and here). The ESP32 board which I will be using is the DOIT ESP32 DEVKIT V1.

The code which we will be using in this post lives in my github repo. This code was written for the ESP32, but with a few minor tweaks it can be tested against the ESP8266 as well.

The connection remains the same as what we did while connecting the ESP8266/NodeMCU to the DHT11 sensor. The + and - from the DHT11 sensor go to the 3V3 and GND pins on the ESP32. The data pin should go to any of your GPIO pins (pin 2 in our case). Then what changes in the code?

1) #define DHTPIN 2 (the digital pin we are connected to). Here we give an integer value instead of D1, D2 etc. as we did with the ESP8266.

2) #include <WiFi.h>. This is the WiFi library to include instead of ESP8266WiFi.h, which we were using with the ESP8266 (NodeMCU).

With these small changes the same code would work for both ESP8266 and ESP32 boards.

Let's see what our code does. If you go back to the previous post, the code mostly remains the same (connecting to Azure IoT Hub, reading values from the DHT, and then sending them to Azure IoT Hub). The additional methods introduced are deviceMethodCallback, start and stop.

deviceMethodCallback accepts a char array and checks the name of the method being called. In our case we keep it simple, without any payload, and call the start and stop methods. The start method sets a boolean value messageSending to true, and stop sets this value to false. We control sending temperature and humidity values to Azure IoT Hub using this boolean value in this if block:

if (!messagePending && messageSending)
{
    char messagePayload[Messagemaxlen];
    readMessage(messageCount, messagePayload);
    sendMessage(iotHubClientHandle, messagePayload);
}

After downloading the code, please edit these values as per your settings and then flash your ESP32 or ESP8266 board.

If everything goes well you can see temperature and humidity data being sent to Azure IoT Hub, as we saw in the previous post. Next is how we can call our direct methods from Azure IoT Hub.

For that you can go to your hub in the Azure portal, click on IoT Devices, and select your device as shown below.

Once you select the device which was registered in the portal, you can see an option called Direct Method.

Click on that option, which gives us a nice tool with which we can call methods on our devices.

In this tool you can enter start as the Method Name if you want to start sending data, and stop if you want to stop sending data to Azure IoT Hub from your device. Please do increase the Connection Timeout and Method Timeout to something like 90 seconds or so. Now you can call the methods on your device from Azure IoT Hub as shown below.

Later we will see how to send messages from Azure IoT Hub to our devices.

Send data from NodeMCU to Azure IoT Hub

In this post we will see how to send data from the NodeMCU to Azure IoT Hub. We will be sending temperature and humidity data from a DHT11 sensor; the previous post explains how to measure temperature and humidity using the DHT11 sensor. The Azure IoT team has done some good documentation on various Azure IoT Hub topics, and the credit for this blogpost goes to this page.

I assume you already have an Azure subscription to use. If not, please visit this page and sign up for an account. The first step is to create an IoT Hub. Like all Azure components, we can search for it from the portal (if it is not added to favorites) and then create a hub. You can search for IoT Hub from All services and then proceed with creating an IoT Hub.

Creating a hub involves giving it a unique name and selecting a region and resource group. Please enter or select the resource group and region, then proceed with giving a name for our IoT Hub. On the next tab, 'size and scale', choose your pricing and scale tier. Please select the Free tier (F1) if it's available on your subscription.

Once the hub is created we can then create a device. When I say 'create a device', it's just a way of letting Azure IoT Hub know about the devices which will connect. We can create a new device by clicking on the Add button.

Please proceed with entering a DeviceId and then select the authentication type that you want your device to use while connecting to the hub. By default a symmetric key is selected, which can be changed to an X.509 certificate if you intend to use a certificate while authenticating your devices. We will cover that later once we see how to use the Device Provisioning Service (DPS) in Azure IoT. For now let's select symmetric key and keep the Auto-generate keys option checked, which generates the keys for us.

Click on the device created to see the details about keys and connection string.

This shows details about the device which we just created, including the connection string. Please make sure you keep this information secure and don't share it in public, since this is the identity your device uses to connect to Azure IoT Hub. You can see options like Message to Device, Direct Method, Device Twin etc., which can be ignored for now. We will cover these later in our next posts.

Here you go. You have just created your device in Azure IoT Hub. Next is to make our device talk to the IoT Hub we have just created.

You can go to my github repo and clone it. The folder "dht_temperature_AzureIotHub_New" contains three files. Please open the "dht_temperature_AzureIotHub_New.ino" file in the Arduino IDE, which should load all three files as shown below.

The header file "dht_temperature_AzureIotHub_config.h" has details about the connection string and the WiFi SSID and password. Please update these config values and then flash your NodeMCU. Please make sure you are giving the updated connection string in IOT_CONFIG_CONNECTION_STRING.

You can open the Serial Monitor from the Arduino IDE to view the details of the data being sent to Azure IoT Hub.


Now I would like to introduce you to another tool called Device Explorer, which I find very useful for monitoring and doing some basic operations on Azure IoT Hub. You can read more about this free tool from link and download the setup from here.

This tool runs on your machine and connects to the IoT Hub in Azure, so you need to get the connection details which the tool will use to connect to your hub. For that we need to go back to the Azure portal and copy the connection string value for iothubowner from Shared access policies in the Azure IoT Hub. Please be aware that this is like an admin account, so watch out before sharing it with someone, and avoid doing this with production secrets.

Configure the ConnectionString in the Configuration tab of Device Explorer as shown and click on the Update button.

Once the ConnectionString is updated, please go to the Data tab in Device Explorer and select the Event Hub. Ideally there should be an Event Hub whose name matches the name of our IoT Hub. Please select that and then click on the Monitor button. You now have a real-time view of the data being sent to the Azure IoT Hub. These details are pulled from the Azure IoT Hub and displayed on the local machine on which Device Explorer runs.

There you go. We have been successful in sending data from our NodeMCU to Azure IoT Hub. Let's see what the code does.

This code is a modified version of the samples that come with the Azure IoT Hub C SDK (here). My friends in the Azure IoT team have done some excellent documentation on the Azure IoT C SDK, and I strongly recommend you read it. Please don't miss the "Use the IoTHub Client" and "Use the serializer" sections as well. This link has some of the code which we are using. These are the few Azure IoT libraries which you need to import: AzureIoTHub, AzureIoTProtocol_MQTT and AzureIoTUtility. Since we use JSON in this code, we also have the ArduinoJson library, which simplifies JSON handling for us.

First, you need to create a handle using the IoTHubClient_LL_CreateFromConnectionString method, which initializes the library. Then you call readMessage and pass a variable messageCount and a char array. We then call our wrapper sendMessage, passing the handle and the message we need to send.

In our wrapper we create a message handle and then use IoTHubClient_LL_SendEventAsync to send the data to Azure IoT. We also register a callback function, sendCallback, to be called when data is successfully sent. In this example we aren't using the serializer library.

We need to pass the protocol to the IoTHubClient_LL_CreateFromConnectionString method, which is MQTT in this case. AMQP and HTTPS are also supported.

A few things to note here with the C SDK for Azure IoT Hub. There are two sets of API methods: one with LL (low level) and one without. The corresponding LL methods are:

  • IoTHubClient_LL_SendEventAsync
  • IoTHubClient_LL_SetMessageCallback
  • IoTHubClient_LL_Destroy
  • IoTHubClient_LL_DoWork

When we use the LL APIs we have explicit control over network transmissions. To actually send the data we use the IoTHubClient_LL_DoWork method, while IoTHubClient_LL_SendEventAsync just places the message in a buffer. Last but not least, as with any native code, don't forget to clean up resources using the IoTHubMessage_Destroy method.

This is what is documented in the doc:

"When you call IoTHubClient_CreateFromConnectionString, the underlying libraries create a new thread that runs in the background. This thread sends events to, and receives messages from, IoT Hub. No such thread is created when working with the LL APIs. The creation of the background thread is a convenience to the developer. You don't have to worry about explicitly sending events and receiving messages from IoT Hub — it happens automatically in the background. In contrast, the LL APIs give you explicit control over communication with IoT Hub, if you need it."

I haven't tested the non-LL methods on the NodeMCU. My assumption is that multiple threads don't make sense on the ESP as long as it's single core, or has a single core of compute. I may be wrong, and I don't know if anyone has ported it to FreeRTOS to support multi-threading yet.