You can set up automated multistep processes to route and transform your data so it’s ready to be consumed by an App or a Dashboard.
This is called a data workflow. The automated actions are managed by Robots, and the access to transformed data is granted with Rules.
In this article, we describe the usual actions you can perform with Robots to get your data workflow ready.
For an introduction to Robots, take a look at the Robot Guide.
Read data from the data lake
Your workflow will usually start by reading data that was pushed to the data lake. This can be triggered automatically if your Robot is listening to a data stream recType, or periodically with a scheduled Robot.
Below are code snippets to read data from the data lake.
Read record triggering the Robot
Parse the message that triggers the Robot using lib.parseMsg to get the data and metadata from the ‘text’ parameter:
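A minimal sketch of a Robot entry point is shown below. The platform normally provides the `lib` helper; the stub here stands in for it so the snippet runs locally, and its behavior is an illustrative assumption rather than the real implementation.

```javascript
// Stand-in for the platform-provided helper (illustrative only).
var lib = {
  parseMsg: function (text) { return JSON.parse(text); }
};

function main(text, auth) {
  // lib.parseMsg extracts the record (data + metadata) that
  // triggered the Robot from the raw `text` parameter.
  var msg = lib.parseMsg(text);
  return msg;
}
```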
And the returned data model is
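As a rough sketch, a parsed record might carry the payload alongside its metadata; the field names below are illustrative, not a complete schema:

```json
{
  "id": "…",
  "recType": "io.example.sensor",
  "data": { "devEUI": "0004A30B001A2B3C", "payload": "00FA" },
  "tags": ["building-1"],
  "createDate": "2019-01-01T00:00:00Z"
}
```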
Read any record(s)
You can also read any record for which you know the recType and/or id.
This call always returns an array; depending on the query, it may contain zero or more records.
To get records by the recType and id:
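A sketch of a read by recType and id follows. The helper name and signature are assumptions (check the Robot Guide for the exact shape); the stub stands in for the platform-provided `lib` so the snippet runs.

```javascript
// Stand-in for the platform read helper (illustrative signature).
var lib = {
  read: function (recType, id) { return []; }
};

function readById(recType, id) {
  // Always returns an array with zero or more matching records.
  return lib.read(recType, id);
}
```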
To get records by the recType and associated tags:
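A tag-based read can be sketched the same way; again, the helper name and signature are illustrative assumptions, with a stub in place of the platform `lib`.

```javascript
// Stand-in for the platform helper (illustrative name and signature).
var lib = {
  readByTags: function (recType, tags) { return []; }
};

function readTagged(recType, tags) {
  // Returns an array of records carrying the given tags.
  return lib.readByTags(recType, tags);
}
```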
Data lake advanced queries
You can use VIEWS to do advanced queries to the data lake.
Views run an aggregation query on the data lake entries and can take parameters. The returned format is the same as the read.
IMPORTANT: Your View must have the same recType as the one triggering the Robot!
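A View call from a Robot might look like the sketch below. The `lib.view` helper, the View id, and the parameter names are all illustrative assumptions; the View itself must share the recType of the record that triggered the Robot, and its result has the same format as a read.

```javascript
// Stand-in for the platform View helper (illustrative only).
var lib = {
  view: function (viewId, params) { return []; }
};

function queryView(auth) {
  // The parameters feed the View's aggregation query on the data lake.
  return lib.view('my-aggregation-view', { minco2: 1000 });
}
```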
This is ideal for separating different sensor records coming from the same stream of IoT packages.
For example, if you have a single platform pushing records about two different devices through the same pipe (same recType), you might want to split them by devEUI afterwards.
The example below shows how one Robot can parse two devEUIs.
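The routing step can be sketched as plain JS; the device ids and handler logic below are illustrative, not real device identifiers.

```javascript
// Route records from one stream (same recType) by devEUI.
function handleTempSensor(rec) { return { type: 'temp', rec: rec }; }
function handleCo2Sensor(rec) { return { type: 'co2', rec: rec }; }

var DEVICES = {
  '0004A30B001A2B3C': handleTempSensor, // hypothetical device 1
  '0004A30B00D4E5F6': handleCo2Sensor   // hypothetical device 2
};

function route(rec) {
  var handler = DEVICES[rec.data.devEUI];
  if (!handler) return null; // unknown device: ignore the record
  return handler(rec);
}
```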
Another strategy would be to use one Robot per devEUI; each can be turned on and off to start or stop its data flow.
Robots run JS scripts, so any data transformation logic can be added to your workflow.
A usual transformation step of your workflow is the decoding of your IoT Payload. See our list of decoding methods here.
IMPORTANT: Robot scripts use ES6 only, so you can't rely on your usual browser APIs for data transformation.
Below is an example of payload decoding that ends by writing a new record to the data lake:
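In the sketch below, the byte layout (a two-byte big-endian temperature in tenths of a degree) is an illustrative assumption, not a real device spec, and the `lib.write` signature and recType are hypothetical; the stubbed `lib` stands in for the platform helper.

```javascript
// Stand-ins for the platform helpers (illustrative only).
var lib = {
  parseMsg: function (text) { return JSON.parse(text); },
  write: function (recType, data) { /* persists a new record */ }
};

function decodePayload(hex) {
  // First two bytes: temperature in tenths of a degree Celsius.
  var raw = parseInt(hex.slice(0, 4), 16);
  return { tempC: raw / 10 };
}

function main(text, auth) {
  var msg = lib.parseMsg(text);
  var decoded = decodePayload(msg.data.payload);
  lib.write('io.example.sensor.decoded', decoded); // hypothetical recType
  return decoded;
}
```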
Data formatting in a Robot allows you to identify interesting records, and format the data to be used in your report, App or Dashboard.
Below is an example of a Robot that passes through only high-CO2 records.
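The filtering step can be sketched as follows; the 1000 ppm cutoff and the field names are illustrative assumptions.

```javascript
// Keep only records whose CO2 reading exceeds the threshold.
var CO2_THRESHOLD_PPM = 1000; // illustrative cutoff

function filterHighCo2(records) {
  return records.filter(function (rec) {
    return rec.data && rec.data.co2 > CO2_THRESHOLD_PPM;
  });
}
```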
External services triggering
You have access to notifications libraries, and RESTful POST and GET from a Robot script. This allows you to call external services from your data workflow.
For example, at Microshare.io we like to log to our Slack channel; below is an example of how to do just that.
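The sketch below posts a message to a Slack incoming webhook. The `lib.post` helper is a stand-in for the script's RESTful POST facility (its real name and signature may differ), and the webhook URL is a placeholder; Slack incoming webhooks accept a JSON body with a `text` field.

```javascript
// Stand-in for the Robot's RESTful POST helper (illustrative only).
var lib = {
  post: function (url, body) { return { status: 200 }; }
};

// Placeholder: replace with your own Slack incoming webhook URL.
var WEBHOOK_URL = 'https://hooks.slack.com/services/YOUR/WEBHOOK/URL';

function logToSlack(message) {
  return lib.post(WEBHOOK_URL, { text: message });
}
```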
The Rule would point to the View's recType, allow the Execute operation, with the Requestor Organization set to &
You now have access to decoded IoT data through the Microshare API. This allows you to build whatever view you want with your favorite tools: web Apps, mobile Apps, Dashboards, etc. Unleash the data, and let your imagination go wild!