
Renesas IoT Sandbox Documentation

Workflow Studio


Introduction


Workflow Studio is a tool on the Renesas IoT Sandbox platform that provides a simple, easy-to-use way for you to design and build your real-time workflows. It connects your data to your code in a simple, visual way. It also serves as a portfolio for managing all your Workflows, allowing you to easily organize them by function.

Workflow Studio - Workflow Organization

It consists of three major components that allow you to bring your sensor data to the cloud (Tags and Triggers), process and analyze that data (Programmatic Modules), and provide meaningful action (Outputs). We will cover each component in greater detail in this document.


Workflow Studio - 3 Major Components

Note: If you want a step-by-step guide on how to build and use a workflow, check out the Getting Started Guide.


Workflow Studio System Overview


Each workflow performs a specific task with specific data, and can output further events, which in turn can be used by other workflows. By sending an event with a specific tag, in a specific stream, a Workflow can trigger the execution of another Workflow which uses that tag as input. A Workflow can also communicate with other Workflows by storing state data using the Store library. Finally, a Workflow can communicate outward with the world, using the various notification libraries: Email, SMS, iOS and Google notifications for mobile, and the MQTT protocol to send data back to an embedded device.


Workflow Studio System Diagram


Now let's take a look at all the components of the Workflow Studio.


Workflow Studio - Toolbar


Tags & Triggers


The first components of a workflow are the Tags & Schedulers. These allow you to specify when you want your workflow to run. Workflows can be triggered when data arrives with a specific tag, or on a timed schedule. An example would be the temperature tag from the raw stream, which would look like this:


This box connects to your Code module and signifies that your workflow will be triggered whenever data is sent with a ‘temperature’ key in the raw stream, regardless of the value.


The star on the front of the tag name indicates that the block is a trigger. You can change this state by clicking the trigger checkbox to disable it. When disabled, the data for this tag will still be provided to the Code module for you to use, but incoming data for this tag will not trigger it for execution.


Schedulers are another type of trigger, based on a timer: Daily (e.g. daily at 12 AM), Hourly, and Custom (e.g. every 5 minutes). A Scheduler has settings that may need to be configured by double-clicking the block before it can be used; for a scheduled trigger, for example, you can specify which users the trigger will affect as well as the time zone.


Programmatic Modules


The second component of a workflow is the Module. This is the main component of a workflow and processes the data.


To write your own Python customization, you can use the ‘Base Python’ module listed under the ‘Foundation’ category.


When you double-click the module, you will find a default script that sets the output to the input from your trigger:

# set output to input
IONode.set_output('out1', {"value": IONode.get_input('in1')['event_data']['value']})

Note: This would cause an error if your trigger is a scheduled timer since there is no input.


The following code:
IONode.get_input('in1')['event_data']['value']

uses the IONode class, which references the module's inputs and outputs, to request the input 'in1' (the default name of the first input node). The call
IONode.get_input('in1')

returns an object that contains the data as well as metadata, of the form: {'tag_name': 'raw.value', 'event_data': {'value': 'data'}, 'trigger': True, 'observed_at': '2016-09-29T23:45:10Z'}


The call to IONode.get_input('in1')['event_data']['value'] therefore returns only the data for the tag.
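Since a scheduled timer provides no input event, a defensive version of the default script can check for input data before reading it. The sketch below includes a minimal local stand-in for the platform-provided IONode object so it can be tried outside the platform; on the platform itself, IONode is supplied for you and the stub is unnecessary:

```python
# Minimal local stand-in for the platform's IONode (illustrative only;
# on the Renesas IoT Sandbox platform, IONode is provided for you).
class IONode:
    _inputs = {'in1': {'tag_name': 'raw.temperature',
                       'event_data': {'value': 72},
                       'trigger': True,
                       'observed_at': '2016-09-29T23:45:10Z'}}
    _outputs = {}

    @classmethod
    def get_input(cls, name):
        return cls._inputs.get(name)

    @classmethod
    def set_output(cls, name, data):
        cls._outputs[name] = data

# Guarded version of the default script: forward the value only when
# the trigger actually supplied input data, so a timer-triggered run
# does not raise an error.
event = IONode.get_input('in1')
if event and event.get('event_data'):
    IONode.set_output('out1', {"value": event['event_data']['value']})
```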



The default script is just to get you started, but you can add your own customization. Here are some basic functions that you can also use in your Base Python Modules:


log(value) OR print value - Writes a value to your debug logs
Note: The debug log must be enabled to capture this


escape() - Exits a Workflow


In order to see the output of log() or print, you will need to activate the debugger (see Debugging below). A Code module can take multiple inputs and outputs. Select the number of inputs and outputs needed with Inputs/Outputs:

Input data can then be referred to with the IONode.get_input('') call.
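For instance, a Module configured with two inputs (the default node names are 'in1', 'in2', and so on) can combine both readings into a single output event. The tag names below are illustrative, and the IONode stub is only a local stand-in for the object the platform provides:

```python
# Local stand-in for the platform's IONode, pre-loaded with two inputs
# (illustrative only; the platform provides IONode for you).
class IONode:
    _inputs = {
        'in1': {'tag_name': 'raw.temperature', 'event_data': {'value': 72}},
        'in2': {'tag_name': 'raw.humidity', 'event_data': {'value': 40}},
    }
    _outputs = {}

    @classmethod
    def get_input(cls, name):
        return cls._inputs[name]

    @classmethod
    def set_output(cls, name, data):
        cls._outputs[name] = data

# Read both inputs and emit a single combined event.
temperature = IONode.get_input('in1')['event_data']['value']
humidity = IONode.get_input('in2')['event_data']['value']

IONode.set_output('out1', {"temperature": temperature,
                           "humidity": humidity})
```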

There are also workflow shortcut keys that you can use in Workflow Studio:

CTRL + S    Save
CTRL + C    Copy
CTRL + V    Paste
CTRL + Z    Undo
CTRL + /    Comment/Uncomment



In addition to the standard Python built-ins, Renesas IoT Sandbox also has pre-programmed libraries that provide ready-to-use rich features and services. You can find more details on libraries and how to use them here. We are constantly updating and expanding our libraries, so check back for new features!



Outputs


The last component of a workflow is the Output. The Output is optional and only needed if you want to produce a processed event in the Renesas IoT Sandbox platform. An example where you wouldn’t need an Output would be sending the processed data to yourself as an email. However, in many circumstances, you may want to create a new event in one of your processed streams. You can find out more about streams here.


When you drag an Output onto your workflow and connect it to the Modules, your workflow will be expecting an output. To specify what the output event will be, set it in the Module of your workflow in this format:

IONode.set_output('out1', {"value": VALUE})

where VALUE can be a variable or static string or number.
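As an example, a Module might read the triggering value, transform it, and emit the result; here a Celsius reading is converted to Fahrenheit. The tag names and the IONode stub are illustrative stand-ins for local testing (the platform supplies IONode for you):

```python
# Local stand-in for the platform's IONode (illustrative only).
class IONode:
    _inputs = {'in1': {'tag_name': 'raw.temperature',
                       'event_data': {'value': 25}}}
    _outputs = {}

    @classmethod
    def get_input(cls, name):
        return cls._inputs[name]

    @classmethod
    def set_output(cls, name, data):
        cls._outputs[name] = data

# Read the triggering Celsius value, convert it, and emit the result.
celsius = IONode.get_input('in1')['event_data']['value']
fahrenheit = celsius * 9.0 / 5.0 + 32

IONode.set_output('out1', {"value": fahrenheit})
```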

In the Output side pane, you can find two types of outputs: single and multiple.

Single means one event is output, whereas Multiple means several events are output. To output multiple events, create a list in your Module and append events to it, then set the output with:

IONode.set_output_list('out1', outputMsgList)

where outputMsgList is a list of events.

Here is an example of how to use Processed Stream - Multiple:

outputMsgList = []
outputMsgList.append({"tagname":1})
outputMsgList.append({"tagname":2})
outputMsgList.append({"tagname":3})
IONode.set_output_list('out1', outputMsgList)

This will produce three separate events.
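The same output list can be built in a loop when the events come from a batch of readings. The readings list and the IONode stub below are illustrative stand-ins for local testing:

```python
# Local stand-in for the platform's IONode (illustrative only).
class IONode:
    _outputs = {}

    @classmethod
    def set_output_list(cls, name, data):
        cls._outputs[name] = data

# Build one event per reading, then emit them all at once.
readings = [1, 2, 3]

outputMsgList = []
for reading in readings:
    outputMsgList.append({"tagname": reading})

IONode.set_output_list('out1', outputMsgList)
```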


Activation


Now that you’ve created a workflow, you will want to activate it. Activating a workflow simply means making it ‘live’. Immediately after activation, any event sent in with the trigger tag will go through the data processing in the Module.

To activate your workflow, go to the Revisions pane and click the checkmark icon. Afterwards, it should look like this:


Revisions


As you are creating your workflow, you may want to go back to a previous version. This is where Revisions comes into play. On the Revisions pane, you can see all the revisions you have made, auto-saved after every change you make. Here, you can see different options for each revision:

Restore - To go back to a previous version as a working copy. This will create a new revision with the same configuration as the restored one.

Activate - To make a workflow go live.

Deactivate - To make a workflow no longer be live.

Favorite - To save a workflow as a favorite. The revision then cannot be deleted and it will always show up on the revisions pane. A revision is automatically favorited upon activation. To unfavorite, click the icon again. The maximum number of favorited revisions is 20.

Delete - To delete a revision.

Clone - To clone a revision to a new workflow.

In addition to those options, you can also rename a revision by clicking on its default name “Autosave-x”.


Debugging


After finishing your workflow, you may want to test it to make sure it is working as expected. To do this, open the debugger pane.


The Debugger tool allows you to simulate sending events and view the output to verify that it is correct. Here is a standard example of how to test your workflow:

Step 1: Make sure the correct revision of your workflow is activated.

Step 2: Turn on the Debugger by flipping its switch. Note: While the Debugger is on, 1 workflow credit is used every time the workflow is triggered.

Step 3: Simulate the event. Select the stream and user you want to send the event to, then specify the event in JSON format.

{"tagname": "data"}

Push the Send button to send the event. This does not use any event credits but it uses a workflow credit.

Step 4: View the event. After sending the simulated event, hit the refresh button next to “Last 10 logs”. You should see a new line show up:

Click “display” and the input and output values should be shown on the Workflow Studio canvas. You can also click the timestamp for a popup with your debug log.

Step 5: Analyze results. If the output is what you expected, then you are done. If not, you may need to make a few tweaks and then try the debugger again.


Statistics

The Statistics panel is a useful tool to measure workflow and credit usage. It shows information about your workflow over the past 24 hours and past 30 daily periods.


The top row, under “Daily Summary”, gives a quick overview of the statistics. It shows the total executions, the average execution time, the workflow credits used, and the number of errors in the past 24 hours.



Under Usage Analytics, you can see the statistics for the past 24 hours, divided up per hour. The Module Occurrences graph differs from the others in that it shows the total number of times a module in that workflow ran in the past 24 hours.



Below the "Last 24 Hours" graphs are the "Last 30 Daily Periods" graphs, which show the statistics for the last 30 daily periods, divided up by period (24-hour blocks since you created the account).


Conclusion

With the capabilities of the Workflow Studio, building customized workflows is easy and the possibilities are endless.

