Atmospheric conditions continue to deteriorate each year as civilization grows and industries and automobiles emit ever more pollutants. Although air is an indispensable resource for life, many people are indifferent to the severity of air pollution or have only recently recognized the problem. Among the various types of pollution, such as water, soil, thermal, and noise pollution, air pollution is the most dangerous and severe, causing climate change and life-threatening diseases. According to the World Health Organization (WHO), 90 percent of the population now breathes polluted air, and air pollution causes the deaths of 7 million people every year. The health effects of pollution are severe, including stroke, lung cancer, and heart disease. Furthermore, air pollutants have a negative impact on humans and the earth's ecosystem.
According to the United States Environmental Protection Agency (EPA), indoor air can be up to 100 times more contaminated than outdoor air. Most people today spend 80 to 90 percent of their time indoors, and during the COVID-19 pandemic that share is even higher, for people all around the world. Indoor air therefore has a greater direct impact on human health than outdoor air. Moreover, in contrast to atmospheric pollution, indoor pollutants are about 1,000 times more likely to reach the lungs, causing problems such as sick building syndrome, multiple chemical sensitivities, and dizziness.
A toilet is a public facility that is used frequently and located indoors. Maintaining good air quality in toilets is therefore essential to keep them hygienic and sanitary. This aligns with Wilke's statement: "In order to create a healthier and safer environment, the first step is in the washroom." A toilet with terrible air quality can be an ideal place for microbes, airborne bacteria, and, most recently, COVID-19 to spread.
My Healthy Toilet Project and the Motivation
Insufficient ventilation, a high influx of people, and improper management are the main sources of indoor air contamination in public toilets. To reduce exposure to this contamination, new measures can be taken, including developing air quality measuring devices, continuously monitoring the air quality data, and taking the necessary steps based on that data.
The Internet of Things (IoT) and cloud computing have opened up new capabilities for real-time monitoring in various fields. Since these technologies use a wireless sensor network to automatically transmit, process, analyze, and visualize data, combining them can offer great advantages for improving indoor air quality.
To manage a toilet properly, user satisfaction must be measured. Normally a toilet is cleaned at fixed intervals, but sometimes it needs cleaning before the next scheduled time because it has already become uncomfortable for users. By analyzing air quality data, the system can detect when a toilet requires cleaning.
Ventilation can increase comfort and improve indoor air quality (IAQ) by reducing the concentrations of indoor air pollutants and viruses such as COVID-19. But ventilation consumes electrical power, so efficient and intelligent operation is required to save energy.
This project is significant because it helps maintain cleanliness and excellent air quality in public and private toilets by notifying the toilet attendants or owner when it is time for cleaning. It also intelligently operates the exhaust fan based on the air quality data, maintaining a comfortable environment inside the toilet while keeping energy consumption to a minimum. Cloud-based machine learning is used to analyze the data and make decisions.
Features of Healthy Toilet Project
- Securely publish important air quality data (temperature, humidity, ammonia, sulfur dioxide, carbon monoxide, nitrogen dioxide, methane, and noise level) to AWS.
- Analyze the air quality data and control the exhaust fan wirelessly based on those parameters, maintaining a comfortable and healthy environment inside the toilet while consuming minimal energy.
- Using machine learning, further analyze the data to continuously check whether the exhaust fan is capable of maintaining a healthy environment. If pollution reaches a level at which the exhaust fan and ventilation fail to maintain comfort and hygiene, a manual cleaning request is emailed to the cleaning person or authority.
- Apply machine learning to the noise level, together with the other parameters, to detect water leakage and send a notification to the owner.
- Based on the condition of the inside air, the system produces a color signal so that users can tell whether the toilet is safe to use. A green light means the toilet environment is safe and comfortable. A yellow light means the air is not good enough and the user should wait until the exhaust fan returns it to a better condition. A red light means the toilet is not usable until someone cleans it.
The complete block diagram of the project is given below. All the sensors are connected to the Core2 EduKit through Port A and Port B. The Core2 EduKit reads the sensors and sends all the sensor values to AWS IoT Core using MQTT. An IoT rule forwards the data to IoT Analytics, which preprocesses the data, stores it in AWS S3, and prepares a dataset from it. AWS SageMaker uses the dataset to build a machine learning model that determines the toilet status (clean or in need of cleaning) from future data published by the Core2. A separate rule invokes a Lambda function that reads the prediction for the new air quality data using the ML model. The rule then updates the device shadow based on the predicted result, and the device is notified of the current status.
Another IoT rule sends email/SMS notifications to the user through AWS SNS if the toilet requires cleaning or there is a water leak. A Lambda function formats the email from the raw JSON data.
Following the same procedure, water leakage is detected from the noise data and the owner is notified by email/SMS.
A dedicated IoT rule determines the exhaust fan status based on the air quality data, and IoT Core sends the result to a separate topic set up for the exhaust fan. Now we have a basic idea of how the project works.
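As a concrete illustration of what such a rule looks like, here is a sketch in the IoT rules engine's SQL dialect. The topic, field names, and the 70 percent threshold are my assumptions for illustration, not the project's exact configuration:

```sql
-- Hypothetical rule: forward shadow updates with high humidity so a
-- downstream action can switch the exhaust fan topic. Topic, fields,
-- and the threshold are illustrative assumptions.
SELECT state.reported.humidity AS humidity,
       state.reported.ammonia AS ammonia
FROM '$aws/things/+/shadow/update/accepted'
WHERE state.reported.humidity > 70
```

A rule action (for example, republish or Lambda) would then act on the selected payload.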
Let's Make It
Connecting the Hardware
The main hardware is the Core2 EduKit and the sensors. I used a DHT11 sensor to collect temperature and humidity and a MICS6814 sensor to sense carbon monoxide, nitrogen dioxide, ammonia, and methane. The MICS6814 provides data through three separate analog channels. For measuring sulfur dioxide I used a 2SH12 sensor, which also provides analog data. To connect these two sensors to the Core2 Kit I used a 4-channel 16-bit I2C ADC (ADS1115) module, connected to Port A of the Core2 Kit. This port supports I2C communication and is Grove compatible, so I used a Grove cable to connect the sensor module to the Core2.
The DHT11 sensor provides digital data, so I connected it to the GPIO 26 pin of Port B through a Grove cable. If you use the Grove DHT11 module, you need to swap the signal cable with the NC cable to connect it to GPIO 26.
To connect the MICS6814 and 2SH12 to the ADS1115 I used a piece of perf board and soldered three male-to-female pin headers to it. To connect this sensor board to the Core2 EduKit I soldered a Grove cable to the board, following the I2C wiring sequence of the Grove I2C port. Finally, I connected the DHT11 sensor to Port B and the I2C sensors to Port A of the Core2 Kit, as shown in the image below. If you look at the board you will notice three resistors soldered onto it. These resistors are required for the MICS6814 sensor; details of the sensor and the calculation of the required resistor values are in the MICS6814.odt file attached in the code section. For the detailed connections, please see the schematic below.
To control the exhaust fan wirelessly I used a NodeMCU and a relay board. The NodeMCU receives the fan state from AWS IoT Core through an MQTT topic and turns the relay ON or OFF accordingly. The relay is used to control a high-power exhaust fan with a low-voltage signal from the NodeMCU. The following Fritzing sketch shows the connection.
I chose some important sensors for measuring the air quality inside the toilet. The relevant parameters for a toilet include ammonia, sulfur dioxide, carbon monoxide, methane, temperature, and humidity.
Sulfur dioxide (SO2): Sulfur dioxide is a colorless, reactive air pollutant with a strong odor. This gas can be a threat to human health, animal health, and plant life. Sulfur dioxide irritates the skin and the mucous membranes of the eyes, nose, throat, and lungs. High concentrations of SO2 can cause inflammation and irritation of the respiratory system, especially during heavy physical activity. The resulting symptoms can include pain when taking a deep breath, coughing, throat irritation, and breathing difficulties. High concentrations of SO2 can affect lung function, worsen asthma attacks, and aggravate existing heart disease in sensitive groups. This gas can also react with other chemicals in the air to form small particles that can get into the lungs and cause similar health effects. A concentration above 1.0 ppm is considered unhealthy.
Ammonia (NH3): Ammonia is a colorless gas with a strong, sharp, irritating odor. It is often used in water solutions, fertilizers, refrigerants, textiles, and more. Ammonia can affect you when inhaled, and contact can severely irritate and burn the skin and eyes, possibly causing eye damage. Inhaling ammonia can irritate the nose, throat, and lungs, and repeated exposure may cause an asthma-like allergy and lead to lung damage. Concentrations over 25 ppm can produce the above symptoms.
Carbon monoxide (CO): Carbon monoxide is a gas that is both odorless and colorless. It is found in combustion (exhaust) fumes produced by heaters, fireplaces, car mufflers, space heaters, charcoal grills, car engines, and more. Everyone is exposed to small amounts of carbon monoxide throughout the day; however, inhaling too much of it can cause CO poisoning. CO can build up to dangerous levels when combustion fumes become trapped in a poorly ventilated or enclosed space (such as a garage). Inhaling these fumes causes CO to accumulate in your bloodstream, which can lead to severe tissue damage. CO poisoning is extremely serious and can be life-threatening. If you breathe in large amounts of CO, your body begins to replace the oxygen in your blood with CO; you can become unconscious, and death may follow. Once CO levels reach 70 parts per million (ppm) and above, symptoms become noticeable, including nausea, dizziness, and unconsciousness.
CO2(carbon dioxide), the byproduct of respiration, is very different in its biological effects from CO (carbon monoxide), a byproduct of the combustion of hydrocarbons, such as gas stoves and gas or oil boilers. CO (but not CO2) can rapidly accumulate in a poorly ventilated home, and it is deadly. That’s why CO (carbon monoxide) monitoring is recommended for indoor conditions.
Nitrogen dioxide (NO2): NO2 is one of the most familiar and well-studied pollutants. Exposure to NO2, even small increases in short-term exposure, exacerbates respiratory problems, particularly asthma, and particularly in children. Long-term NO2 exposure has been linked to "cardiovascular effects, diabetes, poorer birth outcomes, premature mortality, and cancer." Research has also linked ongoing NO2 exposure to reduced cognitive performance, especially in children.
Developing Code for Core2 AWS EduKit
After completing the hardware connections, the next step is to develop the firmware for the Core2 AWS EduKit. To do so, follow all the steps sequentially.
Step 1: Completing the Cloud Connected Blinky Example
Before proceeding further, you must complete the Cloud Connected Blinky example from the official link (https://edukit.workshop.aws/en/blinky-hello-world.html). The tutorial is very well explained, and you should not face any difficulties if you have basic knowledge of embedded systems and the AWS cloud. All of the steps and skills used there will provide the foundation to be successful in this tutorial. Specifically, you will:
- Install and configure the AWS CLI on your host machine to remotely manage your AWS services and use provided helper scripts.
- Register a “thing” in AWS IoT Core using the security certificates pre-provisioned on the onboard secure element.
- Configure your reference hardware to connect to Wi-Fi, connect to your account’s AWS IoT endpoint, and send MQTT messages to AWS IoT Core.
- Receive an MQTT message from the cloud on the specified reference hardware to trigger blinking an LED.
Step 2: Completing the Smart Thermostat Example
The next step is to open, compile, and test the official Smart Thermostat firmware. We will use this code as a template for our own. Before modifying it, we want to be sure everything works by running the firmware on our Core2 Kit. Before going to the next step, I strongly recommend completing the full tutorial here (https://edukit.workshop.aws/en/smart-thermostat.html). It is also very well explained and easy to follow.
By the end of that tutorial, you will know:
- How to acquire temperature and sound levels from the Core2 for AWS IoT EduKit device.
- How MQTT works and how to publish temperature and sound measurements from the device to AWS IoT Core using MQTT.
- What a device shadow is and how to report measured values to it.
- How to perform message transforms with the AWS IoT Core rules engine.
- How to build a serverless application that responds to inputs and detects complex events.
- How to send commands to your device via the device shadow.
Step 3: Developing the Firmware for Core2 AWS IoT EduKit
As I mentioned, we will use the Smart Thermostat firmware as the template for our own firmware, so I am assuming you have already compiled and tested it successfully. You can download and use my modified firmware directly from GitHub (the link is in the code section), or you can customize the smart thermostat example yourself by following the steps below.
Since we are going to use some external sensors, we need code to interface with them. I developed (actually, modified existing Arduino libraries to create) two libraries for the Core2 IoT EduKit: one for the DHT11 temperature and humidity sensor and another for the ADS1115 I2C analog-to-digital converter module. The following steps describe the process.
Step 3.1: Creating a Library
To create a library we need two files: a C header file, which goes in the include sub-directory inside the main directory, and a C source file, which goes in the root of the main directory. Let's create the header (.h) file first.
Step 3.1.1: Go to Explorer -> main -> include, right-click, and choose New File.
Give it a name with a .h extension and press Enter, for example dht11.h.
Step 3.1.2: Click on the file to open it and write or paste your code there. For example, I added the following code to my dht11.h header file:
#pragma once
#include <driver/gpio.h>
#include <esp_err.h>
#ifdef __cplusplus
extern "C" {
#endif
/**
* Sensor type
*/
typedef enum
{
DHT_TYPE_DHT11 = 0, //!< DHT11
DHT_TYPE_AM2301, //!< AM2301 (DHT21, DHT22, AM2302, AM2321)
DHT_TYPE_SI7021 //!< Itead Si7021
} dht_sensor_type_t;
esp_err_t dht_read_data(dht_sensor_type_t sensor_type, gpio_num_t pin,
int16_t *humidity, int16_t *temperature);
esp_err_t dht_read_float_data(dht_sensor_type_t sensor_type, gpio_num_t pin,
float *humidity, float *temperature);
#ifdef __cplusplus
}
#endif
Step 3.1.3: Save the file by pressing Ctrl+S. Our header file is complete. Now we will create the C source file for the library.
Step 3.1.4: Go to Explorer -> main, right-click, and choose New File. Give it a name with a .c extension and press Enter, for example dht11.c.
Step 3.1.5: Click on the file to open it and write or paste your code in the editor. For example, I used the following code for my dht11.c file:
#include <freertos/FreeRTOS.h>
#include <freertos/task.h>
#include <esp_log.h>
#include <string.h>
#include <driver/gpio.h>
#include "dht11.h"
// DHT timer precision in microseconds
#define DHT_TIMER_INTERVAL 2
#define DHT_DATA_BITS 40
#define DHT_DATA_BYTES (DHT_DATA_BITS / 8)
/*
* Note:
* A suitable pull-up resistor should be connected to the selected GPIO line
*
* __ ______ _______ ___________________________
* \ A / \ C / \ DHT duration_data_low / \
* \_______/ B \______/ D \__________________________/ DHT duration_data_high \__
*
*
* Initializing communications with the DHT requires four 'phases' as follows:
*
* Phase A - MCU pulls signal low for at least 18000 us
* Phase B - MCU allows signal to float back up and waits 20-40us for DHT to pull it low
* Phase C - DHT pulls signal low for ~80us
* Phase D - DHT lets signal float back up for ~80us
*
* After this, the DHT transmits its first bit by holding the signal low for 50us
* and then letting it float back high for a period of time that depends on the data bit.
* duration_data_high is shorter than 50us for a logic '0' and longer than 50us for logic '1'.
*
* There are a total of 40 data bits transmitted sequentially. These bits are read into a byte array
* of length 5. The first and third bytes are humidity (%) and temperature (C), respectively. Bytes 2 and 4
* are zero-filled and the fifth is a checksum such that:
*
* byte_5 == (byte_1 + byte_2 + byte_3 + byte_4) & 0xFF
*
*/
static const char *TAG = "DHT";
static portMUX_TYPE mux = portMUX_INITIALIZER_UNLOCKED;
#define PORT_ENTER_CRITICAL() portENTER_CRITICAL(&mux)
#define PORT_EXIT_CRITICAL() portEXIT_CRITICAL(&mux)
#define CHECK_ARG(VAL) do { if (!(VAL)) return ESP_ERR_INVALID_ARG; } while (0)
#define CHECK_LOGE(x, msg, ...) do { \
esp_err_t __; \
if ((__ = x) != ESP_OK) { \
PORT_EXIT_CRITICAL(); \
ESP_LOGE(TAG, msg, ## __VA_ARGS__); \
return __; \
} \
} while (0)
/**
* Wait specified time for pin to go to a specified state.
* If timeout is reached and pin doesn't go to a requested state
* false is returned.
* The elapsed time is returned in pointer 'duration' if it is not NULL.
*/
static esp_err_t dht_await_pin_state(gpio_num_t pin, uint32_t timeout,
int expected_pin_state, uint32_t *duration)
{
/* XXX dht_await_pin_state() should save pin direction and restore
* the direction before return. however, the SDK does not provide
* gpio_get_direction().
*/
gpio_set_direction(pin, GPIO_MODE_INPUT);
// Enabling pull-up is required if the sensor has no physical pull-up resistor
gpio_set_pull_mode(pin, GPIO_PULLUP_ONLY);
for (uint32_t i = 0; i < timeout; i += DHT_TIMER_INTERVAL)
{
// need to wait at least a single interval to prevent reading a jitter
ets_delay_us(DHT_TIMER_INTERVAL);
if (gpio_get_level(pin) == expected_pin_state)
{
if (duration)
*duration = i;
return ESP_OK;
}
}
return ESP_ERR_TIMEOUT;
}
/**
* Request data from DHT and read raw bit stream.
* The function call should be protected from task switching.
* Return false if error occurred.
*/
static inline esp_err_t dht_fetch_data(dht_sensor_type_t sensor_type, gpio_num_t pin, uint8_t data[DHT_DATA_BYTES])
{
uint32_t low_duration;
uint32_t high_duration;
// Phase 'A' pulling signal low to initiate read sequence
gpio_set_direction(pin, GPIO_MODE_OUTPUT_OD);
gpio_set_level(pin, 0);
ets_delay_us(sensor_type == DHT_TYPE_SI7021 ? 500 : 20000);
gpio_set_level(pin, 1);
// Step through Phase 'B', 40us
CHECK_LOGE(dht_await_pin_state(pin, 40, 0, NULL),
"Initialization error, problem in phase 'B'");
// Step through Phase 'C', 88us
CHECK_LOGE(dht_await_pin_state(pin, 88, 1, NULL),
"Initialization error, problem in phase 'C'");
// Step through Phase 'D', 88us
CHECK_LOGE(dht_await_pin_state(pin, 88, 0, NULL),
"Initialization error, problem in phase 'D'");
// Read in each of the 40 bits of data...
for (int i = 0; i < DHT_DATA_BITS; i++)
{
CHECK_LOGE(dht_await_pin_state(pin, 65, 1, &low_duration),
"LOW bit timeout");
CHECK_LOGE(dht_await_pin_state(pin, 75, 0, &high_duration),
"HIGH bit timeout");
uint8_t b = i / 8;
uint8_t m = i % 8;
if (!m)
data[b] = 0;
data[b] |= (high_duration > low_duration) << (7 - m);
}
return ESP_OK;
}
/**
* Pack two data bytes into single value and take into account sign bit.
*/
static inline int16_t dht_convert_data(dht_sensor_type_t sensor_type, uint8_t msb, uint8_t lsb)
{
int16_t data;
if (sensor_type == DHT_TYPE_DHT11)
{
data = msb * 10;
}
else
{
data = msb & 0x7F;
data <<= 8;
data |= lsb;
if (msb & BIT(7))
data = -data; // convert it to negative
}
return data;
}
esp_err_t dht_read_data(dht_sensor_type_t sensor_type, gpio_num_t pin,
int16_t *humidity, int16_t *temperature)
{
CHECK_ARG(humidity || temperature);
uint8_t data[DHT_DATA_BYTES] = { 0 };
gpio_set_direction(pin, GPIO_MODE_OUTPUT_OD);
gpio_set_level(pin, 1);
PORT_ENTER_CRITICAL();
esp_err_t result = dht_fetch_data(sensor_type, pin, data);
if (result == ESP_OK)
PORT_EXIT_CRITICAL();
/* restore GPIO direction because, after calling dht_fetch_data(), the
* GPIO direction mode changes */
gpio_set_direction(pin, GPIO_MODE_OUTPUT_OD);
gpio_set_level(pin, 1);
if (result != ESP_OK)
return result;
if (data[4] != ((data[0] + data[1] + data[2] + data[3]) & 0xFF))
{
ESP_LOGE(TAG, "Checksum failed, invalid data received from sensor");
return ESP_ERR_INVALID_CRC;
}
if (humidity)
*humidity = dht_convert_data(sensor_type, data[0], data[1]);
if (temperature)
*temperature = dht_convert_data(sensor_type, data[2], data[3]);
ESP_LOGD(TAG, "Sensor data: humidity=%d, temp=%d", *humidity, *temperature);
return ESP_OK;
}
esp_err_t dht_read_float_data(dht_sensor_type_t sensor_type, gpio_num_t pin,
float *humidity, float *temperature)
{
CHECK_ARG(humidity || temperature);
int16_t i_humidity, i_temp;
esp_err_t res = dht_read_data(sensor_type, pin, humidity ? &i_humidity : NULL, temperature ? &i_temp : NULL);
if (res != ESP_OK)
return res;
if (humidity)
*humidity = i_humidity / 10.0;
if (temperature)
*temperature = i_temp / 10.0;
return ESP_OK;
}
Step 3.1.6: Save the file by pressing Ctrl+S. Our library is complete.
Step 3.1.7: Add the newly created source file to CMakeLists.txt. Click on CMakeLists.txt to open it and add the name of your library source file as shown in the screenshot below.
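For reference, the relevant part of the main component's CMakeLists.txt might end up looking something like this (file names other than main.c are my examples; match them to the files you actually created, and keep whatever other arguments your file already has):

```cmake
idf_component_register(
    SRCS "main.c" "dht11.c" "ads1115.c"   # add each new library .c file here
    INCLUDE_DIRS "include"
)
```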
Now you are ready to use your library. Follow the same procedure to create more libraries as needed, for example the ADS1115 library in our case. All the library source files are included in the code section as well as in the GitHub repository.
Step 3.2: Testing the Library
Let's test the newly created dht11 library to confirm it works. This will also verify that the DHT11 sensor itself is working.
Step 3.2.1: Open the main.c file, select all the code with Ctrl+A, cut it with Ctrl+X, and paste it (Ctrl+V) into a text file in Notepad and save it. We will use it again later.
Step 3.2.2: Paste the following code snippet into the main.c file. This code will read the temperature & humidity from the dht11 sensor using the newly created dht11 library and print the result in the terminal.
#include <stdio.h>
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"
#include "esp_log.h"
#include "esp_err.h"
#include "core2forAWS.h"
#include "dht11.h"
static const dht_sensor_type_t sensor_type = DHT_TYPE_DHT11;
static const gpio_num_t dht_gpio = GPIO_NUM_26;
void temperature_task(void *arg) {
int16_t temperature = 0;
int16_t humidity = 0;
while (1)
{
if (dht_read_data(sensor_type, dht_gpio, &humidity, &temperature) == ESP_OK)
printf("Humidity: %d%% Temp: %dC\n", humidity / 10, temperature / 10);
else
printf("Could not read data from sensor\n");
vTaskDelay(pdMS_TO_TICKS(2000));
}
}
void app_main()
{
Core2ForAWS_Init();
xTaskCreatePinnedToCore(&temperature_task, "temperature_task", 4096, NULL, 5, NULL, 1);
}
Step 3.2.3: Open a new terminal and type the following command to build the program:
pio run --environment core2foraws
You will see a success message if everything works.
Step 3.2.4: Upload the firmware to Core2 AWS IoT EduKit using the following command
pio run --environment core2foraws --target upload
Step 3.2.5: Connect the DHT11 sensor to Port B of the Core2 Kit and monitor the result using the following command:
pio run --environment core2foraws --target monitor
If you get the following result in the terminal, congratulations! Your library and the DHT11 sensor are both working perfectly.
Step 3.3: Modifying the main.c file
Replace the test code in main.c with the original main.c source code we stored in the Notepad file. Since we are going to publish more data than the demo project, we need to increase the JSON buffer size as follows:
#define MAX_LENGTH_OF_UPDATE_JSON_BUFFER 300
The following variables are added for storing the air quality parameters.
float temperature = STARTING_ROOMTEMPERATURE;
float humidity = STARTING_ROOMHUMIDITY;
float nitrogen_dioxide = STARTING_ROOMNO2;
float ammonia = STARTING_ROOMNH3;
float carbon_monoxide = STARTING_ROOMCO;
float sulfur_dioxide = STARTING_ROOMSO2;
float methane = STARTING_ROOMCH4;
uint8_t soundBuffer = STARTING_SOUNDLEVEL;
uint8_t reportedSound = STARTING_SOUNDLEVEL;
bool fan_status = STARTING_FANSTATUS;
char toilet_status[14] = STARTING_TOILETSTATUS;
We need more handler variables of type jsonStruct_t to pack all our sensor values, so the following handler variables have been created.
jsonStruct_t temperatureHandler;
temperatureHandler.cb = NULL;
temperatureHandler.pKey = "temperature";
temperatureHandler.pData = &temperature;
temperatureHandler.type = SHADOW_JSON_FLOAT;
temperatureHandler.dataLength = sizeof(float);
jsonStruct_t humidityHandler;
humidityHandler.cb = NULL;
humidityHandler.pKey = "humidity";
humidityHandler.pData = &humidity;
humidityHandler.type = SHADOW_JSON_FLOAT;
humidityHandler.dataLength = sizeof(float);
jsonStruct_t carbonMonoxideHandler;
carbonMonoxideHandler.cb = NULL;
carbonMonoxideHandler.pKey = "carbon_monoxide";
carbonMonoxideHandler.pData = &carbon_monoxide;
carbonMonoxideHandler.type = SHADOW_JSON_FLOAT;
carbonMonoxideHandler.dataLength = sizeof(float);
jsonStruct_t ammoniaHandler;
ammoniaHandler.cb = NULL;
ammoniaHandler.pKey = "ammonia";
ammoniaHandler.pData = &ammonia;
ammoniaHandler.type = SHADOW_JSON_FLOAT;
ammoniaHandler.dataLength = sizeof(float);
jsonStruct_t nitrogenDioxideHandler;
nitrogenDioxideHandler.cb = NULL;
nitrogenDioxideHandler.pKey = "nitrogen_dioxide";
nitrogenDioxideHandler.pData = &nitrogen_dioxide;
nitrogenDioxideHandler.type = SHADOW_JSON_FLOAT;
nitrogenDioxideHandler.dataLength = sizeof(float);
jsonStruct_t sulfurDioxideHandler;
sulfurDioxideHandler.cb = NULL;
sulfurDioxideHandler.pKey = "sulfur_dioxide";
sulfurDioxideHandler.pData = &sulfur_dioxide;
sulfurDioxideHandler.type = SHADOW_JSON_FLOAT;
sulfurDioxideHandler.dataLength = sizeof(float);
jsonStruct_t methaneHandler;
methaneHandler.cb = NULL;
methaneHandler.pKey = "methane";
methaneHandler.pData = &methane;
methaneHandler.type = SHADOW_JSON_FLOAT;
methaneHandler.dataLength = sizeof(float);
jsonStruct_t soundHandler;
soundHandler.cb = NULL;
soundHandler.pKey = "sound";
soundHandler.pData = &reportedSound;
soundHandler.type = SHADOW_JSON_UINT8;
soundHandler.dataLength = sizeof(uint8_t);
jsonStruct_t exhaustFanActuator;
exhaustFanActuator.cb = exhaustFan_Callback;
exhaustFanActuator.pKey = "fan_status";
exhaustFanActuator.pData = &fan_status;
exhaustFanActuator.type = SHADOW_JSON_BOOL;
exhaustFanActuator.dataLength = sizeof(bool);
jsonStruct_t toiletStatusActuator;
toiletStatusActuator.cb = toilet_status_Callback;
toiletStatusActuator.pKey = "toilet_status";
toiletStatusActuator.pData = &toilet_status;
toiletStatusActuator.type = SHADOW_JSON_STRING;
toiletStatusActuator.dataLength = strlen(toilet_status)+1;
The first function below packs the values we want to publish to the cloud into the shadow document format expected by the IoT Core shadow service. The second function publishes the marshaled shadow document as a payload over the network to IoT Core on the topic $aws/things/<<CLIENT_ID>>/shadow/update, where <<CLIENT_ID>> is a unique ID for each device. These two functions are modified as follows.
if(SUCCESS == rc) {
rc = aws_iot_shadow_add_reported(JsonDocumentBuffer,
sizeOfJsonDocumentBuffer, 10, &temperatureHandler,
&humidityHandler, &carbonMonoxideHandler,
&ammoniaHandler, &nitrogenDioxideHandler, &sulfurDioxideHandler,
&methaneHandler, &soundHandler, &toiletStatusActuator, &exhaustFanActuator);
if(SUCCESS == rc) {
rc = aws_iot_finalize_json_document(JsonDocumentBuffer,
sizeOfJsonDocumentBuffer);
if(SUCCESS == rc) {
ESP_LOGI(TAG, "Update Shadow: %s", JsonDocumentBuffer);
rc = aws_iot_shadow_update(&iotCoreClient, client_id, JsonDocumentBuffer,
ShadowUpdateStatusCallback, NULL, 10, true);
shadowUpdateInProgress = true;
}
}
}
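When the update succeeds, the reported section of the shadow document carries one key per registered handler. With all ten handlers it looks roughly like this (the values are illustrative, not real readings):

```json
{
  "state": {
    "reported": {
      "temperature": 26.5,
      "humidity": 61.2,
      "carbon_monoxide": 3.1,
      "ammonia": 0.8,
      "nitrogen_dioxide": 0.1,
      "sulfur_dioxide": 0.2,
      "methane": 1.5,
      "sound": 12,
      "toilet_status": "READY",
      "fan_status": false
    }
  }
}
```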
The following code is the callback function for our toilet status actuator. It is executed when the device receives a new message that includes the state.desired.toilet_status key-value pair. The toilet status is determined by the ML model based on the air quality data. Based on the status sent back by IoT Core, the LED color is changed.
void toilet_status_Callback(const char *pJsonString, uint32_t JsonStringDataLen,
jsonStruct_t *pContext) {
IOT_UNUSED(pJsonString);
IOT_UNUSED(JsonStringDataLen);
char * status = (char *) (pContext->pData);
if(pContext != NULL) {
ESP_LOGI(TAG, "Delta - toiletStatus state changed to %s", status);
}
if(strcmp(status, BUSY) == 0) {
ESP_LOGI(TAG, "setting side LEDs to Yellow");
Core2ForAWS_Sk6812_SetSideColor(SK6812_SIDE_LEFT, 0xFFFF00);
Core2ForAWS_Sk6812_SetSideColor(SK6812_SIDE_RIGHT, 0xFFFF00);
Core2ForAWS_Sk6812_Show();
} else if(strcmp(status, UNCLEAN) == 0) {
ESP_LOGI(TAG, "setting side LEDs to Red");
Core2ForAWS_Sk6812_SetSideColor(SK6812_SIDE_LEFT, 0xFF0000);
Core2ForAWS_Sk6812_SetSideColor(SK6812_SIDE_RIGHT, 0xFF0000);
Core2ForAWS_Sk6812_Show();
} else if(strcmp(status, READY) == 0) {
ESP_LOGI(TAG, "setting side LEDs to Green");
Core2ForAWS_Sk6812_SetSideColor(SK6812_SIDE_LEFT, 0x00FF00);
Core2ForAWS_Sk6812_SetSideColor(SK6812_SIDE_RIGHT, 0x00FF00);
//Core2ForAWS_Sk6812_Clear();
Core2ForAWS_Sk6812_Show();
}
}
The following two functions are used to read the temperature and the air quality.
void read_temperature(){
int16_t temperature_data = 0;
int16_t humidity_data = 0;
if (dht_read_data(sensor_type, dht_gpio, &humidity_data,
&temperature_data) == ESP_OK){
temperature = (float) temperature_data/10;
humidity = (float) humidity_data/10;
}
}
void read_airquality(){
int16_t adc0, adc1, adc2;
//float nitrogen_dioxide, ammonia, carbon_monoxide;
adc0 = ADS1115_readADC_SingleEnded(CO_CHNNEL);
carbon_monoxide = ADS1115_computeVolts(adc0);
adc1 = ADS1115_readADC_SingleEnded(NH3_CHNNEL);
ammonia = ADS1115_computeVolts(adc1);
adc2 = ADS1115_readADC_SingleEnded(NO2_CHNNEL);
nitrogen_dioxide = ADS1115_computeVolts(adc2);
}
The following function reads the ppm values of the air quality parameters into global variables.
void read_airquality_ppm(){
carbon_monoxide = measure_in_ppm(CO);
nitrogen_dioxide = measure_in_ppm(NO2);
ammonia = measure_in_ppm(NH3);
methane = measure_in_ppm(CH4);
}
The app_main() function is updated as follows
void app_main()
{
Core2ForAWS_Init();
Core2ForAWS_Display_SetBrightness(80);
Core2ForAWS_LED_Enable(1);
ADS1115_I2CInit();
ADS1115_setGain(GAIN_TWOTHIRDS);
airquality_calibrate ();
xMaxNoiseSemaphore = xSemaphoreCreateMutex();
ui_init();
initialise_wifi();
xTaskCreatePinnedToCore(&aws_iot_task, "aws_iot_task", 4096*2, NULL, 5, NULL, 1);
}
The complete main.c program is attached in the code section and in the GitHub repo.
Step 3.4: Testing the Firmware
After completing all the modifications, build the program and upload it to the Core2. Connect the I2C sensors to Port A and the DHT11 sensor to Port B, then run the debug terminal to monitor the output.
If everything goes well, you will see the following output in the terminal.
Open the AWS IoT Core console test page, subscribe to the topic $aws/things/<<CLIENT_ID>>/shadow/update/accepted
and you should see new messages arriving in time with your vTaskDelay(). (Replace «CLIENT_ID» with your device client Id/serial number printed on the screen.) The output is shown in the following image.
From the AWS IoT Core console test page, click on the Publish to a topic tab and publish the following new shadow message on the topic $aws/things/<<CLIENT_ID>>/shadow/update. You should see the Core2 for AWS IoT EduKit's LED bars change from Green to Red. See below for a sample shadow message. Test the effects by changing the toilet_status (set to BUSY or READY) and/or fan_status values (set to true or false) each time you publish the message.
{ "state": { "desired": { "toilet_status": "UNCLEAN", "fan_status": true } } }
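The same shadow message can also be published from a script instead of the console. A minimal sketch using boto3's iot-data client; the client Id is the placeholder from the text, and you need AWS credentials, a region, and possibly your account-specific endpoint_url:

```python
import json

def build_shadow_payload(toilet_status: str, fan_status: bool) -> str:
    """Build the desired-state shadow document shown above."""
    return json.dumps({
        "state": {
            "desired": {
                "toilet_status": toilet_status,  # UNCLEAN, BUSY, or READY
                "fan_status": fan_status,
            }
        }
    })

def publish_shadow_update(client_id: str, toilet_status: str, fan_status: bool):
    """Publish the shadow update to the device's reserved shadow topic.
    boto3 is imported lazily so the payload builder works offline too."""
    import boto3
    client = boto3.client("iot-data")  # pass endpoint_url=... if needed
    client.publish(
        topic="$aws/things/{}/shadow/update".format(client_id),
        qos=0,
        payload=build_shadow_payload(toilet_status, fan_status),
    )
```

This is just a convenience for repeated testing; the console test page does the same thing.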
The effect will also appear in the terminal as follows.
Everything is working perfectly on the device side. Now we will configure the AWS IoT cloud services to transform and route the data received from the device.
Developing a Machine Learning Model to Automatically Detect the Toilet Status (UNCLEAN, BUSY, READY)
We consider three states of a toilet for our project. The first is the READY state, which means the air quality is good and the toilet is safe and comfortable to use; the LED bars show green in this case. The second is the BUSY state, when some air quality parameters are not good enough and the toilet can be uncomfortable for a user. Possible causes are high humidity, odor, or a bad smell; in this situation, running an exhaust fan for a few minutes can improve the conditions. So this is the state when using the toilet is not recommended and the system tries to improve it by waiting and running the fan; yellow light is shown in this state. The third is the UNCLEAN state, when some air parameters are outside tolerable limits and using the toilet could create health problems. Ventilation alone is not enough to fix it; manual cleaning by a cleaner is required. The device shows red light in this state and sends a cleaning request by email to the concerned person.
The problem is that determining these three states is not easy; there is no formula that tells us which state the toilet is in. Thanks to machine learning, a simple ML model can identify the state intelligently by classifying the air quality data. That said, training a machine learning model is not trivial: building a good model normally requires expertise, a comprehensive understanding of the inputs, and effort over many cycles to optimize it. These days, however, a modern data science toolchain can produce an ML model without any ML expertise, and the Amazon SageMaker toolchain is one of them, although we can't expect a good model on the first try.
Building a good model needs lots of data; the more, the better. Building an ML model is a two-step process: collecting data and training the model. To accurately detect the three states discussed above, we need to collect lots of data for every situation. So we need to deploy our device for several hours at least (preferably 24 hours) in a clean toilet, several hours in an uncomfortable toilet, and several hours in an unhygienic toilet to collect enough data for building an accurate ML model. The second, hands-off step is when the ML model goes through an automated training process after you have gathered sufficient data. This training process can take several hours; we recommend starting it in the morning and returning to it in the afternoon, or letting it run overnight.
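SageMaker Autopilot does the heavy lifting in this project, but the underlying idea of classifying a sensor reading into READY/BUSY/UNCLEAN can be illustrated with a toy nearest-centroid classifier in plain Python. The centroid values below are made up for illustration; a real model learns its boundaries from hours of labelled data:

```python
import math

# Hypothetical per-class centroids over (humidity %, ammonia ppm, methane ppm);
# these numbers are illustrative only, not learned from real toilet data.
CENTROIDS = {
    "READY":   (40.0, 0.5, 500.0),
    "BUSY":    (60.0, 2.0, 800.0),
    "UNCLEAN": (75.0, 5.0, 1200.0),
}

def classify(sample):
    """Return the state whose centroid is closest (Euclidean) to the sample."""
    def dist(centroid):
        return math.sqrt(sum((s - c) ** 2 for s, c in zip(sample, centroid)))
    return min(CENTROIDS, key=lambda state: dist(CENTROIDS[state]))
```

For example, `classify((42.0, 0.6, 520.0))` lands nearest the READY centroid. Autopilot replaces this hand-tuned geometry with models fitted to the actual data distribution.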
We will use an aggregated data set of our device telemetry to power an automated ML training experiment that will then be used to classify new device reports with a new toilet_status value inferred from the trained ML model.
The workflow that we will follow has the following key components:
- We will create a new IoT Core rule that will forward reported device data to a service called AWS IoT Analytics. The IoT Analytics service stores raw IoT data in bulk, transforms and cleans it to turn raw data into processed data, and provides a query engine to slice processed data for analytical workflows or ML training.
- Our IoT Analytics project will consist of four resources linked together: a channel for storing raw IoT data from the device shadow, a pipeline to transform/filter/enrich data, a data store for processed data, and a data set that runs saved queries and can send the results for processing.
- Amazon SageMaker Studio is an integrated machine learning environment where you can build, train, deploy, and analyze your models, all in the same application.
- An Amazon SageMaker endpoint that hosts your trained model as a consumable API.
- We will create an AWS Lambda function that will run some simple code to process published messages and make inferences against our new ML model endpoint.
Step 1: Creating an IoT Core rule to forward telemetry data to IoT Analytics
- i. In the AWS IoT Core console, choose Act then Rules then Create.
- ii. Give your rule a name like toiletDataToAnalyticsForState and a description.
- iii. Use the following query. Be sure to replace the «CLIENT_ID» with the client Id/serial number printed on the screen of your Core2 for AWS IoT Edukit reference hardware kit.
SELECT current.state.reported.temperature, current.state.reported.humidity, current.state.reported.carbon_monoxide, current.state.reported.ammonia, current.state.reported.nitrogen_dioxide, current.state.reported.sulfur_dioxide, current.state.reported.methane, current.state.reported.sound, current.state.reported.toilet_status, current.state.reported.fan_status, timestamp FROM '$aws/things/<<CLIENT_ID>>/shadow/update/documents'
- iv. Choose Add action for Set one or more actions.
- v. Select Send a message to IoT Analytics and choose Configure action.
- vi. Select Quick create IoT Analytics resources and provide a project name for Resource prefix. Further module steps assume the prefix is toiletAnalyticsResourceForState. Choose Quick Create and all your AWS IoT Analytics resources will be created and configured automatically.
- vii. Choose Add action to configure this action and return to the rule creation form.
- viii. Choose Create rule to create your new rule.
Validation steps
Before moving on to the next chapter, you can validate that your serverless application is configured as intended:
- Ensure that your Core2 device is powered on, publishing data, and deployed in the toilet you want to train on.
- Using the AWS IoT Analytics console, review the most recent data set contents and verify there are historical records of all the parameters like temperature, humidity, ammonia, sulfur dioxide, etc. with the timestamp. To check this, find your data set in the IoT Analytics console, choose Actions and Run now, then wait for the completion of the query. From the Content tab click on the recently created content preview with the latest content. You should see results similar to the following:
Step 2: Setting up Amazon SageMaker Studio for automatic model training
First, you will set up Amazon SageMaker Studio in order to configure a new experiment for automatic model training.
- i. Go to the Amazon SageMaker console and choose Amazon SageMaker Studio.
- ii. Select Quick start and optionally enter a new User name like toiletStateUser.
- iii. For Execution role select the drop-down and choose Create a new IAM role.
- iv. For S3 buckets you specify select None and then choose Create role. The other defaults for buckets with “sagemaker” in the name are sufficient for this project.
- v. Choose Submit to start the provisioning process of SageMaker Studio. This step will take a few minutes to complete on your behalf.
Step 3: Configuring a new project in SageMaker Studio
- i. From the SageMaker Studio Control Panel, choose Open Studio for the newly created user.
- ii. In the Launcher tab, choose New project.
- iii. Under SageMaker project templates select MLOps template for model building, training, and deployment then choose Select project template.
- iv. Give your project a name like toiletStateSageMakerProject and description, then choose Create project.
Once the project is created, we will get a project dashboard with tabs like Repositories, Pipelines, Experiments, and so on. Let's leave this browser tab open to SageMaker Studio so you can quickly return to this page.
Step 4: Exporting AWS IoT Analytics data
- i. Open the AWS IoT Analytics console and choose your data set (assumed name is toiletanalyticsresourceforstate_dataset).
- ii. Under Data set content delivery rules choose Edit.
- iii. Choose Add rule, then choose Deliver result to S3.
- iv. Under S3 bucket choose Please select a resource and find the S3 bucket created for your SageMaker Studio project. It will be named like
sagemaker-project-p-wzq6773hm0gv
. If there are multiple buckets named like this, you’ll need to check the SageMaker Studio project for the random hash ID of your project. You can see the hash in other resources of your project like the Repositories and Pipelines tabs.
- v. Under Bucket key expression use this expression:
data/healthytoilet/Version/!{iotanalytics:scheduleTime}_!{iotanalytics:versionId}.csv
- vi. Under Role choose Create new and provide a name for the IAM role that will grant IoT Analytics access to write data to your S3 bucket. Choose Create role.
- vii. Choose Save to finalize your new delivery rule.
- viii. To generate a data set that will be saved to your new Amazon S3 bucket for training, choose Actions then Run now. You should see the Result preview update when the data set content is done generating.
We are now ready to start our ML experiment back in SageMaker Studio. The experiment will use the reported air quality data that was just exported by our IoT Analytics data set as input. We will configure the experiment to look for ways to accurately predict the existing toilet_status column. The automatic training job will analyze your data for relevant algorithms to try, then run 250 training jobs with varying hyperparameters, selecting the one that best fits your input training data.
Before starting the ML experiment, we should have several hours of data reported from our device in the toilet we are going to analyze, and in that time the toilet should have had a mix of ready, busy, and unclean situations. An automatic ML experiment needs at least 500 rows of data to work, but the more data we bring, the better the result will be. If we still need to generate more data before proceeding, we need to re-run the data set in the IoT Analytics console (last step of the previous instruction list) so that those results are available to SageMaker in our project S3 bucket.
Step 5: Starting the ML Experiment in SageMaker Studio
- i. Return to your SageMaker Studio, open your project, select the Experiments tab and choose Create autopilot experiment.
- ii. Give your experiment a name.
- iii. Under Project select your project from the list.
- iv. Under Connect your data and S3 bucket name find and select your project’s S3 bucket in the list. This is the same one you selected for the IoT Analytics data set content delivery rule in the previous step.
- v. Under Dataset file name find and select your IoT Analytics dataset content, like data/healthytoilet/Version/1607276270943_3b4eb6bb-8533-4ac0-b8fd-1b62ac0020a2.csv.
- vi. Under Target choose toilet_status.
- vii. Under Output data location and S3 bucket name find and choose the same project S3 bucket that you picked earlier.
- viii. Under Dataset directory name type in output/healthytoilet and choose Use input as S3 object key prefix "output/healthytoilet". This defines a new prefix in the S3 bucket that will be used for your output files.
- ix. Choose Create Experiment to start the automated ML experiment.
Running the experiment might take minutes to hours. You can follow along with the experiment’s progress in the SageMaker Studio browser tab, but it is also safe to close the tab and come back later to check progress.
Once the experiment has concluded, the resultant output is 250 trials that SageMaker used to find the best tuning job parameters. Sort the table of trials to find the one marked Best. The next milestone is to deploy this trial as a model endpoint so that you can invoke it as an API.
Step 6: Deploying the Best ML Model
- i. Select the trial marked Best and choose Deploy model.
- ii. Give your endpoint a name. Further steps in this module assume the name healthyToiletStateEndpoint.
- iii. Under Inference Response Content, select both predicted_label and probability. predicted_label may already have been added to the list.
- iv. Choose Deploy model to tell SageMaker to deploy your model as a new consumable API endpoint. This will take several minutes.
Now your machine learning model is deployed as an API endpoint, managed by Amazon SageMaker. In the next chapter, Working with ML models, you will consume the API endpoint with a serverless function and replace the simple threshold logic in the IoT Core rule that determines the toilet_status
value with inferences generated by your model.
Before moving on to the next step, you can validate that your serverless application is configured as intended:
- Use the Amazon SageMaker console to see your new endpoint with the status InService, on the Endpoints page.
Step 7: Setting up AWS Lambda and Invoking the SageMaker Endpoint
The following steps will walk us through the creation of a serverless function in AWS Lambda. The function defines a small bit of code that expects device shadow messages from IoT Core, transforms the message into the format used with our ML endpoint, then invokes the ML endpoint to return the classification of toilet_status and the confidence score of the inference.
- i. From the AWS Lambda console, choose Create function.
- ii. Enter a name for your function. Further steps assume the name classifyToiletStatus.
- iii. Under Runtime, select Python 3.8.
- iv. Choose Create function.
- v. Under Function code, in the file lambda_function.py, copy and paste the following code to replace the placeholder code:
import json
import boto3
import os

# Receives a device shadow Accepted document from IoT Core rules engine.
# Event has signature like {"state": {"reported": {"sound": 5}}}.
# See expectedAttributes for full list of attributes expected in state.reported.
# Builds CSV input to send to SageMaker endpoint, name of which stored in
# environment variable SAGEMAKER_ENDPOINT.
#
# Returns the prediction and confidence score from the ML model endpoint.
def lambda_handler(event, context):
    client = boto3.client('sagemaker-runtime')
    print('event received: {}'.format(event))

    # Order of attributes must match order expected by ML model endpoint. E.g.
    # the same order of columns used to train the model.
    expectedAttributes = ['temperature', 'humidity', 'carbon_monoxide', 'ammonia',
                          'nitrogen_dioxide', 'sulfur_dioxide', 'methane', 'sound',
                          'toilet_status', 'fan_status', 'timestamp']
    reported = event['state']['reported']
    reported['timestamp'] = event['timestamp']
    reportedAttributes = reported.keys()

    # Validates the input event has all the expected attributes.
    if len(set(expectedAttributes) & set(reportedAttributes)) < len(expectedAttributes):
        return {
            'statusCode': 400,
            'body': 'Error: missing attributes from event. Expected: {}. Received: {}.'.format(
                ','.join(expectedAttributes), ','.join(reportedAttributes))
        }

    # Build the input CSV string to send to the ML model endpoint.
    reportedValues = []
    for attr in expectedAttributes:
        reportedValues.append(str(reported[attr]))
    input = ','.join(reportedValues)
    print('sending this input for inference: {}'.format(input))

    endpoint_name = os.environ['SAGEMAKER_ENDPOINT']
    content_type = "text/csv"
    accept = "application/json"
    payload = input

    response = client.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType=content_type,
        Accept=accept,
        Body=payload
    )
    body = response['Body'].read()
    print('received this response from inference endpoint: {}'.format(body))
    return {
        'statusCode': 200,
        'body': json.loads(body)['predictions'][0]
    }
- vi. Under the Configuration tab, click on Environment variables, choose Edit, and add a new environment variable.
- vii. For Key enter SAGEMAKER_ENDPOINT and for Value enter the name of your SageMaker endpoint. You named this resource in the last step of Deploying the Best ML Model; this module assumes the name is healthyToiletStateEndpoint.
- viii. Choose Save to commit this new environment variable and return to the main Lambda editor interface.
- ix. In the Designer panel, choose + Add trigger.
- x. For Trigger configuration select AWS IoT from the list.
- xi. For IoT type, select Custom IoT rule.
- xii. For Rule, choose Create a new rule, give a rule name like toiletIotLambdaInvoke, and paste the following text in the Rule query statement. Replace the DEVICE_ID with your own device Id and click Add.
SELECT cast(get(get(aws_lambda("arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME", *), "body"),
"predicted_label") AS String) AS state.desired.toilet_status
FROM '$aws/things/<<DEVICE_ID>>/shadow/update/accepted' WHERE state.reported.temperature <> Null
Be sure to replace the placeholders: change REGION to your current region as shown in the console header (it must be in the format us-west-2, not Oregon); change ACCOUNT_ID to your 12-digit account id, without hyphens, which is also shown in the console header menu where your username is printed; and change FUNCTION_NAME to the name of the AWS Lambda function you created (assumed name is classifyToiletStatus). Don't forget to update the <<DEVICE_ID>> placeholder in the FROM topic as well.
- xiii. Select the Permissions tab, then choose the link under Role name so you can add permissions for this Lambda function to invoke your SageMaker endpoint.
- xiv. From the new tab opened to the IAM console, under Permissions policies choose Add inline policy.
- xv. For Service choose SageMaker.
- xvi. For Actions choose InvokeEndpoint.
- xvii. For Resources choose All resources.
- xviii. Choose Review policy.
- xix. Give your policy a name like invokeSageMakerEndpoint and choose Create policy. You can now close this new browser tab.
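The heart of the Lambda function is building a CSV row whose column order matches the columns the model was trained on; that step can be exercised locally without touching AWS. A minimal sketch of the same ordering logic (build_csv is a local helper name, not part of the deployed function):

```python
# Reproduces the input-CSV construction from the Lambda function: the
# attribute order must match the column order used to train the model.
EXPECTED_ATTRIBUTES = ['temperature', 'humidity', 'carbon_monoxide', 'ammonia',
                       'nitrogen_dioxide', 'sulfur_dioxide', 'methane', 'sound',
                       'toilet_status', 'fan_status', 'timestamp']

def build_csv(reported: dict) -> str:
    """Join the reported values into one CSV row in the expected order.
    Raises KeyError if an expected attribute is missing."""
    return ','.join(str(reported[attr]) for attr in EXPECTED_ATTRIBUTES)
```

Checking this ordering locally is worthwhile, because a silently shuffled column order would not raise an error at inference time; it would just produce nonsense predictions.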
Testing Lambda Function
- i. Go to your Lambda function and select the Code tab. If you didn't deploy your Lambda code yet, click Deploy.
- ii. You will get the confirmation Changes deployed. Click on Test and choose Configure test event.
- iii. Give the configuration a name like prediction_test and paste the following sample JSON data (like the JSON uploaded by the device) into the code editor, removing the previous data:
{
"state": {
"reported": {
"temperature": 29,
"humidity": 39,
"carbon_monoxide": 4.384765,
"ammonia": 0.687,
"nitrogen_dioxide": 0.141511,
"sulfur_dioxide": 0.441511,
"methane": 837.172485,
"sound": 23,
"toilet_status": "BUSY",
"fan_status": false
}
},
"metadata": {
"reported": {
"temperature": {
"timestamp": 1630912876
},
"humidity": {
"timestamp": 1630912876
},
"carbon_monoxide": {
"timestamp": 1630912876
},
"ammonia": {
"timestamp": 1630912876
},
"nitrogen_dioxide": {
"timestamp": 1630912876
},
"sulfur_dioxide": {
"timestamp": 1630912876
},
"methane": {
"timestamp": 1630912876
},
"sound": {
"timestamp": 1630912876
},
"toiletStatus": {
"timestamp": 1630912876
},
"fanStatus": {
"timestamp": 1630912876
}
}
},
"version": 560,
"timestamp": 1630912876,
"clientToken": "01231c94f550fe1c01-3"
}
- iv. Click to Create.
- v. Now click on Test and you will get the following JSON response. It returns the predicted_label for toilet_status along with the probability. Note that in the data we provided the toilet_status was BUSY, but the ML model predicted it as READY based on the air quality data.
{
"statusCode": 200,
"body": {
"predicted_label": "READY",
"probability": "0.4254942536354065"
}
}
At this point, our IoT workflow is now consuming the trained machine learning model from its deployed endpoint to classify messages published by the device as new toilet_status values!
Adding an Action on the toiletIotLambdaInvoke Rule
Go to the toiletIotLambdaInvoke rule to add an action that updates the device shadow according to the prediction.
- Choose Add action.
- Select Republish a message to an AWS IoT topic and choose Configure action.
- For Topic, use $$aws/things/<<CLIENT_ID>>/shadow/update. Be sure to replace <<CLIENT_ID>> with your device's client Id/serial number (the doubled $$ is how a republish action targets a reserved $aws topic).
- For Choose or create a role to grant AWS IoT access to perform this action, choose Create Role, and in the pop-up give your new IAM role a name, then choose Create role.
- Choose Add action to finish configuring your action and return to the rule creation form.
- Click Create Rule to create this rule in AWS IoT rules engine.
This will update the device shadow as follows:
{
"state": {
"desired": {
"toilet_status": "UNCLEAN",
"fan_status": false
}
},
"metadata": {
"desired": {
"toilet_status": {
"timestamp": 1632032337
},
"fan_status": {
"timestamp": 1632032337
}
}
},
"version": 6735,
"timestamp": 1632032337
}
Sending Notification for an Unclean Toilet
When telemetry data is published by the device, an IoT rule triggers a Lambda function to read the SageMaker prediction and update the device shadow accordingly. We want to send a notification to a person involved in cleaning when the predicted toilet_status is UNCLEAN, and we want to send only one notification each time the toilet_status changes from another state to UNCLEAN. For this reason, we will subscribe to the delta topic.
Step 1: Create an Amazon SNS topic that sends an SMS text message
Create an Amazon SNS topic.
- Sign in to the Amazon SNS console.
- In the left navigation pane, choose Topics.
- On the Topics page, choose Create topic.
- In Details, choose the Standard type. By default, the console creates a FIFO topic.
- In Name, enter the SNS topic name. For this tutorial, enter toiletUncleanNotice.
- Scroll to the end of the page and choose Create topic. The console opens the new topic's Details page.
Create an Amazon SNS subscription.
- In the toiletUncleanNotice topic's details page, choose Create subscription.
- In Create subscription, in the Details section, in the Protocol list, choose SMS.
- In Endpoint, enter the number of a phone that can receive text messages. Be sure to enter it such that it starts with a +, includes the country and area code, and doesn't include any other punctuation characters.
- Choose Create subscription.
Test the Amazon SNS notification.
- In the Amazon SNS console, in the left navigation pane, choose Topics.
- To open the topic's details page, in Topics, in the list of topics, choose toiletUncleanNotice.
- To open the Publish message to topic page, in the toiletUncleanNotice details page, choose Publish message.
- In Publish message to topic, in the Message body section, in Message body to send to the endpoint, enter a short message.
- Scroll down to the bottom of the page and choose Publish message.
- On the phone with the number you used earlier when creating the subscription, confirm the message was received.
If you did not receive the test message, double-check the phone number and your phone's settings.
Step 2: Create an AWS IoT rule to send the text message
- In the AWS IoT Core console, choose Act then Rules then Create.
- Give your rule a name like sendSnsUnclean and a description.
- Use the following query. Be sure to replace the «CLIENT_ID» with the client Id/serial number printed on the screen of your Core2 for AWS IoT Edukit reference hardware kit.
SELECT state.toilet_status AS state.toilet_status FROM '$aws/things/«CLIENT_ID»/shadow/update/delta' WHERE state.toilet_status = 'UNCLEAN'
The above query statement will generate the following JSON message when the toilet status changes from another state to the UNCLEAN state.
{
"state": {
"toilet_status": "UNCLEAN"
}
}
- To open up the list of rule actions for this rule, in Set one or more actions, choose Add action.
- In Select an action, choose Send a message as an SNS push notification.
- To open the selected action's configuration page, at the bottom of the action list, choose Configure action.
- In Configure action:
- In SNS target, choose Select, find your SNS topic named toiletUncleanNotice, and choose Select.
- In Message format, choose RAW.
- In Choose or create a role to grant AWS IoT access to perform this action, choose Create Role.
- In Create a new role, in Name, enter a unique name for the new role. For this tutorial, use sns_rule_role.
- Choose Create role.
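The rule above listens on the delta topic, which is what limits notifications to state changes: IoT Core publishes on .../shadow/update/delta only for fields where desired differs from reported. A minimal sketch of that comparison for flat state documents (the real shadow service also handles nested documents, versions, and timestamps):

```python
def shadow_delta(desired: dict, reported: dict) -> dict:
    """Return the desired fields that differ from (or are missing in) the
    reported state -- roughly the payload the delta topic would carry."""
    return {k: v for k, v in desired.items() if reported.get(k) != v}
```

So when the prediction republishes desired toilet_status = "UNCLEAN" while the device still reports "READY", the delta carries toilet_status; once the device reports "UNCLEAN" too, repeated identical predictions produce no delta, and no duplicate notifications.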
This rule will send the notification in raw JSON format. We can format it into a user-friendly message using Lambda. In that case, we select Send a message to a Lambda function instead of Send a message as an SNS push notification as the rule action, and the Lambda function sends the message through SNS. So let's create a Lambda function first.
Formatting the Notification by Using an AWS Lambda Function
In the tutorial above, the JSON document that resulted from the rule's query statement was sent as the body of the text message. The result was a text message that looked something like this example:
{
"state": {
"toilet_status": "UNCLEAN"
}
}
In this tutorial, you'll use an AWS Lambda rule action to call an AWS Lambda function that formats the data from the rule query statement into a friendlier format, such as this example:
The toilet is very UNCLEAN and needs to be cleaned immediately. So, you are requested to take immediate action for cleaning. Thank you.
The AWS Lambda function in this project receives the result of the rule query statement, inserts the elements into a text string, and sends the resulting string to Amazon SNS as the message in a notification.
To create an AWS Lambda function that sends a text message
Create a new AWS Lambda function.
- In the AWS Lambda console, choose Create function.
- In Create function, select Use a blueprint. Search for and select the hello-world-python blueprint, and then choose Configure.
- In Function name, enter the name of this function, formatUncleanToiletNotification.
- In Execution role, choose Create a new role from AWS policy templates.
- In Role name, enter the name of the new role, formatUncleanToiletNotificationRole.
- In Policy templates - optional, search for and select Amazon SNS publish policy.
- Choose Create function.
Modify the blueprint code to format and send an Amazon SNS notification.
- After you create your function, you should see the formatUncleanToiletNotification details page. If you don't, open it from the Lambda Functions page.
- In the formatUncleanToiletNotification details page, choose the Configuration tab and scroll to the Function code panel.
- In the Function code window, in the Environment pane, choose the Python file, lambda_function.py.
- In the Function code window, delete all of the original program code from the blueprint and replace it with the code below. Replace the notify_topic_arn placeholder with the ARN of your notification topic.
import boto3

# Replace with the ARN of your SNS notification topic (toiletUncleanNotice).
notify_topic_arn = 'arn:aws:sns:REGION:ACCOUNT_ID:toiletUncleanNotice'

def lambda_handler(event, context):
    # Create an SNS client to send the notification
    sns = boto3.client('sns')

    # Format the text message from the rule's query result
    message_text = (
        "The toilet is very {0} and needs to be cleaned immediately. "
        "So, you are requested to take immediate action for cleaning. "
        "Thank you.".format(str(event['state']['toilet_status']))
    )

    # Publish the formatted message
    response = sns.publish(
        TopicArn = notify_topic_arn,
        Message = message_text
    )
    return response
- Choose Deploy.
Now go to the previous rule action again,
- Remove the previous action and click on Add action
- In Select an action, choose Send a message to a Lambda function.
- To open the selected action's configuration page, at the bottom of the action list, choose Configure action.
In Configure action:
- In Function name, choose Select.
- Choose formatUncleanToiletNotification.
- At the bottom of Configure action, choose Add action.
- To create the rule, at the bottom of Create a rule, choose Create rule.
This is a screen recording of all the steps explained so far. It will be helpful for beginners to follow the above steps. There is no voice in the video.
Detecting Water Leakage Using SageMaker
Following the same procedure, you can train a model for detecting water leakage. You need not use all the parameters for developing the model; of course, the noise level will be the most valuable input in this case. As I have already explained how to develop and train the ML model for the toilet status using IoT Analytics, SageMaker, and Lambda, I have not repeated the process here.
Connecting NodeMCU to AWS IoT Core
In our project the NodeMCU will control the exhaust fan based on the command received from AWS IoT Core. An IoT Core rule will publish the controlling message (true or false) to a specific MQTT topic based on the air quality data received from the toilet. The rule works on preset thresholds for the relevant air quality parameters: if any parameter exceeds its threshold value, the IoT Core rule publishes a true message to a topic like node/mcu/fan/state; the NodeMCU receives this message and turns on the exhaust fan, and vice versa.
To receive MQTT messages from AWS IoT Core, the NodeMCU must first be connected to IoT Core. Follow the tutorial at https://nerdyelectronics.com/iot/how-to-connect-nodemcu-to-aws-iot-core/ for details on how to connect a NodeMCU to AWS IoT Core. It is a very nice tutorial on this topic.
You can also follow the video:
Developing Firmware for NodeMCU
After successfully connecting the NodeMCU board to AWS IoT Core, we need to develop the firmware for the board so that it can receive messages from our specific topic and control the exhaust fan accordingly.
The firmware was developed in Arduino, and the following Arduino libraries are required for successful compilation.
#include "FS.h"
#include <ESP8266WiFi.h> //tested esp8266 core version: 2.5.2
#include <PubSubClient.h> //tested version: 2.7.0
#include <NTPClient.h> //tested version: 3.2.0
#include <WiFiUdp.h>
#include <ArduinoJson.h> //tested version: 6.18.4
To receive the message, we need to subscribe to a topic after a successful connection to IoT Core, like this:
void reconnect()
{
    // Loop until we're reconnected
    while (!client.connected())
    {
        Serial.print("Attempting MQTT connection...");
        // Attempt to connect
        if (client.connect("ESPthing"))
        {
            Serial.println("connected");
            client.subscribe("node/mcu/fan/state");
            Serial.println("subscribed");
        }
        else
        {
            Serial.print("failed, rc=");
            Serial.print(client.state());
            Serial.println(" try again in 5 seconds");
            char buf[256];
            espClient.getLastSSLError(buf, 256);
            Serial.print("WiFiClientSecure SSL error: ");
            Serial.println(buf);
            // Wait 5 seconds before retrying
            delay(5000);
        }
    }
}
The following callback function receives the MQTT message and controls the fan:
void callback(char* topic, byte* payload, unsigned int length)
{
    Serial.print("Message arrived [");
    Serial.print(topic);
    Serial.print("] ");
    for (int i = 0; i < length; i++)
    {
        message.concat((char)payload[i]);
    }
    Serial.print(message);
    deserializeJson(doc, message);
    bool fan_status = doc["state"]["desired"]["fan_status"];
    Serial.println();
    if (fan_status == true) {
        // turn on fan
        digitalWrite(RELAY, HIGH);
        Serial.println("Fan is ON");
    }
    else if (fan_status == false) {
        // turn off fan
        digitalWrite(RELAY, LOW);
        Serial.println("Fan is OFF");
    }
    message = "";
}
The full source code for Node MCU is attached in the code section.
Validating the NodeMCU Firmware
To validate that the NodeMCU is receiving the MQTT message and controlling the fan, go to the IoT Core test client page and publish the following message to the topic node/mcu/fan/state.
{ "state": { "desired": {"fan_status": true } } }
You can also test the opposite case by changing fan_status to false. Open the Arduino Serial Monitor and you will get the following response.
Congratulations! It's working. To control the fan, you need to connect the exhaust fan to a digital pin of the NodeMCU through a relay. I used D3, but you can use any other digital pin. See the following connection diagram for a better understanding.
Next, we will create an IoT Core topic rule that receives the messages published by the device in the toilet, inspects the sampled temperature, humidity, ammonia, sulfur dioxide, and methane levels, updates the fan state in the device's shadow, and republishes the message to another IoT topic. The rule uses conditional logic in its SQL query to construct a new payload, and the IoT Core republish action to send that payload to the new topic.
- Go to the AWS IoT Core management console, choose Act, choose Rules, and choose Create.
- Give your rule a name, such as healthyToiletFanStateRepublish, and a description.
- Use the following query, and be sure to replace <<CLIENT_ID>> with your device's client ID/serial number.
SELECT CASE state.reported.temperature > 35 OR state.reported.humidity > 50 OR
state.reported.ammonia > 3 OR state.reported.sulfur_dioxide > 2 OR
state.reported.methane > 7 WHEN true THEN true ELSE false END AS
state.desired.fan_status FROM '$aws/things/<<CLIENT_ID>>/shadow/update/accepted'
WHERE state.reported.temperature <> Null
- Choose Add action.
- Select Republish a message to an AWS IoT topic and choose Configure action.
- For Topic, use node/mcu/fan/state.
- For "Choose or create a role to grant AWS IoT access to perform this action," choose Create Role, give your new IAM role a name in the pop-up, and then choose Create role.
- Choose Add action to finish configuring your action and return to the rule creation form.
- Choose Create rule to create this rule in the AWS IoT rules engine.
The SELECT clause uses a CASE statement to achieve simple threshold banding. If any of the air quality parameters (temperature, humidity, ammonia, sulfur dioxide, or methane) exceeds its threshold, the fan state is set to true (ON). You can modify the thresholds or parameters to suit your situation.
The output of the CASE statement is saved to the payload key state.desired.fan_status using the AS keyword. This means we are creating a new payload like {"state": {"desired": {"fan_status": false}}} and sending it on to the action. The WHERE clause state.reported.temperature <> Null prevents the rule from re-firing, because the republished shadow-update payload only includes the state.desired.fan_status key and carries no value for state.reported.temperature (another reported parameter, e.g. humidity, could be used instead).
Everything is now set up, and the exhaust fan is ready to be controlled by IoT Core MQTT messages.
Video Demonstration of the Complete Project
Watch the demo of my project.
Thank You for Your Attention
References
- G. Parmar, S. Lakhani, and M. Chattopadhyay, “An IoT based low cost air pollution monitoring system,” in 2017 International Conference on Recent Innovations in Signal Processing and Embedded Systems (RISE), Bhopal, India, October 2017.
- K. Okokpujie, E. Noma-Osaghae, O. Modupe, S. John, and O. Oluwatosin, “A smart air pollution monitoring system,” International Journal of Civil Engineering and Technology, vol. 9, pp. 799–809, 2018.
- K. A. Kulkarni and M. S. Zambare, “The impact study of houseplants in purification of environment using wireless sensor network,” Wireless Sensor Network, vol. 10, no. 03, pp. 59–69, 2018.
- World Health Organization, Air Pollution and Child Health: Prescribing Clean Air, WHO, Geneva, Switzerland, September 2018, https://www.who.int/ceh/publications/Advance-copy-Oct24_18150_Air-Pollution-and-Child-Health-merged-compressed.pdf.
- G. Rout, S. Karuturi, and T. N. Padmini, “Pollution monitoring system using IoT,” ARPN Journal of Engineering and Applied Sciences, vol. 13, pp. 2116–2123, 2018.
- B. C. Kavitha, D. Jose, and R. Vallikannu, “IoT based pollution monitoring system using raspberry–PI,” International Journal of Pure and Applied Mathematics, vol. 118, 2018.
- D. Saha, M. Shinde, and S. Thadeshwar, “IoT based air quality monitoring system using wireless sensors deployed in public bus services,” in ICC '17: Proceedings of the Second International Conference on Internet of Things, Data and Cloud Computing, Cambridge, United Kingdom, March 2017.
- J. Liu, Y. Chen, T. Lin et al., “Developed urban air quality monitoring system based on wireless sensor networks,” in 2011 Fifth International Conference on Sensing Technology, pp. 549–554, Palmerston North, New Zealand, December 2011.
- United States Environmental Protection Agency, Managing Air Quality - Air Pollutant Types, October 2018, https://www.epa.gov/air-quality-management-process/managing-air-quality-air-pollutant-types.
- C. Arnold, M. Harms, and J. Goschnick, “Air quality monitoring and fire detection with the Karlsruhe electronic micronose KAMINA,” IEEE Sensors Journal, vol. 2, no. 3, pp. 179–188, 2002.
- S. Abraham and X. Li, “A cost-effective wireless sensor network system for indoor air quality monitoring applications,” Procedia Computer Science, vol. 34, pp. 165–171, 2014.
- O. A. Postolache, D. J. M. Pereira, and S. P. M. B. Girão, “Smart sensors network for air quality monitoring applications,” IEEE Transactions on Instrumentation and Measurement, vol. 58, no. 9, pp. 3253–3262, 2009.
- Y. Jiang, K. Li, L. Tian et al., “MAQS: a personalized mobile sensing system for indoor air quality monitoring,” in Proceedings of the 13th International Conference on Ubiquitous Computing, pp. 271–280, Beijing, China, September 2011.
- S. Bhattacharya, S. Sridevi, and R. Pitchiah, “Indoor air quality monitoring using wireless sensor network,” in 2012 Sixth International Conference on Sensing Technology (ICST), pp. 422–427, Kolkata, India, December 2012.
- S. Zampolli, I. Elmi, F. Ahmed et al., “An electronic nose based on solid state sensor arrays for low-cost indoor air quality monitoring applications,” Sensors and Actuators B: Chemical, vol. 101, no. 1-2, pp. 39–46, 2004.
- Ministry of Environment, Investigation Results of Ministry of Environment, March 2019, http://www.me.go.kr/home/web/board/read.do?boardMasterId=1&boardId=727840&menuId=286.
- G. Marques, C. Ferreira, and R. Pitarma, “Indoor air quality assessment using a CO2 monitoring system based on Internet of Things,” Journal of Medical Systems, vol. 43, no. 3, p. 67, 2019.
- M. Tastan and H. Gokozan, “Real-time monitoring of indoor air quality with internet of things-based E-nose,” Applied Sciences, vol. 9, no. 16, article 3435, 2019.
- A. Rackes, T. Ben-David, and M. S. Waring, “Sensor networks for routine indoor air quality monitoring in buildings: impacts of placement, accuracy, and number of sensors,” Science and Technology for the Built Environment, vol. 24, no. 2, pp. 188–197, 2018.
- M. Benammar, A. Abdaoui, S. Ahmad, F. Touati, and A. Kadri, “A modular IoT platform for real-time indoor air quality monitoring,” Sensors, vol. 18, no. 2, p. 581, 2018.