With recent advances in transportation and smart vehicles, I was surprised to find that electric vehicles and smart cars still lack the ability to measure and analyze real-time motion data, given the significance of the insights it can yield. Realizing the value of this kinetic data and the diversity of its applications, I decided to create a fully connected IoT device that quantifies 3D acceleration and enables real-time analysis and visualization of that motion data. I also coupled this data with noise and temperature readings to add context and implement safety features.
P.S. I have submitted this project in response to the AWS IoT Challenge. It would mean a lot to me if you could vote for it through this link, as I have worked very hard to accomplish this!
Use Cases
The 3D acceleration data has many use cases; I find applications for driving particularly interesting. The data can be used to:
- Detect unusual changes in vehicle performance over time
- Measure ride quality by quantifying g-force
- Detect life-threatening car accidents and notify the authorities
- Map road hazards, such as potholes, using the z-axis data
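As a quick illustration of the g-force metric above, the net acceleration magnitude can be computed from the three axis readings. This helper is a hypothetical sketch and assumes the accelerometer reports values in units of g:

```python
import math

def g_force(x, y, z):
    """Net acceleration magnitude in g, from per-axis readings (assumed in g)."""
    return math.sqrt(x * x + y * y + z * z)

# A stationary device should read roughly 1 g (gravity on the z-axis):
print(g_force(0.0, 0.0, 1.0))  # → 1.0
```

A sharp spike in this magnitude is what would distinguish a pothole or collision from normal driving.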
The microcontroller is programmed to measure 3D acceleration, temperature, and noise once per second. It then sends that data to the Omega2+ over UART, and the Omega2+ forwards it to AWS IoT over MQTT.
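A minimal sketch of that bridge step on the Omega2+ might look like the following. The CSV wire format, serial device path, endpoint, and certificate paths are all assumptions for illustration; the field names match the rule query used in the AWS IoT setup (Noise, Temperature, X, Y, Z, deviceID, dateTime).

```python
import json
from datetime import datetime, timezone

DEVICE_ID = "3D0D"  # hypothetical device identifier, matching the MQTT topic suffix

def parse_reading(line):
    """Parse one UART line ("noise,temperature,x,y,z" -- an assumed format)
    into the JSON message shape expected by the AWS IoT rule query."""
    noise, temperature, x, y, z = line.strip().split(",")
    return {
        "Noise": int(noise),
        "Temperature": float(temperature),
        "X": float(x),
        "Y": float(y),
        "Z": float(z),
        "deviceID": DEVICE_ID,
        "dateTime": datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S"),
    }

def publish_forever():
    """Read from UART and publish to AWS IoT. Requires pyserial, paho-mqtt,
    and the device certificates downloaded when creating the thing."""
    import serial
    import paho.mqtt.client as mqtt

    port = serial.Serial("/dev/ttyS1", 9600)  # UART from the Circuit Playground
    client = mqtt.Client()
    client.tls_set(ca_certs="root-CA.crt", certfile="device.pem", keyfile="private.key")
    client.connect("YOUR_ENDPOINT.iot.us-east-1.amazonaws.com", 8883)
    while True:
        message = parse_reading(port.readline().decode())
        client.publish("omega2/3D0D", json.dumps(message))
```

Any MQTT client library would work here; the essential points are TLS on port 8883 and a JSON payload whose keys match the rule's SELECT clause.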
When data arrives at AWS IoT, any appropriate action is taken in real time if necessary, such as sending an emergency notification. The arrival also triggers an AWS IoT rule, which sends the data through the source Firehose delivery stream. AWS Kinesis Analytics then receives the data in batches from that stream and performs real-time analytics, such as calculating the aggregate g-force in each direction. The results flow through the destination Firehose streams and are stored in an S3 bucket. Finally, the data can be visualized and further analyzed with Amazon QuickSight.
The Hardware
Two pieces of hardware are used in this project: the Onion Omega2+, an embedded Linux computer, and the Adafruit Circuit Playground, a microcontroller used here for its comprehensive set of on-board sensors and cool-looking lights!
To test this yourself, create an AWS account and set up the services according to the architecture below. The following diagram will help you visualize the AWS architecture and the services to create. The settings for each service are listed as well.
Step 1. Create an S3 bucket named "omega2-data" with all the default settings.
Step 2. Create the following AWS Kinesis Firehose delivery streams:
- IoT_Source
- IoT_Dest_Data
- IoT_Dest_Aggregate_XYZ
- IoT_Dest_Aggregate_Temperature
Settings: Set the source to Direct PUT and the S3 buffer interval to 60 seconds for all four delivery streams. Select "omega2-data" as the destination S3 bucket, and use a distinct prefix for each stream to keep the data organized.
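If you prefer scripting the setup, the same configuration can be expressed with boto3. The role ARN and the per-stream prefixes here are placeholders (only the "data/" prefix is implied later by the QuickSight manifest); this is a sketch of the settings above, not a full deployment script.

```python
# Hypothetical prefix assignments -- one per delivery stream, shared bucket.
STREAMS = {
    "IoT_Source": "source/",
    "IoT_Dest_Data": "data/",
    "IoT_Dest_Aggregate_XYZ": "aggregate-xyz/",
    "IoT_Dest_Aggregate_Temperature": "aggregate-temperature/",
}

def delivery_stream_config(prefix, role_arn):
    """S3 destination settings matching the manual setup: 60-second buffer,
    shared "omega2-data" bucket, distinct prefix per stream."""
    return {
        "RoleARN": role_arn,
        "BucketARN": "arn:aws:s3:::omega2-data",
        "Prefix": prefix,
        "BufferingHints": {"IntervalInSeconds": 60, "SizeInMBs": 5},
    }

def create_streams(role_arn):
    """Create all four Direct PUT delivery streams (requires AWS credentials)."""
    import boto3
    firehose = boto3.client("firehose")
    for name, prefix in STREAMS.items():
        firehose.create_delivery_stream(
            DeliveryStreamName=name,
            DeliveryStreamType="DirectPut",
            ExtendedS3DestinationConfiguration=delivery_stream_config(prefix, role_arn),
        )
```

The IAM role passed in must grant Firehose write access to the bucket.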
Step 3. Create a new thing in the AWS IoT console, then create a rule with the following parameters.
- Rule query statement:
SELECT Noise, Temperature, X, Y, Z, deviceID, dateTime FROM 'omega2/3D0D'
- Actions:
"Send messages to an Amazon Kinesis Firehose"
Stream name: IoT_Source
Separator: \n (newline)
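The same rule can be created programmatically. This sketch builds the topic-rule payload with the query and Firehose action above; the rule name is hypothetical, and the role ARN is a placeholder for a role allowed to write to "IoT_Source".

```python
RULE_SQL = ("SELECT Noise, Temperature, X, Y, Z, deviceID, dateTime "
            "FROM 'omega2/3D0D'")

def topic_rule_payload(role_arn):
    """Payload for iot.create_topic_rule, mirroring the console settings above."""
    return {
        "sql": RULE_SQL,
        "actions": [{
            "firehose": {
                "roleArn": role_arn,
                "deliveryStreamName": "IoT_Source",
                "separator": "\n",
            }
        }],
    }

def create_rule(role_arn):
    """Register the rule with AWS IoT (requires AWS credentials)."""
    import boto3
    boto3.client("iot").create_topic_rule(
        ruleName="omega2_to_firehose",  # hypothetical rule name
        topicRulePayload=topic_rule_payload(role_arn),
    )
```

The newline separator matters: it keeps one JSON record per line in S3, which the later CSV/analytics steps rely on.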
Step 4. Create an AWS Kinesis Analytics application named "IoT_Data_Analytics" with the following parameters:
- Source: Firehose delivery stream "IoT_Source"
- Real-time analytics SQL code:
CREATE OR REPLACE STREAM "DESTINATION_SQL_Data_STREAM" (Noise INTEGER, Temperature DECIMAL(4,2), X DECIMAL(4,2), Y DECIMAL(4,2), Z DECIMAL(4,2), deviceID VARCHAR(4), dateTime TIMESTAMP);
CREATE OR REPLACE PUMP "STREAM_PUMP_1" AS INSERT INTO "DESTINATION_SQL_Data_STREAM"
SELECT STREAM "Noise", "Temperature", "X", "Y", "Z", "deviceID", "dateTime" FROM "SOURCE_SQL_STREAM_001";
CREATE OR REPLACE STREAM "DESTINATION_SQL_AGGREGATE_STREAM" (dateTime TIMESTAMP, maxX DECIMAL(4,2), minX DECIMAL(4,2), maxY DECIMAL(4,2), minY DECIMAL(4,2), maxZ DECIMAL(4,2), minZ DECIMAL(4,2));
CREATE OR REPLACE PUMP "STREAM_PUMP_2" AS INSERT INTO "DESTINATION_SQL_AGGREGATE_STREAM"
SELECT STREAM FLOOR("SOURCE_SQL_STREAM_001".ROWTIME TO MINUTE) AS "dateTime", MAX("X") AS "maxX", MIN("X") AS "minX", MAX("Y") AS "maxY", MIN("Y") AS "minY", MAX("Z") AS "maxZ", MIN("Z") AS "minZ" FROM "SOURCE_SQL_STREAM_001" GROUP BY FLOOR("SOURCE_SQL_STREAM_001".ROWTIME TO MINUTE);
CREATE OR REPLACE STREAM "DESTINATION_SQL_AGGREGATE_TEMP" (dateTime TIMESTAMP, maxTemperature DECIMAL(4,2), minTemperature DECIMAL(4,2));
CREATE OR REPLACE PUMP "STREAM_PUMP_3" AS INSERT INTO "DESTINATION_SQL_AGGREGATE_TEMP"
SELECT STREAM FLOOR("SOURCE_SQL_STREAM_001".ROWTIME TO MINUTE) AS "dateTime", MAX("Temperature") AS "maxTemperature", MIN("Temperature") AS "minTemperature" FROM "SOURCE_SQL_STREAM_001" GROUP BY FLOOR("SOURCE_SQL_STREAM_001".ROWTIME TO MINUTE);
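To sanity-check the aggregate SQL locally, the per-minute min/max logic of STREAM_PUMP_2 can be mirrored in plain Python. The input records here are hypothetical; field names match the analytics schema.

```python
from collections import defaultdict

def aggregate_xyz(records):
    """Per-minute min/max of X, Y, Z -- a local mirror of STREAM_PUMP_2.
    Each record is a dict with a "dateTime" string ("YYYY-MM-DD HH:MM:SS")
    plus X, Y, and Z readings."""
    buckets = defaultdict(list)
    for r in records:
        minute = r["dateTime"][:16]  # floor the timestamp to the minute
        buckets[minute].append(r)
    return {
        minute: {
            "maxX": max(r["X"] for r in rs), "minX": min(r["X"] for r in rs),
            "maxY": max(r["Y"] for r in rs), "minY": min(r["Y"] for r in rs),
            "maxZ": max(r["Z"] for r in rs), "minZ": min(r["Z"] for r in rs),
        }
        for minute, rs in buckets.items()
    }
```

The tumbling one-minute window corresponds to the FLOOR(... TO MINUTE) grouping in the SQL.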
- Destinations: connect "DESTINATION_SQL_Data_STREAM" to the "IoT_Dest_Data" delivery stream, "DESTINATION_SQL_AGGREGATE_STREAM" to "IoT_Dest_Aggregate_XYZ", and "DESTINATION_SQL_AGGREGATE_TEMP" to "IoT_Dest_Aggregate_Temperature".
Step 5. Create a new QuickSight visualization using the data in the S3 bucket. Use a manifest.json file with the following contents:
{
"fileLocations": [
{
"URIPrefixes": [
"https://s3.amazonaws.com/omega2-data/data/"
]
}
],
"globalUploadSettings": {
"format": "CSV",
"delimiter": ",",
"containsHeader": "false"
}
}
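As a quick sanity check, the manifest can be validated locally before pointing QuickSight at it. This just confirms the file parses as JSON and references the expected bucket prefix:

```python
import json

MANIFEST = """
{
  "fileLocations": [
    {"URIPrefixes": ["https://s3.amazonaws.com/omega2-data/data/"]}
  ],
  "globalUploadSettings": {
    "format": "CSV",
    "delimiter": ",",
    "containsHeader": "false"
  }
}
"""

manifest = json.loads(MANIFEST)
uri = manifest["fileLocations"][0]["URIPrefixes"][0]
assert uri.endswith("omega2-data/data/")
assert manifest["globalUploadSettings"]["format"] == "CSV"
print("manifest OK:", uri)
```

Note that the URI prefix must match the prefix your "IoT_Dest_Data" Firehose stream writes under, or QuickSight will find no files.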