This is a proof of concept project. The Thingy:53 is an incredible IoT prototyping platform and its integrated sensors make it particularly suited to environmental sensing. It would be great if I could put one in every room of my house. Unfortunately, that is not within my maker's budget. My solution to this problem is to create a mobile platform to transport the Thingy:53 to the different rooms to periodically monitor the quality of the environment using the integrated Bosch BME688 sensor.
My long term goal is to monitor the performance of my HVAC system and possibly have some sort of control feedback. My short term goals for this project are to demonstrate that I can successfully move the Thingy:53 between rooms and to develop a model using the Edge Impulse integration to detect environmental anomalies.
Mobile Platform
This aspect of the project caused me more difficulty than I expected. I started out using a Robobloq Qoopers Robot Kit because I had one available. I ran into some issues with it and switched to a Freenove Smart Car Kit instead, but since the Qoopers did take a fair amount of time and effort, I thought I would document both.
The Qoopers is targeted toward STEM users and has very good build quality. It has 6 different build configurations and is mechanically easy to customize. The configuration that I built took less than an hour. The Thingy:53 case does not have any provisions for mounting, so I needed to print a mounting base to attach it to the Qoopers. The Thingy snaps into the base.
The pictures below show the kit parts and my build configuration with the Thingy mounted on the upper rear deck. The video shows my test of controlling the Qoopers via BLE using an app on my tablet. The movement was not very responsive over BLE, and I realized that controlling a robot with the precision required to move from room to room would be difficult without some simple location feedback (I had hoped to do it programmatically). The Qoopers uses a Q Mind+ controller (based on the Arduino Mega 2560), and the firmware is programmable using MyQode (block programming), the Arduino IDE, or Python. I had trouble finding the information I needed to program the BLE interface, and I ran into GPIO resource conflicts: there was no available hardware UART and no line-tracking sensor. So I decided to switch to a more capable robot.
Robobloq Qoopers Robot Kit
Freenove 4WD Raspberry Pi Car Kit
I decided that the easiest and most reliable way to navigate from room to room would be line following, so I switched to a robot car based on the Raspberry Pi platform. The robot kit does not include the RPi, so I added an RPi 3B+. The kit parts and the intermediate build steps are shown in the pictures below. I 3D printed a modified base to attach the Thingy to standoffs above the RPi 3B+.
The kit provides the following additional features:
- 4 wheel drive
- Line following array
- 5MP RPi camera V1.3
- Pan/Tilt assembly for camera/ultrasonic sensor
- Remote control via WiFi
- 18650 rechargeable Lithium batteries
The Smart Car is running on a standard RPi OS (Bullseye), so I won't go through the installation and configuration process. I enabled WiFi, SSH, VNC and the camera.
Freenove has a repository of Python code (https://github.com/freenove/freenove_4wd_smart_car_kit_for_raspberry_pi) that supports the Smart Car functions. In particular, there is a server program that interfaces with a Windows, Mac, or iOS application to control the Smart Car. I set that server to run automatically on boot.
The first check was to verify that the interface was responsive and that the Smart Car would operate correctly on the carpet connecting the rooms that I want to monitor. The first video shows the iOS application running on my iPad; I used AirDroid Cast to mirror the screen to my PC for recording. The control wheel on the lower left drives the vehicle, and the control on the right pans and tilts the camera/ultrasonic sensor.
The car is fairly responsive but takes a bit of practice to drive using the screen control.
The second check was to verify that the car would follow a track and return correctly. I initially tried printing the track pattern on paper with an inkjet printer, but I could not get the line-tracking sensors to work reliably within their range of adjustment. I found that the sensors work really well when I use black electrical tape to create the track pattern.
The next video shows the car tracking a pattern taped on my garage floor. I'll need to figure out how to implement this on the carpet.
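For reference, the decision logic behind a three-channel line follower like the one on this kit can be sketched as follows. This is my own simplified version, not code from the Freenove repository; pin handling and motor control are omitted:

```python
def steer(left: bool, mid: bool, right: bool) -> str:
    """Map three IR line-sensor channels (True = tape detected)
    to a steering command for the 4WD chassis."""
    if left and mid and right:
        return "stop"        # solid bar across all sensors: end/turnaround marker
    if mid and not left and not right:
        return "forward"     # centered on the line
    if left and not right:
        return "turn_left"   # line drifting left, steer left to re-center
    if right and not left:
        return "turn_right"  # line drifting right, steer right to re-center
    return "search"          # line lost (or ambiguous left+right reading)
```

In a real control loop this would run continuously, polling the sensor GPIOs and translating each command into differential motor speeds.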
A couple of images of what the camera sees.
The integration of the Edge Impulse framework with the Thingy:53 makes generating and deploying machine learning models extremely straightforward. Out of the box, the Thingy:53 comes pre-loaded with firmware that lets you sample raw data, build models, and deploy trained machine learning models using the nRF Edge Impulse app, available for iOS and Android devices.
To use the app, you need to sign in to your Edge Impulse account and create a project for the app to interface with. My project dashboard is shown below.
Then on the app you need to log into your Edge Impulse account and connect the Thingy:53 via BLE on the Device tab.
When the Thingy:53 connects, you will see the device info and sensor capabilities.
And the Thingy:53 will show up on the Devices page of your project.
Now you can upload data either using the Data Acquisition page of your project or the Data Acquisition tab of the app as shown below.
Initially, I thought that the gas sensor could be used to classify the 4 different rooms. We have new flooring in all 4 rooms: carpet in 2 bedrooms and 2 different types of laminate flooring in the other bedroom and the bathroom. At least my nose can perceive a difference between the rooms, and a quick check indicated that there was a measurable difference: very small but repeatable.
I proceeded to collect a bunch of data. I took 10 environment (temperature, humidity, pressure, gas) measurements in each room at 7 different intervals during the day (to get the daily variation). I also took the same number of measurements in other areas (outside, garage) to create an "Other" class. I ended up with 46 minutes and 40 seconds of data.
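That total checks out arithmetically, assuming each measurement corresponds to one 8-second sample window:

```python
classes = 4 + 1            # 4 rooms plus the "Other" class
samples_per_session = 10   # measurements taken per visit
sessions_per_day = 7       # intervals spread over the day

total_samples = classes * samples_per_session * sessions_per_day
total_seconds = total_samples * 8            # 8 s per sample window
minutes, seconds = divmod(total_seconds, 60)
print(total_samples, minutes, seconds)       # 350 samples -> 46 min 40 s
```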
Then I split the data into Train/Test sets.
I designed an impulse with a Flatten processing block to look at the min/max/avg of the measurements in each sample (this might be superfluous, since the data doesn't change much in the 8-second sample window), and added one learning block for classification and one for anomaly detection.
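The output of the Flatten block, for the statistics I used, can be approximated like this. This is a sketch of the idea only; Edge Impulse's actual block offers additional statistics such as RMS and standard deviation:

```python
import numpy as np

def flatten_features(window: np.ndarray) -> np.ndarray:
    """Reduce one sample window to min/max/avg per axis.

    window has shape (n_readings, n_axes), with one column each for
    temperature, humidity, pressure, and gas resistance.
    Returns a flat vector of length 3 * n_axes for the learning blocks.
    """
    return np.concatenate([window.min(axis=0),
                           window.max(axis=0),
                           window.mean(axis=0)])
```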
The results in the Feature explorer were a bit disappointing. It seems that the variation in gas sensor readings over the course of a day was greater than the small difference that I saw between rooms. I wonder if the BME688 being poorly exposed in the Thingy case could reduce its responsiveness. Humidity had the highest feature importance.
I decided to try training a model with the dataset anyway, to see how it would perform.
The results were fairly miserable, as might be expected from this dataset. So identifying a room based on its environmental sensor data isn't going to work.
Anomaly detection, however, worked quite well.
I trained the anomaly detection block using the gas and temperature axes.
Then I gave it a low-temperature sample.
The result was a clear anomaly detection.
It also worked with a high-temperature sample. This one is really distinct because I left the unit in the sun too long and the temperature was more than 15 °C above the normal range.
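Edge Impulse's anomaly detection block clusters the training features with K-means; the scoring idea can be sketched roughly like this (my simplification, not their implementation):

```python
import numpy as np

def anomaly_score(sample: np.ndarray,
                  centroids: np.ndarray,
                  scales: np.ndarray) -> float:
    """Distance from a feature vector to the nearest learned cluster,
    normalized by that cluster's scale. Small scores mean the sample
    fits a cluster of "normal" data; much larger scores flag an anomaly,
    like the low- and high-temperature samples above."""
    distances = np.linalg.norm(centroids - sample, axis=1) / scales
    return float(distances.min())
```

With centroids fit on normal gas/temperature features, a reading well outside the training range lands far from every cluster and scores well above the threshold.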
Unfortunately, family issues prevented me from having enough time to build and deploy a new model and to write the code to integrate the Smart Car control with the Thingy:53 within the contest timeframe. I thought that it was important to document what I did accomplish.
I want to thank Hackster, Nordic Semiconductor, and Edge Impulse for the opportunity to participate in the contest.
I need to do further work with the BME688 to determine the sensitivity of the gas sensor. I'd like to compare measurements from a Nicla Sense ME against measurements from the Thingy:53 to see how much effect the Thingy:53 case is having. It may be that I need a much longer settling time after moving between rooms before taking measurements.