A project utilizing the Hackster EEDU Kit - Getting Started with AI.
The aim of this project is to create a Rehearsal Room Occupancy System to solve the problem of scheduling conflicts for music students at the University of Massachusetts Amherst. The system will allow students to easily sign up for time in rehearsal rooms and provide real-time occupancy tracking so that students are aware of available rooms.
The Rehearsal Room Occupancy System is an innovative solution designed to enhance the student experience at the University of Massachusetts Amherst. By allowing students to easily sign up for time in rehearsal rooms and providing real-time occupancy tracking, the system streamlines the process and ensures that students have access to the resources they need to pursue their musical studies.
I was awarded the Hackster EEDU Kit - Getting Started with AI. The kit contains a HuskyLens AI vision sensor, a FireBeetle ESP32-E controller, an IO shield, a relay module, a DFPlayer Pro mini MP3 player, a speaker, and 10 packs of jumper wires. It is designed to help users jumpstart their AI projects.
At the heart of the system are the FireBeetle ESP32-E with its IO Shield and the HuskyLens.
The system consists of the ESP32 board connected to the HuskyLens to recognize students. The board communicates with the Arduino Cloud to provide real-time occupancy data that will be displayed on a screen. The relay is used to unlock the rehearsal room door, and the DFPlayer with its MP3 speaker is used to welcome the student. Additionally, if time permits, the system will be integrated with a web application that displays real-time room availability and allows students to reserve time slots in the rehearsal rooms. The system will also track occupancy and provide alerts when the rooms are full, preventing students from reserving a room that is already occupied.
The main features of this system include real-time occupancy tracking, a web application for reservations, and alerts for full rooms.
The University of Massachusetts Amherst is the school where the system will be implemented, and the project is a proof of concept design and implementation using the contents of the EEDU AI kit.
I am a retired senior software engineer with a background in computer science. I have extensive experience with embedded systems, firmware development environments, and open-source embedded development tools, along with knowledge of various programming languages and communication protocols.
I received the EEDU AI Kit on April 14, 2023. There is a great unboxing video by Alex that really describes the kit; the AI kit section starts at timestamp 3:40.
video by Alex: https://www.youtube.com/live/jXMwtu0voF4?feature=share&t=220
Some other really well done YouTube videos on the use of the HuskyLens:
Easy Machine Learning on Arduino/Raspberry Pi with DFRobots HuskyLens! - YouTube
Machine Vision with HuskyLens - YouTube
HuskyLens: 6 AI Algorithms in the Palm of Your Hand - YouTube
Experiment with the kit components
I read through the Getting Started Guide, which shows how to develop with an ESP32/ESP8266 development board and the Arduino Cloud. I tried two of the projects that pertain to my design: number 1, Face Recognition Switch, and number 4, VOC Index Monitoring (specifically its second example, the MP3 player). I referred back to this page when I was designing and developing the content for my project.
Experiment with the Face Recognition Switch project, which uses the DFRobot HuskyLens camera to recognize a human face. If a learned face is recognized, the relay turns on; if the face is unlearned, it stays off. The project is an excellent example demonstrating how to wire the HuskyLens to the IO Shield and program the ESP32 in the Arduino Cloud. Of the three projects in the guide, this is the one that uses the components in the AI kit.
Face Recognition Switch project results:
- Connected all the components.
- Copied the example code to my cloud account, then built and programmed the firmware onto the ESP32 board.
- Ran the example and verified it was working.
To test the DFPlayer, I used the example from section 4, "VOC Index Monitoring".
I had to solder the headers onto the DFPlayer and mount it on a breadboard to make the connections to the ESP32 shield. I used the following wiring diagram from the example to connect the DFPlayer.
Wires from the DFPlayer to the Gravity Shield:
RED WIRE - DFPlayer + pin to Shield + pin
BLK WIRE - DFPlayer GND pin to Shield - pin
GREEN WIRE - DFPlayer RX pin to Shield D10 pin
BLUE WIRE - DFPlayer TX pin to Shield D11 pin
Wires from the DFPlayer to the speaker:
RED WIRE - Speaker red wire to DFPlayer R+ pin
BLK WIRE - Speaker black wire to DFPlayer R- pin
Here's a photo showing the connections.
Transfer audio files into the MP3 Player module
Before use, connect the module to the computer using a USB cable and store the audio files in the MP3 player module.
Name the audio files as follows. Use the function 'DF1201S.playFileNum(/*File Number = */1);' to play the audio file with the specified number.
The test code is Test_MP3_Player_DFR0768.ino and can be found in the code section below. When powered on, the module enters music mode and starts playing the audio files, and the serial monitor prints the current operation.
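Since the sketch itself is in the code section, here is a minimal sketch of the same idea, assuming the DFRobot_DF1201S library, a spare hardware UART on the ESP32, and the D10/D11 wiring shown above. The D10/D11 pin macros come from the FireBeetle board definition; substitute the matching GPIO numbers if your board package does not define them.

```cpp
// Minimal DFPlayer Pro (DFR0768) test - a sketch of the idea, not the exact example code
#include <DFRobot_DF1201S.h>

HardwareSerial mp3Serial(1);          // use UART1 of the ESP32 for the DFPlayer
DFRobot_DF1201S DF1201S;

void setup() {
  Serial.begin(115200);
  // ESP32 RX on the pin wired to DFPlayer TX (D11), TX on the pin wired to DFPlayer RX (D10)
  mp3Serial.begin(115200, SERIAL_8N1, /*rx=*/D11, /*tx=*/D10);

  while (!DF1201S.begin(mp3Serial)) {
    Serial.println("DFPlayer Pro init failed - check the wiring");
    delay(1000);
  }
  DF1201S.setVol(20);                        // volume range is 0-30
  DF1201S.switchFunction(DF1201S.MUSIC);     // music mode (not USB-drive mode)
  DF1201S.playFileNum(1);                    // play the first stored audio file
  Serial.println("Playing file 1");
}

void loop() {
}
```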
Using the HuskyLens with Edge Impulse
I decided to try the Edge Impulse platform to build a dataset and deploy an ML model so that the HuskyLens could learn the ID card and perform face recognition. Hopefully this section will qualify for a chance at the "Best Use of AI Award - Sponsored by Edge Impulse".
I researched how to use the HuskyLens with Edge Impulse, and these are the steps I followed.
- Set up HuskyLens: Make sure you have properly set up and connected your HuskyLens device. You can refer to the HuskyLens documentation for instructions on how to connect it to your computer or microcontroller.
- Create an Edge Impulse account: Go to the Edge Impulse website (https://www.edgeimpulse.com/) and create a new account if you don't have one already.
- Create a project: Once you're logged in to your Edge Impulse account, create a new project by clicking on "Create a new project." Give your project a name and proceed to the next step.
- Collect training data: In the Edge Impulse Studio, you need to collect training data for your machine learning model. You can use the HuskyLens to capture images and label them accordingly. Capture images of different ID cards (driver's licenses, college IDs, credit cards) and assign appropriate labels to them, then capture student photos and assign appropriate labels to them.
- Upload and label the data: Upload the training images to Edge Impulse and assign the corresponding labels. This step helps the system understand the patterns and train a model based on the provided data.
- Train the model: Once you have uploaded and labeled your training data, proceed to train the model. Edge Impulse provides various machine learning algorithms and options to train your model. Select the appropriate settings and start the training process.
- Evaluate and test the model: After the training is complete, evaluate the performance of your model using the provided metrics and tools. You can also test the model with new images to see how well it performs.
- Export the model: Once you are satisfied with the model's performance, export the model from Edge Impulse. You will receive a file or files containing the trained model that can be used with your HuskyLens.
- Implement the model on HuskyLens: Take the exported model and follow the instructions provided by HuskyLens on how to integrate custom machine learning models. This typically involves uploading the model to the device and configuring it to use the model for inference.
- Deploy and use the model: With the model integrated into HuskyLens, you can now deploy it and use it for real-time Student ID detection and Student Photo recognition.
To implement the model on HuskyLens, you'll need to follow the device-specific instructions provided by the HuskyLens documentation. Here's a general outline of the steps you need to follow:
- Prepare the model: Ensure that you have the model file exported from Edge Impulse in a compatible format. HuskyLens typically supports models in formats like TensorFlow Lite (.tflite) or ONNX (.onnx). Check the HuskyLens documentation to confirm the supported model formats.
- Connect HuskyLens to your computer: Connect your HuskyLens device to your computer using the appropriate interface, such as USB or Wi-Fi, as specified by the device's documentation. Ensure that the device is powered on and recognized by your computer.
- Access the HuskyLens interface: Open the HuskyLens graphical user interface (GUI) or use the command-line interface (CLI) provided by the device. The interface allows you to configure and manage the device's settings, including deploying custom models.
- Upload the model: Using the HuskyLens interface, locate the option to upload or deploy a custom model. Follow the instructions provided by HuskyLens to upload the model file you exported from Edge Impulse.
- Configure the model settings: Once the model is uploaded, you may need to configure additional settings, such as input/output dimensions, preprocessing options, or any specific requirements of the HuskyLens device. Refer to the HuskyLens documentation for guidance on configuring the model.
- Start the model deployment: After the model and settings are configured, initiate the deployment process through the HuskyLens interface. This step will activate the model and enable it to perform inference on the input data it receives.
- Test the model: With the model deployed on HuskyLens, you can now test its performance. Use the device's camera to capture images or input data, and observe the model's output or predictions. Ensure that the results align with your expectations and meet the desired accuracy.
This section includes a design diagram of the Rehearsal Room Occupancy System. The diagram depicts the wiring of the kit components and the actions and reactions when a music student presents their Student ID to the HuskyLens camera.
I used this ID card to test the system. I used the ML capabilities of the HuskyLens and Edge Impulse Studio to recognize this ID as a valid card to let me into the rehearsal room.
- When a music student presents their Student ID to the HuskyLens camera, the ID is processed using the object recognition logic in the HuskyLens.
- The results of the scan are sent to the ESP32 for processing. The data is sent over the supplied Gravity connector.
- The ESP32 processes the data by branching on conditions based on the ID card.
- Valid card condition: If the ID card is recognized as a valid college ID card, then the picture on the card is processed using face recognition on the HuskyLens. If the picture is recognized as a valid music student, the ESP32 unlocks the rehearsal room door and welcomes the student to the room with audio from the MP3 player over the attached speaker.
- Invalid card conditions: Several conditions indicate an invalid ID card: an invalid college ID, a valid ID where the student is not a music major, or a card that is not recognized as an ID card at all. For these conditions the door does not unlock, and the audio from the speaker gives the reason for not allowing access.
- The HuskyLens is programmed to recognize valid or invalid ID cards.
- The HuskyLens is trained to recognize a valid college ID and the faces of the registered music students.
- The ESP32 is programmed to handle several conditions sent from the HuskeyLens.
- Based on a valid ID card (a valid college ID card and the student is a music major), the relay is sent a command to switch the door lock.
- Also, based on the condition results of the scan, the DFPlayer is sent the proper MP3 file to play over the speaker.
- Testing
For the testing I created two sketches based on the two examples above. I also tested the training of the ID cards for the Edge Impulse model mentioned above, and I tested the relay separately because the relay connection in the Test-ID-Recognition example was wired to digital pin 6, and that did not work.
- The Testing CODE
NAME: Test_Relay - example code
This code uses pin 2 instead of pin 6.
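The full example is in the code section; a minimal version of the relay test, assuming pin 2 drives the relay's signal line through the Gravity shield, could look like this:

```cpp
// Test_Relay - minimal relay test (a sketch of the idea, not the exact example code)
const int RELAY_PIN = 2;           // "pin 2" from my test; the example's pin 6 did not work here

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);    // start with the relay (door lock) released
}

void loop() {
  digitalWrite(RELAY_PIN, HIGH);   // energize the relay - you should hear it click
  delay(2000);
  digitalWrite(RELAY_PIN, LOW);    // release the relay
  delay(2000);
}
```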
NAME: Test-ID-Recognition - example code
This code was used to test the HuskyLens and was taken from the "Face Recognition Switch" example described above. It can be found in the code section on this page.
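For reference, here is a condensed sketch of that test, assuming the HuskyLens is connected over the I2C Gravity connector, the official HUSKYLENS Arduino library is installed, and the relay is on the same pin 2 as above (the exact example code is in the code section):

```cpp
// Test-ID-Recognition (condensed) - recognize a learned face and switch the relay
#include <Wire.h>
#include "HUSKYLENS.h"

HUSKYLENS huskylens;
const int RELAY_PIN = 2;                       // same relay pin as the Test_Relay sketch

void setup() {
  Serial.begin(115200);
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);

  Wire.begin();
  while (!huskylens.begin(Wire)) {             // retry until the camera answers on I2C
    Serial.println("HuskyLens not found - check the wiring and its protocol setting");
    delay(500);
  }
  huskylens.writeAlgorithm(ALGORITHM_FACE_RECOGNITION);   // switch to face recognition
}

void loop() {
  if (huskylens.request() && huskylens.isLearned() && huskylens.available()) {
    HUSKYLENSResult result = huskylens.read();
    if (result.ID > 0) {                       // a learned face is in view
      digitalWrite(RELAY_PIN, HIGH);           // close the relay (unlock)
      Serial.print("Recognized face ID ");
      Serial.println(result.ID);
    }
  } else {
    digitalWrite(RELAY_PIN, LOW);              // unknown or no face: keep locked
  }
  delay(200);
}
```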
NAME: MP3 Player (DFR0768) - example code
This was used to test the DFPlayer and to verify the connections mentioned above.
- Running the system
For the final sketch, I combined the test sketches into one sketch.
- The Final CODE
NAME: RehearsalRoomOccupancy
This is the final code, combining the three code examples and the Edge Impulse model. It can be found in the code section on this page.
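As an illustration of how the pieces fit together, here is a condensed sketch of the main flow. It is not the submitted code itself; the relay pin, the MP3 track numbers, the unlockAndWelcome() helper, the D10/D11 serial pins, and the two-stage switch between the HuskyLens object and face algorithms are all assumptions based on the design description above.

```cpp
// RehearsalRoomOccupancy (condensed) - HuskyLens ID check + face check -> relay + MP3 feedback
#include <Wire.h>
#include "HUSKYLENS.h"
#include <DFRobot_DF1201S.h>

const int RELAY_PIN = 2;             // door-lock relay
const int TRACK_WELCOME     = 1;     // "Welcome, the door is unlocked"  (assumed file order)
const int TRACK_INVALID_ID  = 2;     // "This is not a valid college ID"
const int TRACK_NOT_A_MAJOR = 3;     // "Access is limited to music majors"

HUSKYLENS huskylens;
HardwareSerial mp3Serial(1);
DFRobot_DF1201S DF1201S;

void setup() {
  Serial.begin(115200);
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);

  Wire.begin();
  while (!huskylens.begin(Wire)) { delay(500); }           // wait for the camera

  mp3Serial.begin(115200, SERIAL_8N1, /*rx=*/D11, /*tx=*/D10);
  while (!DF1201S.begin(mp3Serial)) { delay(500); }        // wait for the DFPlayer
  DF1201S.setVol(20);
  DF1201S.switchFunction(DF1201S.MUSIC);
}

void unlockAndWelcome() {
  digitalWrite(RELAY_PIN, HIGH);                // unlock the rehearsal room door
  DF1201S.playFileNum(TRACK_WELCOME);           // greet the student
  delay(5000);                                  // keep the door unlocked briefly
  digitalWrite(RELAY_PIN, LOW);                 // lock it again
}

void loop() {
  // Stage 1: look for a learned ID card (HuskyLens object-learning mode).
  huskylens.writeAlgorithm(ALGORITHM_OBJECT_CLASSIFICATION);
  delay(300);
  if (huskylens.request() && huskylens.available()) {
    HUSKYLENSResult card = huskylens.read();
    if (card.ID == 1) {                         // ID 1 = learned "valid college ID"
      // Stage 2: check the student's photo/face with face recognition.
      huskylens.writeAlgorithm(ALGORITHM_FACE_RECOGNITION);
      delay(300);                               // give the camera time to switch modes
      if (huskylens.request() && huskylens.available() && huskylens.read().ID > 0) {
        unlockAndWelcome();                     // learned music-student face
      } else {
        DF1201S.playFileNum(TRACK_NOT_A_MAJOR); // valid card, unrecognized face
      }
    } else if (card.ID != 0) {
      DF1201S.playFileNum(TRACK_INVALID_ID);    // recognized object, but not a valid ID
    }
  }
  delay(500);
}
```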
Video of the operation of the prototype
Project Conclusions
This concludes my project for my entry into the Smart Campus 2023 challenge with Arduino, DFRobot, and Edge Impulse here on Hackster. I learned a lot from this experience: I was able to experiment with AI face and object recognition using the DFRobot HuskyLens vision sensor, the FireBeetle ESP32 board, and the Gravity I/O Shield with the Arduino Cloud, DFRobot Gravity connectors, and the DFPlayer and digital relay modules. I also gained more ML experience with Edge Impulse Studio.
Arduino is a powerful and versatile platform for developing IoT solutions. It allows for easy integration with various sensors and actuators, enabling the creation of smart and connected devices. DFRobot provides reliable and user-friendly hardware components that are well suited for IoT projects; their products offer a good balance of performance, affordability, and ease of use. Edge Impulse is an effective machine learning platform for developing AI models on edge devices. It simplifies the process of collecting, labeling, and training data, making it accessible even for developers without extensive ML experience. Integrating these three technologies opens up a wide range of possibilities for data analysis and decision-making in a smart campus setting.
Participating in challenges like the Smart Campus 2023 has provided me with valuable hands-on experience in developing real-world IoT applications. It allows you to learn from others, discover new technologies, and showcase your skills and creativity. Collaborative platforms like Hackster.io foster a supportive and engaging community of developers. Engaging with this community can lead to new connections, learning opportunities, and inspiration for future projects.
Developing IoT solutions for a smart campus setting has the potential to improve various aspects of campus life, such as energy efficiency, security, and resource management. It highlights the importance of technology in creating smarter and more sustainable educational environments.
Not all smooth sailing
I did have problems implementing the project, which I'll describe in the following section.
I had a little trouble connecting the FireBeetle ESP32-E to the Arduino Cloud, even though I have extensive experience using the Cloud with Arduino boards. I had trouble getting the correct board definition and could not program the firmware onto the board; I was getting the error pictured below.
I was able to get help from the community and resolved my issue.
Here is what I found to get the FireBeetle ESP32 connected to the Arduino Cloud Editor. To connect the board, follow these steps:
- Ensure that you are using a PC with Windows 10 (64-bit) operating system. If so, download and install the 64-bit Windows Create Agent, which is available from the Arduino website.
- After installing the Create Agent, you will be prompted to install the COM driver. Click on the prompt to proceed with the installation.
- Locate the Create Agent icon in the system tray on your PC. Right-click on it and select "Go to Arduino Create." This will open your default web browser (in my case, MS Edge) and connect you to the Arduino Cloud.
- Depending on whether you are logged in or not, you may be directed to the login page or directly to the web editor. If you are logged in, navigate to the web editor to proceed. For simplicity, let's focus on accessing the web editor this way for now. There are other ways to access the WEB Editor.
- Once you are in the web editor, connect the FireBeetle ESP32 board to your PC using a USB cable. You should hear a beep, and a message will pop up saying "COM port is connected." However, the FireBeetle board may not be recognized yet.
- To connect the FireBeetle board and configure it within the web editor, follow these steps. Please note that these steps may vary slightly depending on your specific setup, but the general process remains the same.
- The board selector pulldown bar shows up. Your screen will be different, but the board selector screen will be visible. Pull down the selector button and select Board COM X; on this PC it's COM4.
- Once selected, a dialog box shows up. Enter "Firebeetle" in the search box and all the supported boards will be displayed. Select FireBeetle-ESP32, use the default parameters to the right, and press OK. Now you can connect and sync the firmware as described.
Please note that if you are using a new PC with the Create Agent for the first time or switching between different PCs, you will need to repeat these steps to configure the programmer and connect the FireBeetle board. This might seem a bit confusing if you have previously used the Arduino Cloud with other boards, as the recognition process may differ.
Possible Future Enhancements
In this section I will list other functionality that could be added to make the system better.
- Originally I mentioned a database and a web interface to schedule rooms. This functionality was not designed into the present system due to time constraints. This would make sense for the next iteration of the system.
- I could add more AI recognition capabilities for the entry conditions.
- I could connect the values of the Occupancy of the rehearsal rooms to the Arduino Cloud using a dashboard.
- I would like to get the code to run in the Arduino IDE. The Arduino Cloud Web Editor makes installing library support much easier than the IDE running on the PC, but I like the debug capabilities of the IDE on the PC.
1. Gravity: Huskylens - An Easy-to-use AI Camera Vision Sensor WikiPage
Projects:
1. AI Projects made simple by Huskylens
2. DIY Line Tracking Robot with Huskylens and Romeo
3. Easy Machine Learning on Arduino/Raspberry Pi with HuskyLens!
4. DIY Amazing Object Tracking Robot using Huskylens
5. Can a $4 Raspberry Pi Pico Run an AI Project?
2. FireBeetle 2 ESP32-E IoT Microcontroller with Header (Supports Wi-Fi & Bluetooth) WikiPage
Projects:
1. Make a Home-made Network Clock with ESP32-E
2. DIY 3D Printed IoT Weather Station Using an ESP32
3. Gravity: IO Shield for FireBeetle 2 (ESP32-E/M0) WikiPage
4. Fermion: DFPlayer Pro - A mini MP3 Player with On-board 128MB Storage WikiPage
5. Stereo Enclosed Speaker - 3W 8Ω Product Webpage
6. Gravity: Digital 10A Relay Module WikiPage