I found a tiny Xiao ESP32S3 with an OV2640 camera sensor in a drawer, which seemed like the perfect match for this project. But how can I connect this camera to Alvik?
Even though the robot has what appear to be female connectors for every ESP32 pin, I wasn’t able to send or receive data through them; only the power pins worked. Well, that’s something. Since both the Alvik main board and the XIAO cam board use an ESP32, I thought about using ESP-NOW, a fast communication protocol between ESP boards.
The procedure is pretty straightforward: you add the destination MAC address as a peer, and then you just send the data.
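In MicroPython, that flow looks roughly like the sketch below (on the camera side the project uses the equivalent Arduino C++ ESP-NOW calls; the peer MAC address and the message payload here are placeholders):

import network
import espnow

# ESP-NOW rides on the Wi-Fi radio, so the station interface must be active
sta = network.WLAN(network.STA_IF)
sta.active(True)

e = espnow.ESPNow()
e.active(True)

peer = b'\x30\xae\xa4\xff\xcc\x12'  # placeholder: destination MAC address
e.add_peer(peer)                    # register the destination as a peer
e.send(peer, b'1')                  # then just send the data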
Alvik will move forward while the camera takes pictures and feeds them to a Machine Learning model. If the police officer (in the form of a Lego figure) is detected, a red light will appear on the RGB LEDs and a turn will be performed. Of course, this is an arbitrary action that can be changed to anything else.
Additionally, Alvik will save a log file named log.txt with all the communications with the camera. So, there are several robot functions involved in this project: movement, communications, file management, lights, and Tiny Machine Learning.
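On the MicroPython side, appending to that log takes only a couple of lines. This helper is just an illustration (the log.txt name comes from the project; the function name and timestamp format are assumptions):

import time

def log_event(msg):
    # append each camera message to log.txt, prefixed with a timestamp
    with open('log.txt', 'a') as f:
        f.write('{}: {}\n'.format(time.time(), msg))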
Parts required
- Arduino Alvik Robot
- Seeed Studio XIAO ESP32S3 Sense with cam
- 3D printed parts (cam case and support bars)
- 2 x male-to-female jumper cables
I took around 110 pictures of the Lego police figure and a ball, then uploaded the pictures to Edge Impulse. Next, I created an Impulse and trained the model.
After training, I exported the trained model as an Arduino library using the EON compiler. Finally, I imported the zip file into the Arduino IDE.
I opened Documents\Arduino\libraries\Alvik_robot_vision_inferencing\src\edge-impulse-sdk\classifier\ei_classifier_config.h and changed
#define EI_CLASSIFIER_TFLITE_ENABLE_ESP_NN 1
#define ESP_NN 1
#endif // ESP32 check
#if defined(CONFIG_IDF_TARGET_ESP32S3)
#define EI_CLASSIFIER_TFLITE_ENABLE_ESP_NN_S3 1
to
#define EI_CLASSIFIER_TFLITE_ENABLE_ESP_NN 0
#define ESP_NN 1
#endif // ESP32 check
#if defined(CONFIG_IDF_TARGET_ESP32S3)
#define EI_CLASSIFIER_TFLITE_ENABLE_ESP_NN_S3 0
Finally, before uploading, I selected OPI PSRAM under Tools in the Arduino IDE.
Note: if you get error 2 while uploading, you have to press the RST and BOOT micro buttons with a pencil or some other tiny object until you hear the USB connection sound.
Circuit
Actually, there is no circuit. Just connect the power lines from Alvik’s ESP32 headers to the cam.
I found a well-designed case for the XIAO ESP32S3 Sense, but I still needed to create some kind of support. So, I modeled two support bars. These bars are connected to Alvik using a pair of M3 screws.
The camera is placed at the back of the robot; otherwise, the weight would cause Alvik’s nose to point downward.
Cam Code
I added https://raw.githubusercontent.com/espressif/arduino-esp32/gh-pages/package_esp32_index.json to the boards manager URLs field in Arduino IDE Preferences.
I configured XIAO_ESP32S3 as the board.
I modified the default Edge Impulse inference cam script to send each detection in a coded format: for example, 1 for police, 2 for ball, and so on.
Before uploading, I also configured Alvik’s MAC address. To get the MAC address, run this script on Alvik first:
import network
import ubinascii
# activate the Wi-Fi station interface and read its MAC address
wlan_sta = network.WLAN(network.STA_IF)
wlan_sta.active(True)
wlan_mac = wlan_sta.config('mac')
print(ubinascii.hexlify(wlan_mac).decode())
Then copy and paste the MAC address into the .ino code. Each pair of hex digits becomes one array entry, so a MAC such as 30aea4ffcc12 would become {0x30, 0xAE, 0xA4, 0xFF, 0xCC, 0x12}:
uint8_t broadcastAddress[] = {0x00, 0x00, 0x00, 0x00, 0x00, 0x00};
You can also set up the confidence level. Example: 90%
float detectionConfidence = 0.9;
Alvik Code
Alvik is capable of running C++ and MicroPython. For this project, I coded a script in MicroPython to move while listening for cam signals. In the case of a police detection, Alvik will turn around and display a red light.
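Here is a minimal sketch of what that listening loop could look like. It assumes the espnow module is available in Alvik’s MicroPython firmware and that message code 1 follows the convention from the cam script; the timeout, turn angle, and exact structure are illustrative, not the project’s exact code:

from arduino_alvik import ArduinoAlvik
import network
import espnow

alvik = ArduinoAlvik()
alvik.begin()

# ESP-NOW needs the Wi-Fi station interface active; receiving needs no peer
sta = network.WLAN(network.STA_IF)
sta.active(True)
e = espnow.ESPNow()
e.active(True)

alvik.set_wheels_speed(-10, -10)  # camera faces backwards, so drive in reverse

while True:
    host, msg = e.recv(100)  # wait up to 100 ms for a message from the cam
    if msg:
        with open('log.txt', 'a') as f:  # keep a trace of every message
            f.write('{}\n'.format(msg))
        if msg == b'1':  # 1 = police figure detected
            alvik.left_led.set_color(1, 0, 0)   # red warning lights
            alvik.right_led.set_color(1, 0, 0)
            alvik.rotate(180)                   # turn around
            alvik.set_wheels_speed(-10, -10)    # resume moving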
In Arduino Labs for MicroPython, select your files folder. Connect Alvik using a USB-C cable and click the connect button. Upload the .py file to Alvik and modify main.py to include AiLink.py so the script starts when Alvik starts.
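For example, a minimal main.py (assuming the script was uploaded as AiLink.py) is just the import:

# main.py runs automatically at boot; importing the script starts it
import AiLink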
Speed
You may notice that for this project, Alvik moves slowly, and there is a reason for that. The XIAO ESP32S3 Sense needs to take pictures and make inferences. Thanks to Edge Impulse optimizations, the process is really fast even with limited resources. However, if the robot moves quickly toward the object, it might run over the Lego figure before the image recognition completes.
Anyway, if you want to change the speed, edit the -10 values, for example to -20:
alvik.set_wheels_speed(-10, -10)
Note: the speed is negative since the robot is going backwards (the camera is placed at the back).
Demo
Final notes
I’m sure there will soon be documentation about integrating a faster AI camera using Grove connectors and MicroPython code only. In the meantime, this was a fun project to make.
More Alvik projects
Besides this project, I also made three other programs:
- Find Parking Spot and send Telegram alerts
- Explore and Report to a Remote Screen
- Maker Counterculture talk
Feel free to ask if you need further assistance or have any other questions!