This article is also posted on my website, which includes more information about the code if you're interested. I also have links there to details about the kit and to another project using a DFRobot voice recognition module. I'm grateful to Adeept for giving me the car to use for these projects. I have a few Adeept vehicles, including their tank, and they are all pretty cool. I really like the motor shields, as they have a lot of functionality built in.
In addition, I'm grateful to the team at Seeed Studio for the Grove Vision AI Module V2, which I'm using with this project. I was given one of the units to use for the Smart Eyes contest they are hosting.
With this project I wrote Arduino code to control the motors of the omnidirectional robot. The robot kit is designed to work with CircuitPython, but since it comes with the associated schematic, it's easy to get the pin-out and convert the code to C++.
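As a hedged sketch of what that conversion looks like, the pin numbers below are placeholders rather than the kit's actual assignments; the real GPIO numbers come from the schematic:

// Hypothetical pin assignments -- the real GPIO numbers come from the
// Adeept schematic; these values are placeholders, not the kit's actual pins.
const int MOTOR_FL_IN1 = 4;  // front-left motor, direction pin 1
const int MOTOR_FL_IN2 = 5;  // front-left motor, direction pin 2
const int MOTOR_FL_PWM = 6;  // front-left motor, speed (PWM) pin

void setup() {
  pinMode(MOTOR_FL_IN1, OUTPUT);
  pinMode(MOTOR_FL_IN2, OUTPUT);
  pinMode(MOTOR_FL_PWM, OUTPUT);
}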
The key to the omnidirectional movement is the unique layout of the wheels and their associated rollers. As a result of this configuration, it can move in all nine directions, plus it can turn, depending on which wheels it drives for the movement.
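To make that concrete, here is a minimal sketch of the usual mecanum-style wheel mixing; the function is illustrative, not the kit's actual API:

// Illustrative mecanum-style mixing (not the kit's actual API):
// vx = forward speed, vy = strafe speed, w = rotation, each in [-1, 1].
// Each output is the signed speed for one wheel.
void mixWheels(float vx, float vy, float w, float out[4]) {
  out[0] = vx + vy + w;  // front-left
  out[1] = vx - vy - w;  // front-right
  out[2] = vx - vy + w;  // rear-left
  out[3] = vx + vy - w;  // rear-right
}

Strafing right (vy = 1) spins the front-left and rear-right wheels forward and the other two backward, which is what the rollers turn into sideways motion.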
To hold the camera, I designed a mount that carries the Vision AI Module V2 along with its camera: https://www.printables.com/model/922657-seeed-vision-sensor-v2-mount-for-adeept-omni-direc
This mount attaches to the car via an attachment point near the front. I decided against using the front face of the robot and its associated servo to keep the project complexity in check, given that robotics is a new field for me.
Preparing the Robot
To get Arduino configured, you can follow the guide from Espressif. The short of it is that you need to add the board support package located at
https://espressif.github.io/arduino-esp32/package_esp32_index.json
Then, in the Boards Manager, type "Espressif" to find and install the package inside Arduino.
Once you have the code ready, you will need to put the device into bootloader mode to flash it. In a sense you play a game of "Operation" with the Banana Pi PicoW S3: carefully use a pair of tweezers to bridge the "BOOT" solder points while you press the button to reset.
For the configuration options in Arduino, I relied on the documentation for the related Banana Pi Leaf S3, which worked in this case as well.
My code includes three projects. The first two are for testing that everything is set up correctly. I had to write the C++ control code for the motor shield myself, so I wanted to make sure I didn't make any mistakes. The motor test code exercises the servo control and the motors, ensuring the device behaves properly.
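As a rough illustration of what that test does, here is a minimal sketch; it assumes the MotorControl API and direction constants used later in this article, so treat the names as approximate rather than the exact code in my repository:

// Cycle through each movement so every motor and direction gets exercised.
// Assumes the motorControl instance and constants shown later in the article.
void testMotors() {
  const int directions[] = { FORWARD, BACKWARD, LEFT, RIGHT, TURN_LEFT, TURN_RIGHT };
  for (int dir : directions) {
    motorControl.moveCar(MOVE, dir, 50);  // drive at speed 50
    delay(500);                           // run for half a second
    motorControl.stopAllMotors();
    delay(250);                           // pause before the next direction
  }
}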
I won't go into all of the code here but will focus on the vision sensor elements:
int personX = Infer.boxes()[0].x;      // horizontal center of the first detected box
int personWidth = Infer.boxes()[0].w;  // box width, used as a proxy for distance
bool moveLeft = personX < 80;          // face is left of center
bool moveRight = personX > 140;        // face is right of center
bool moveForward = personWidth < 45;   // box is small: face is far away
bool moveBackward = personWidth > 50;  // box is large: face is too close
bool turnLeft = personX < 40;          // face is far left: rotate rather than strafe
bool turnRight = personX > 180;        // face is far right: rotate rather than strafe
The servo control logic shown in the SSCMA examples was a great starting point for learning how to use the API response here. I did some experiments with my face at various distances to determine the size of the box when I was close and further away. The x value is the center of the bounding box and the w value is the width of the box. Note the gap between the forward (w < 45) and backward (w > 50) thresholds: it acts as a dead band so the car doesn't oscillate when the face is at roughly the right distance.
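For context, the snippet above assumes a face was actually detected; in the full loop the inference result is guarded before indexing into the boxes. A minimal sketch of that guard, assuming the Seeed_Arduino_SSCMA library and an instance named Infer as in my snippets:

#include <Seeed_Arduino_SSCMA.h>

SSCMA Infer;  // vision module instance

void setup() {
  Infer.begin();
  Serial.begin(115200);
}

void loop() {
  // invoke() returns 0 on success; only read boxes when at least one exists
  if (!Infer.invoke() && Infer.boxes().size() > 0) {
    int personX = Infer.boxes()[0].x;      // horizontal center of the box
    int personWidth = Infer.boxes()[0].w;  // width, the distance proxy
    // ... movement decisions as shown below ...
  }
}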
I then used the above variables to control the car’s actual movement based on their state.
if (turnLeft) {
  Serial.println("Turn Left");
  motorControl.moveCar(MOVE, TURN_LEFT, 50);
  delay(100);
  motorControl.stopAllMotors();
  lcd.print("Turn Left");
} else if (turnRight) {
  Serial.println("Turn right");
  motorControl.moveCar(MOVE, TURN_RIGHT, 50);
  delay(100);
  motorControl.stopAllMotors();
  lcd.print("Turn Right");
Turning moves fairly quickly (in retrospect, perhaps I should have used a smaller speed value), which can be a problem when using face sensing, so I opted to turn for only 100 ms at a time, giving the sensor a chance to catch the face again after the move.
} else if (moveForward) {
  lcd.print("Action: Moving");
  lcd.setCursor(0, 1);
  if (moveLeft) {
    Serial.println("Moving left forward");
    motorControl.moveCar(MOVE, LEFT_FORWARD, 50);
    lcd.print("Left Forward");
  } else if (moveRight) {
    Serial.println("Moving right forward");
    motorControl.moveCar(MOVE, RIGHT_FORWARD, 50);
    lcd.print("Right Forward");
  } else {
    Serial.println("Moving forward");
    motorControl.moveCar(MOVE, FORWARD, 50);
    lcd.print("Forward");
  }
For the forward (and backward, not shown here) movement, the logic determines whether it should also drift left or right and, if so, moves diagonally in that direction.
} else if (moveLeft) {
  lcd.print("Action: Moving");
  lcd.setCursor(0, 1);
  Serial.println("Moving left");
  motorControl.moveCar(MOVE, LEFT, 50);
  lcd.print("Left");
} else if (moveRight) {
  lcd.print("Action: Moving");
  lcd.setCursor(0, 1);
  Serial.println("Moving right");
  motorControl.moveCar(MOVE, RIGHT, 50);
  lcd.print("Right");
} else {
  Serial.println("Stopping all motors");
  motorControl.stopAllMotors();
  lcd.print("Action: Stopped");
  lcd.setCursor(0, 1);
}
If no forward or backward movement is needed, the car strafes left or right instead, and if no movement is needed at all, it stops the motors.
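If I revisit the movement code, one cleanup I'm considering is collapsing that if/else chain into a single helper that makes the priority order (turn, then forward/backward with drift, then strafe, then stop) explicit. A hedged sketch, reusing the flags defined earlier; the backward constants are assumed analogs of the forward ones:

// Highest-priority action wins; returns -1 when the car should stop.
// Assumes LEFT_BACKWARD/RIGHT_BACKWARD/BACKWARD mirror the forward constants.
int pickDirection() {
  if (turnLeft)     return TURN_LEFT;
  if (turnRight)    return TURN_RIGHT;
  if (moveForward)  return moveLeft ? LEFT_FORWARD : (moveRight ? RIGHT_FORWARD : FORWARD);
  if (moveBackward) return moveLeft ? LEFT_BACKWARD : (moveRight ? RIGHT_BACKWARD : BACKWARD);
  if (moveLeft)     return LEFT;
  if (moveRight)    return RIGHT;
  return -1;  // no target action: stop the motors
}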
Conclusion and Future Thoughts
Face control works fairly well for the robot. I could probably continue to improve the movement code to better keep the face in the viewport, and I think raising the height of the vision camera would help it track adults a bit better as well. Obviously it's face tracking, so you need to keep looking at it, but it makes for a fun toy to play with the kids, as they really enjoy interacting with it.
The Seeed Vision AI Module V2 was also a nice product to work with. Flashing the built-in models was fairly easy, and I have already flashed my own custom-trained YOLO model with no issues as well (using the SSCMA examples to train it).