The main goal of this robot is to seek out any object it detects with its ultrasonic sensor and charge at it. The ultrasonic sensor is mounted on top of the robot and connects to the GPIO pins on the orange board. The robot operates in three states: a patrolling state, a charging state, and a backing state. Transitions between these states are triggered by the distance the ultrasonic sensor reports.
Ultrasonic Sensor

The ultrasonic sensor measures the distance to an object by firing a sound pulse and waiting for it to bounce back. We time how long it takes the echo to return, multiply by the speed of sound, and divide by two (the pulse travels out and back) to get the distance. The sensor is good at detecting a flat surface that is parallel to it; however, when the surface is slanted, the sound wave does not bounce back toward the sensor, so it outputs its maximum value. Instead of using a box as a demo object, I chose a cylindrical target, which the sensor can detect from any angle.
Additionally, the sensor data is sometimes very noisy. The echo distance can spike to a very high number, which causes many false actions, since every action of the robot depends on the echo distance. To fix this, I implemented a fifth-order median filter. Inside the eCAP1 interrupt function, I save the five most recent echo-distance samples, then use a helper function that sorts the sample array and returns the median value. This filter significantly improved the robot's performance.
States of the Robot

All of the robot's state logic lives inside CPU timer 2, which is called every 12 milliseconds. Since one cycle of the ultrasonic trigger process takes 12 milliseconds, I want the commands to stay in sync with how fast the echo distance is updated. Each state has its own counter that controls its actions.
The default state is the patrolling state, which is active whenever the ultrasonic distance is greater than 100 cm. Patrolling consists of two actions: rotating and stepping forward. A counter drives these actions: for the first 400 counts (4.8 seconds) the robot slowly turns to the left, then for the next 100 counts (1.2 seconds) it drives forward. The process repeats until the robot finds a target within 100 cm.
The second state is the charging state. If the ultrasonic sensor detects an object within the range of 15 cm to 70 cm, the robot decides whether it should charge or not. If the robot decides the object is a wall, it avoids it and returns to the patrol state; if it decides the object is a target, it charges at it. To make the decision, the robot first records the echo distance when it initially detects the object, then keeps turning left for 70 counts and records a second echo distance. If the difference between the two records is greater than 40 cm, the robot recognizes the object as a target, turns back right for 70 counts, and charges at it. If the difference is less than 40 cm, the object is a wall, so the robot keeps turning left and drives away.

This was the most complicated part of the project, not because of the coding, but because of the nature of the ultrasonic sensor. As mentioned above, the sensor is terrible at detecting slanted surfaces. So while the robot is turning to record the second echo distance, if it turns too far, the wall becomes too slanted, the pulse fails to bounce back, and the second reading saturates at the maximum distance. When that happens, the difference between the two records is large, so the robot decides the object is a target and charges at the wall. If the robot turns too little, it might still see the target after the turn and conclude the target is a wall. Also, making the target smaller helps the robot distinguish it from a wall, but it also makes the target harder to detect in the first place. The result is a tradeoff: tuning that makes the robot reliable at recognizing walls tends to make it worse at recognizing targets, and vice versa. After numerous trials and errors, including changing the robot's turn angle, the target size, the timer counts, and the detection distances, I finally got fairly consistent results.
The ‘tusk’ mounted on the front of the robot is about 13 cm long, so once the robot makes impact, the sensor reads a distance of less than 15 cm, which triggers the backing state. The backing state is straightforward: the robot simply backs up for 80 counts (0.96 seconds).
Steering

I used code from lab 6 for the robot's steering control. The steering code runs inside CPU timer 1, which is also called once every 12 milliseconds. I implemented a decoupled PI controller on the motors, which integrates the error using the trapezoid rule and sums the proportional and integral terms.