The device is based on a DevBoard that runs the developer's software. The board connects to the controlled device via Bluetooth and transmits recognized voice commands and its tilt angles. The user holds this "remote control" in their hand and uses tilt and voice commands to control the robot or other devices that they adapt to work with the board, which is exactly what I hope this article helps with.
There are two microcontrollers on the board, ESP32-S3 and ESP32-C3.
S3 is already flashed, and the robot control program starts by default when the device is turned on. There is no access to the source code and no way to cross-flash S3.
Interaction and the ability to change anything on the controller itself are only available on the C3 part. The firmware source code for it is open source and can be found here (for those who want to understand more deeply or modify the interaction mechanism).
The first control method: the board works in voice recognition mode and selects from a fixed set of commands:
- ROBOT WAKE UP
- ROBOT SLEEP
- MANUAL CONTROL
- VOICE CONTROL
- GO RIGHT
- GO LEFT
- GO FORWARD
- GO BACK
- GO HOME
- SLOWER SPEED
- FASTER SPEED
- LIGHTS ON
- LIGHTS OFF
- PLAY MUSIC
The second control method is tilting the board. In this case, real-time values of the tilt angles around the X and Y axes are sent to the controlled device, where they can be converted into motion control commands.
To switch to the manual control mode, say the MANUAL CONTROL command and the board screen will change.
Voice commands are still recognized and sent in this mode, so you can continue to process them on the controlled device as well.
Ready implementation for CrowBOT BOLT
The easiest way to get started is to buy the control module and Elecrow's CrowBOT. The DevBoard developer has added factory firmware for it, including blocks to interact with the development board.
As I wrote above, the developer has already implemented interaction with the board "out of the box", but for a specific device: the CrowBOT Bolt from Elecrow. This is an educational robot that can be programmed in Arduino IDE, Letscode and MicroPython.
The factory firmware for the robot from the manufacturer is available on the Elecrow wiki, and it already implements the IR remote control that comes with the robot.
We are interested in comparing the robot's original firmware from the manufacturer with the new firmware from the control board's developer, in order to understand the code and use it as a basis for a program for any other device in our own projects.
The new firmware can be found here: https://github.com/Grovety/CrowBot_GRC_program
Main firmware changes
- Connecting to the DevBoard via Bluetooth
The original robot firmware already supports Bluetooth, since the robot can be equipped with a Bluetooth joystick for control. What interests us, though, is how the connection to our board is implemented.
- Response to voice commands
Handling of the voice control events coming from the board.
- DevBoard tilt control of the robot
The same applies to controlling the robot by tilting the board.
- Audible signal when driving backwards
The developer added useful little touches that make the firmware more interesting. For example, the robot now emits a warning sound whenever it reverses, much like many modern cars.
- Backlighting at turns (turn signals)
Turn-signal lighting was also added, which makes using the CrowBOT more colorful and exciting.
Let's take a closer look at the changes
At the very beginning of the program, the header files of the libraries used for multitasking are included:
#include <freertos/FreeRTOS.h>
#include <freertos/message_buffer.h>
#include <freertos/projdefs.h>
These libraries provide the means to create and manage tasks and to pass messages between them. Later on, this will help with playing audio and processing commands from the DevBoard.
Next, the Bluetooth UUIDs of the DevBoard are defined:
static BLEUUID serviceUUID("FFE0"); // Host service UUID
static BLEUUID charUUID("FFE1");
This specifies the device to which our robot will connect.
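For context, connecting to a device with these UUIDs using the ESP32 Arduino BLE stack looks roughly like the following simplified sketch (not the exact firmware code; myDevice stands for the advertised DevBoard found during scanning):
#include <BLEDevice.h>

// assumes BLEDevice::init() has already been called and myDevice points to the discovered DevBoard
BLEClient* pClient = BLEDevice::createClient();
pClient->connect(myDevice);
BLERemoteService* pRemoteService = pClient->getService(serviceUUID);
if (pRemoteService != nullptr) {
  BLERemoteCharacteristic* pRemoteCharacteristic = pRemoteService->getCharacteristic(charUUID);
  // this characteristic is later used to subscribe to notifications from the DevBoard
}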
SemaphoreHandle_t xPlayMusicSemaphore;
SemaphoreHandle_t xDisconnectedSemaphore;
MessageBufferHandle_t xGrcCmdBuffer;
Here we declare the semaphores and the message buffer used to coordinate task execution and synchronize the tasks with each other.
What is a semaphore?
It is easiest to think of a semaphore as a plate with one candy on it. Tasks can check whether the candy is there, try to take it, and put it back after a while. In essence it is just a bool, but it behaves a bit more subtly because of multithreading and the issues that come with it. It is also worth noting that besides binary semaphores (one candy), there are counting semaphores (multiple candies).
In our case, we will use semaphores to communicate between tasks:
if (xSemaphoreTake(xPlayMusicSemaphore, portMAX_DELAY) == pdTRUE)
For example, this line means "Wait for as long as it takes for a candy to appear on the plate, and take it as soon as it appears".
xSemaphoreGive(xPlayMusicSemaphore);
And that line means: "Put the candy back where it belongs".
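To make the analogy concrete, here is a minimal FreeRTOS sketch of two tasks sharing one "candy"; it is purely illustrative and not part of the robot firmware:
#include <freertos/FreeRTOS.h>
#include <freertos/semphr.h>
#include <freertos/task.h>

SemaphoreHandle_t xCandy;

void eater_task(void* arg) {
  for (;;) {
    // block until the candy appears, then take it immediately
    if (xSemaphoreTake(xCandy, portMAX_DELAY) == pdTRUE) {
      // ... react to the event ...
    }
  }
}

void giver_task(void* arg) {
  for (;;) {
    vTaskDelay(pdMS_TO_TICKS(1000));
    xSemaphoreGive(xCandy); // put the candy back on the plate
  }
}

// somewhere in setup():
// xCandy = xSemaphoreCreateBinary();
// xTaskCreate(eater_task, "eater", 1024, NULL, 1, NULL);
// xTaskCreate(giver_task, "giver", 1024, NULL, 1, NULL);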
#define MAX_GRC_MSG_LEN 20
This defines the maximum length of a message from the DevBoard.
#define MOTOR_MIN_VALUE 30
#define MOTOR_MAX_VALUE 255
The interval of absolute values supplied to the motors. Without this limit, the robot's motors just hum but don't work. Adding a minimum speed helps to eliminate noise at low power.
template<class T>
constexpr const T& clamp(const T& v, const T& lo, const T& hi)
{
  return std::less<T>{}(v, lo) ? lo : std::less<T>{}(hi, v) ? hi : v;
}
The clamp function limits a value to a given range by specifying its lower and upper bounds.
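A couple of illustrative calls (values picked just for the example):
int pwm = clamp(300, -255, 255); // 300 is above the upper bound, so the result is 255
int ok  = clamp(-10, -255, 255); // -10 is already inside the range, returned unchanged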
void poly_transform(const float params[12], float x, float y, float *_x, float *_y) {
  *_x = params[0] + params[1] * x + params[2] * y +
        params[3] * powf(x, 2) + params[4] * x * y +
        params[5] * powf(y, 2);
  *_y = params[6 + 0] + params[6 + 1] * x + params[6 + 2] * y +
        params[6 + 3] * powf(x, 2) + params[6 + 4] * x * y +
        params[6 + 5] * powf(y, 2);
}
This function performs a polynomial coordinate transformation. This is used to convert the tilt angles received from the DevBoard into the powers supplied to the motors. Note that this transformation is non-linear. As mentioned before, the working power interval for the motors is [-255... 255], and the received angle is in the interval [-45... 45].
If we take 3 three-dimensional points for each motor (roll, pitch and expected power value) and interpolate, we can get a formula for this transformation. For example, for the left motor we take the points:
(45, 45, 256) - the DevBoard is tilted forward and to the right, so the left motor should run at maximum power to turn right.
(-22.5, 45, 128) - the DevBoard is tilted forward and slightly to the left, meaning we should move forward while also turning left, so we expect about half motor power.
(0, 0, 0) - no tilt, no movement.
If we perform a bilinear interpolation, we obtain the formula for the left motor: l = 1.3274074074074107 * x + 3.792592592592593 * y + 0.012641975308641975 * x * y
The values are similar for the right motor.
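A quick sanity check of the formula against the reference points (coefficients rounded here for readability):
// full right turn: x = 45, y = 45
// l = 1.3274 * 45 + 3.7926 * 45 + 0.01264 * 45 * 45 ≈ 59.7 + 170.7 + 25.6 ≈ 256
// forward with a slight left turn: x = -22.5, y = 45
// l = 1.3274 * (-22.5) + 3.7926 * 45 + 0.01264 * (-22.5) * 45 ≈ -29.9 + 170.7 - 12.8 ≈ 128
// no tilt: x = 0, y = 0  ->  l = 0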
void led_callback() // Callback function
{
  FastLED.show();
}
static bool moving_backwards = false;
void buzzer_callback() // Callback function
{
  if (moving_backwards) {
    ledcWriteTone(2, G4); // Buzzer
    delay(150);
    ledcWrite(2, 0);
  }
}
The led_callback and buzzer_callback procedures are used to update the state of the LEDs and control the buzzer respectively. They will be called at certain intervals.
void play_music_task(void* arg)
{
  for (;;) {
    if (xSemaphoreTake(xPlayMusicSemaphore, portMAX_DELAY) == pdTRUE)
    {
      if (!moving_backwards) {
        for (int i = 0; i < Tone_length; i++) { // Buzzer
          ledcWriteTone(2, Tone[i]); // Buzzer
          delay(Time[i] * 1000);
        }
        ledcWrite(2, 0); // turn off Buzzer
      }
    }
  }
}
This task constantly tries to take the semaphore and, once it succeeds, plays the music. The Tone array contains the frequencies to play, and the Time array contains the duration of each note. The semaphore is released in the main task, which allows the PLAY MUSIC voice command to be executed asynchronously.
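The Tone, Time and Tone_length definitions themselves live elsewhere in the robot firmware; purely to illustrate the format this task expects, they could look something like this (the note values here are made up):
const int Tone[] = { 262, 294, 330, 349 };      // note frequencies in Hz (C4, D4, E4, F4)
const float Time[] = { 0.25, 0.25, 0.25, 0.5 }; // duration of each note in seconds
const int Tone_length = sizeof(Tone) / sizeof(Tone[0]);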
And here is the main task itself:
void grc_cmd_task(void* arg)
{
  const float d_speed = 0.5;
The step by which the speed factor changes (for the FASTER SPEED and SLOWER SPEED voice commands).
  float speed = 1;
The speed factor itself (applied to the GO FORWARD and GO BACK commands).
  bool imu_control_mode = false;
Whether we are in tilt control mode:
  auto init_imu_control_state = [&imu_control_mode]() {
    Serial.printf("Enable imu control mode\n");
    imu_control_mode = true;
    ticker.attach_ms(300, led_callback);     // lighting task
    ticker1.attach_ms(900, buzzer_callback); // buzzer task
  };
When we switch to tilt control mode, we start periodically calling led_callback and buzzer_callback.
  auto release_imu_control_state = [&imu_control_mode, &moving_backwards]() {
    Serial.printf("Disable imu control mode\n");
    imu_control_mode = false;
    moving_backwards = false;
    Motor(0, 0, 0, 0);
    ledcWrite(2, 0);
    fill_solid(leds, 4, CRGB::Black);
    FastLED.show();
    ticker.detach();
    ticker1.detach();
  };
When switching back to voice control mode, we turn off the motors and the LEDs and stop the periodic calls to led_callback and buzzer_callback:
  auto front_lights_cmd = [](bool front_lights_en) {
    uint8_t val = 0;
    if (front_lights_en) {
      FastLED.setBrightness(255); // RGB lamp brightness range: 0-255
      val = 255;
    }
    myRGBcolor6.r = val;
    myRGBcolor6.g = val;
    myRGBcolor6.b = val;
    fill_solid(RGBleds, 6, myRGBcolor6);
    FastLED.show();
  };
Turn on (front_lights_en == true) or turn off (front_lights_en == false) the "headlights".
  char msg[MAX_GRC_MSG_LEN];
A local buffer for the received message.
  for (;;) {
    size_t length = xMessageBufferReceive(xGrcCmdBuffer, &msg, MAX_GRC_MSG_LEN, portMAX_DELAY);
    std::string value(msg, length);
Wait for a message from the buffer; once one arrives, convert it into a string.
    const std::string::size_type coords_offset = value.find("XY");
    const bool recv_imu_coords = coords_offset != std::string::npos;
    if (recv_imu_coords && !imu_control_mode) {
      init_imu_control_state();
    }
The presence of XY in the message signals the receipt of data from the tilt sensor. If we are not in tilt control mode and we have received XY, switch to it.
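Judging by the parsing code below, a tilt message carries the "XY" marker followed by two signed bytes, and the marker may sit at any offset (hence the find). A hypothetical packet could look like this:
// ... 'X'  'Y'  <int8_t x>  <int8_t y> ...
uint8_t tilt_msg[4] = { 'X', 'Y', 0xF4, 0x1E }; // x = -12 (0xF4 as int8_t), y = 30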
    //******************************** GRC voice command ******************************************
    if (!imu_control_mode) {
      Serial.printf("Its characteristic value is: %s\n", value.c_str());
      if (value == "GO FORWARD") // forward
      {
        leds[2] = CRGB::Green;
        leds[3] = CRGB::Green;
        fill_solid(RGBleds, 6, myRGBcolor6);
        FastLED.show();
        Motor(float(160) * speed, 0, float(160) * speed, 0);
        delay(600);
        Motor(0, 0, 0, 0);
        fill_solid(leds, 4, CRGB::Black);
        FastLED.show();
      }

      if (value == "GO BACK") // backward
      {
        leds[0] = CRGB::Red;
        leds[1] = CRGB::Red;
        fill_solid(RGBleds, 6, myRGBcolor6);
        FastLED.show();
        Motor(0, float(160) * speed, 0, float(160) * speed);
        delay(600);
        Motor(0, 0, 0, 0);
        fill_solid(leds, 4, CRGB::Black);
        FastLED.show();
      }

      if (value == "GO RIGHT") // towards the right
      {
        leds[2] = CRGB::Green;
        fill_solid(RGBleds, 6, myRGBcolor6);
        FastLED.show();
        Motor(80, 0, 0, 80);
        delay(350);
        Motor(0, 0, 0, 0);
        fill_solid(leds, 4, CRGB::Black);
        FastLED.show();
      }

      if (value == "GO LEFT") // towards the left
      {
        leds[3] = CRGB::Green;
        fill_solid(RGBleds, 6, myRGBcolor6);
        FastLED.show();
        Motor(0, 80, 80, 0);
        delay(350);
        Motor(0, 0, 0, 0);
        fill_solid(leds, 4, CRGB::Black);
        FastLED.show();
      }

      if (value == "FASTER SPEED") // change speed
      {
        speed = std::min(1.5f, speed + d_speed);
      }

      if (value == "SLOWER SPEED") // change speed
      {
        speed = std::max(0.5f, speed - d_speed);
      }

      if (value == "LIGHTS ON") // enable lights
      {
        front_lights_cmd(true);
      }

      if (value == "LIGHTS OFF") // disable lights
      {
        front_lights_cmd(false);
      }

      if (value == "PLAY MUSIC") // toggle music
      {
        xSemaphoreGive(xPlayMusicSemaphore);
      }

      if (value == "MANUAL CONTROL") // enable imu_control_mode
      {
        init_imu_control_state();
      }
We execute commands in voice mode.
    } else {
      if (value == "VOICE CONTROL") // disable imu_control_mode
      {
        release_imu_control_state();
      }
      if (value == "LIGHTS ON") // enable lights
      {
        front_lights_cmd(true);
      }
      if (value == "LIGHTS OFF") // disable lights
      {
        front_lights_cmd(false);
      }
      if (value == "PLAY MUSIC") // toggle music
      {
        xSemaphoreGive(xPlayMusicSemaphore);
      }
And now the processing of voice commands in tilt control mode.
Let's move on to the tilt angle processing:
      //******************************** GRC imu command ******************************************
      if (recv_imu_coords) {
        int8_t ix, iy;
        memcpy(&ix, &value.c_str()[coords_offset + 2], sizeof(int8_t));
        memcpy(&iy, &value.c_str()[coords_offset + 3], sizeof(int8_t));
We read two bytes from the message: the X and Y tilt angles.
        float x, y;
        x = clamp(float(ix), -45.f, 45.f);
        y = clamp(float(iy), -45.f, 45.f);
We limit the values in the interval [-45... 45]
        static const float params[] = {
            1.689410597460024e-14,   1.3274074074074107,    3.792592592592593,
            -1.0409086843526395e-17, 0.012641975308641975,  -2.995326321410946e-18,
            3.215022330213294e-08,   -1.3274074134624825,   3.7925925925925945,
            8.344758175431135e-18,   -0.012641975308641976, -2.3814980193096928e-11};
        float l, r;
        poly_transform(params, x, y, &l, &r);
Converting tilt angles to motor values.
        if (l <= MOTOR_MIN_VALUE && l >= -MOTOR_MIN_VALUE) {
          l = 0;
        }
        if (r <= MOTOR_MIN_VALUE && r >= -MOTOR_MIN_VALUE) {
          r = 0;
        }
If the values supplied to the motors are small enough, we simply set them to zero. This is essentially just the "noise canceller" for the motors that was described above.
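For example, a slight forward tilt is ignored:
// x = 0, y = 5  ->  l = r ≈ 3.79 * 5 ≈ 19
// 19 is below MOTOR_MIN_VALUE (30), so both motors stay at 0 and the robot does not move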
This may need a little bit of explanation about the robot's "headlights":
In addition to the RGB LEDs (the high beam), there is also an array of 4 lights:
leds[0] - rear right
leds[1] - rear left
leds[2] - front right
leds[3] - front left
        fill_solid(leds, 4, CRGB::Black);
By default, the headlights are off.
        int lf, lb, rf, rb;
        if (r < 0.f) {
          rf = 0;
          rb = std::min(int(-r), MOTOR_MAX_VALUE);
        } else {
          rf = std::min(int(r), MOTOR_MAX_VALUE);
          rb = 0;
          myRGBcolor.r = 0;
          myRGBcolor.g = rf;
          myRGBcolor.b = 0;
          leds[3] = myRGBcolor;
        }
We limit the power of the right motor; if it runs forward (moving forward or turning left), we light the front left LED as a turn signal.
        if (l < 0.f) {
          lf = 0;
          lb = std::min(int(-l), MOTOR_MAX_VALUE);
        } else {
          lf = std::min(int(l), MOTOR_MAX_VALUE);
          lb = 0;
          myRGBcolor.r = 0;
          myRGBcolor.g = lf;
          myRGBcolor.b = 0;
          leds[2] = myRGBcolor;
        }
The same is done for the left motor.
        if (r < 0.f && l < 0.f) {
          myRGBcolor.r = rb;
          myRGBcolor.g = 0;
          myRGBcolor.b = 0;
          leds[0] = myRGBcolor;
          myRGBcolor.r = lb;
          myRGBcolor.g = 0;
          myRGBcolor.b = 0;
          leds[1] = myRGBcolor;
        }
When reversing, we turn on the red reverse lights.
        Motor(lf, lb, rf, rb);
        if (l < -MOTOR_MIN_VALUE && r < -MOTOR_MIN_VALUE) {
          moving_backwards = true;
        } else {
          moving_backwards = false;
        }
      }
    }
  }
}
Finally, we tell the motors their power.
This procedure is responsible for receiving messages from the control device.
static void NotifyCallback(BLERemoteCharacteristic* pBLERemoteCharacteristic, uint8_t* pData, size_t length, bool isNotify) {
  xMessageBufferSend(xGrcCmdBuffer, pData, length, 0);
}
Adds a message to the buffer that is processed by the main task.
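For this callback to fire, it has to be registered on the remote characteristic after connecting; in the ESP32 Arduino BLE API that is done roughly like this (simplified sketch):
if (pRemoteCharacteristic->canNotify()) {
  pRemoteCharacteristic->registerForNotify(NotifyCallback); // DevBoard notifications now land in the message buffer
}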
When the Bluetooth connection is lost,
// TODO: stop all activity on disconnect
Motor(0, 0, 0, 0);
xSemaphoreGive(xDisconnectedSemaphore);
we turn off the motors and release the semaphore.
xPlayMusicSemaphore = xSemaphoreCreateBinary();
xDisconnectedSemaphore = xSemaphoreCreateBinary();
xGrcCmdBuffer = xMessageBufferCreate(MAX_GRC_MSG_LEN * 5);
xTaskCreate(play_music_task, "play_music_task", 1024, NULL, 1, NULL);
xTaskCreate(grc_cmd_task, "grc_cmd_task", 4 * 1024, NULL, 1, NULL);
Here the semaphores, the message buffer, and the tasks are created.
When a connection to the device is established, we initialize the lights and motors with default values:
FastLED.setBrightness(255); // RGB lamp brightness range: 0-255
fill_solid(leds, 4, CRGB::Black);
fill_solid(RGBleds, 6, CRGB::Black);
FastLED.show();
Motor(0, 0, 0, 0);
ledcWrite(2, 0);
After connecting, we try to take the device-disconnect semaphore.
xSemaphoreTake(xDisconnectedSemaphore, portMAX_DELAY);
Essentially, we wait here for a disconnect signal; when we get it, we go back to scanning for devices.
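In other words, the connection handling presumably boils down to a loop along these lines (a rough sketch, not the literal firmware code):
for (;;) {
  // ... scan for the DevBoard and connect using the UUIDs shown above ...
  xSemaphoreTake(xDisconnectedSemaphore, portMAX_DELAY); // block here until the link drops
  // connection lost: loop around and start scanning again
}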
Conclusion
Let's summarize the discussed firmware changes:
1. Multithreading, so that commands can be processed in parallel with their reception, and sounds can be played asynchronously
2. Two control modes with different capabilities
3. Polynomial conversion of the DevBoard tilt angles into motor power values
As you can see from the article, this did not come without changes to the manufacturer's sources. The added functionality differs quite a lot from the stock behavior, so there are a fair number of changes. If you find something interesting here, feel free to use it in your own projects.
In particular, if you need voice control, all you have to do is receive messages over Bluetooth and process the voice commands with a simple string comparison; the rest is already implemented in the code above. For tilt control, take the two bytes that follow the "XY" marker in the message and interpret them as signed values. These will be the X and Y tilt angles.
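As a starting point, a message handler for your own device could look like this minimal sketch (handleGrcMessage is an illustrative name, not part of the original code; it assumes the message format described above):
#include <string>

void handleGrcMessage(const uint8_t* data, size_t length) {
  std::string value(reinterpret_cast<const char*>(data), length);
  std::string::size_type off = value.find("XY");
  if (off != std::string::npos && off + 3 < length) {
    int8_t x = static_cast<int8_t>(data[off + 2]); // roll,  roughly -45..45 degrees
    int8_t y = static_cast<int8_t>(data[off + 3]); // pitch, roughly -45..45 degrees
    // ... convert x and y into motion commands for your device ...
  } else if (value == "LIGHTS ON") {
    // ... react to a recognized voice command by simple string comparison ...
  }
}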
I hope this analysis helps those who want to use a control board for their equipment.
Also, it's worth mentioning that you can build other projects with the same dev board.
For example, a voice-activated PIN lock:
https://www.hackster.io/Grovety/autonomous-voice-activated-electromagnetic-lock-96ed9d