1 Tennis Picker
Tennis Picker is an automatic tennis-ball-picking robotic arm with an embedded Machine Learning engine running on the PSoC™ 62S2 core. Gestures captured by an accelerometer sensor are detected by an ML model built with the Infineon AI engine, and the classification result is read and used to control three DC motors that move the platform forward/backward and make the robotic arm pick up tennis balls.
The core part of the project is the ModusToolbox™ IDE used for coding, together with the ML Configurator, which can run a Machine Learning model based on TensorFlow Lite for gesture detection. The earlier idea of video-based detection is incompatible with the available memory: the smallest object-detection model, MobileNet, is 3.42M and cannot pass the memory test in the ML Configurator, so gesture detection is used instead.
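For reference only, here is a minimal sketch of how a model generated by the ML Configurator is typically loaded with the ModusToolbox™ ML middleware before it can be run; the model name MAGIC_WAND, the generated header name, and the object names are assumptions based on Infineon's gesture-classification example, not the project's actual sources:

#include "mtb_ml_model.h"
/* Header generated by the ML Configurator for the model (name/path is an assumption) */
#include "mtb_ml_model_magic_wand.h"

/* Global model object later used by the gesture task */
mtb_ml_model_t *magic_wand_obj;

static cy_rslt_t model_setup(void)
{
    /* Bind the model binary generated by the ML Configurator */
    mtb_ml_model_bin_t magic_wand_bin = { MTB_ML_MODEL_BIN_DATA(MAGIC_WAND) };

    /* Create the runtime model object (TensorFlow Lite based) */
    return mtb_ml_model_init(&magic_wand_bin, NULL, &magic_wand_obj);
}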
2 Hardware and Software
- One Seeed Grove 3-axis accelerometer sensor
- Three 6 V DC motor sets with drivers to move the frame holding the robotic arm.
- Hardware platform and robotic arm.
- Battery
- Infineon Motor driver BT7002
- PSoC™ 62S2 Wi-Fi BT Pioneer Kit, the core of the project, running the gesture-detection model and the motor control.
- ModusToolbox™ IDE with the Device Configurator and ML Configurator.
3 Code and build
Start the ModusToolbox™ IDE and configure the pins for the three motor controls. The pins are set in the Device Configurator. Since the BT7002 driver board uses the Arduino header pins, the motor control outputs are assigned to the corresponding Arduino pins, as sketched below.
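For illustration only, the pins written in the control code later (CYBSP_D9..CYBSP_D11) could also be initialized at run time with the PSoC™ HAL instead of the Device Configurator; the mapping to the BT7002 inputs is an assumption:

#include "cybsp.h"
#include "cyhal.h"

/* Initialize the motor/indicator control pins on the Arduino header as GPIO outputs */
static void motor_pins_init(void)
{
    cyhal_gpio_init(CYBSP_D9,  CYHAL_GPIO_DIR_OUTPUT, CYHAL_GPIO_DRIVE_STRONG, false);
    cyhal_gpio_init(CYBSP_D10, CYHAL_GPIO_DIR_OUTPUT, CYHAL_GPIO_DRIVE_STRONG, false);
    cyhal_gpio_init(CYBSP_D11, CYHAL_GPIO_DIR_OUTPUT, CYHAL_GPIO_DRIVE_STRONG, false);
}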
Then connect the Seeed Grove 3-axis accelerometer sensor to the PSoC™ 62S2 Wi-Fi BT Pioneer Kit.
The application runs on FreeRTOS and creates a gesture task at startup; a minimal sketch of the startup code is shown first, followed by the gesture task itself.
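The startup sketch below is an assumption modeled on typical ModusToolbox™/FreeRTOS examples; the stack size, priority, and retarget-IO setup are illustrative, not taken from the project:

#include "cybsp.h"
#include "cyhal.h"
#include "cy_retarget_io.h"
#include "FreeRTOS.h"
#include "task.h"

void gesture_task(void *arg);   /* defined below */

int main(void)
{
    /* Initialize the board support package and enable interrupts */
    cybsp_init();
    __enable_irq();

    /* Route printf() to the KitProg3 debug UART */
    cy_retarget_io_init(CYBSP_DEBUG_UART_TX, CYBSP_DEBUG_UART_RX,
                        CY_RETARGET_IO_BAUDRATE);

    /* Create the gesture task and hand control over to the FreeRTOS scheduler */
    xTaskCreate(gesture_task, "Gesture task", 1024, NULL, 2, NULL);
    vTaskStartScheduler();

    /* Should never get here */
    for (;;) { }
}

The gesture task samples the accelerometer, filters and normalizes the data, and feeds it to the model: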
void gesture_task(void *arg)
{
#if !GESTURE_DATA_COLLECTION_MODE
    /* Regression pointers */
    MTB_ML_DATA_T *input_reference;
#endif
    /* Data processed in floating point */
    float data_feed[SENSOR_BATCH_SIZE][SENSOR_NUM_AXIS];

    (void)arg;

    /* Initialize the butter-worth filter variables */
    int n_order = 3;
    /* Coefficients for 3rd order butter-worth filter */
    const float coeff_b[] = IIR_FILTER_BUTTER_WORTH_COEFF_B;
    const float coeff_a[] = IIR_FILTER_BUTTER_WORTH_COEFF_A;
    iir_filter_struct butter_lp_fil;

    for (;;)
    {
        uint16_t cur = 0;
        int16_t temp_buffer[SENSOR_BATCH_SIZE][SENSOR_NUM_AXIS];

        /* Get sensor data */
        sensor_get_data((void *) temp_buffer);

        /* Cast the data from an int16 to a float for pre-processing */
        cast_int16_to_float(&temp_buffer[0][0], &data_feed[0][0], SENSOR_BATCH_SIZE*SENSOR_NUM_AXIS);

        /* Third order butter-worth filter */
        while (cur < SENSOR_NUM_AXIS)
        {
            /* Initialize and run the filter */
            iir_filter_init(&butter_lp_fil, coeff_b, coeff_a, n_order);
            iir_filter(&butter_lp_fil, &data_feed[0][0], SENSOR_BATCH_SIZE, cur, SENSOR_NUM_AXIS);
            cur++;
        }

        /* A min max normalization to get all data between -1 and 1 */
        normalization_min_max(&data_feed[0][0], SENSOR_BATCH_SIZE, SENSOR_NUM_AXIS, MIN_DATA_SAMPLE, MAX_DATA_SAMPLE);

#ifdef CY_BMI_160_IMU_I2C
        /* Swap axis for BMI_160 so board orientation stays the same */
        column_inverse(&data_feed[0][0], SENSOR_BATCH_SIZE, SENSOR_NUM_AXIS, 2);
        column_swap(&data_feed[0][0], SENSOR_BATCH_SIZE, SENSOR_NUM_AXIS, 0, 1);
        column_inverse(&data_feed[0][0], SENSOR_BATCH_SIZE, SENSOR_NUM_AXIS, 5);
        column_swap(&data_feed[0][0], SENSOR_BATCH_SIZE, SENSOR_NUM_AXIS, 3, 4);
#endif

#if GESTURE_DATA_COLLECTION_MODE
        cur = 0;
        printf("-,-,-,-,-,-\r\n");
        while (cur < SENSOR_BATCH_SIZE)
        {
            printf("%6f,%6f,%6f,%6f,%6f,%6f\r\n", data_feed[cur][0],
                                                  data_feed[cur][1],
                                                  data_feed[cur][2],
                                                  data_feed[cur][3],
                                                  data_feed[cur][4],
                                                  data_feed[cur][5]);
            cur++;
        }
#else
    #if !COMPONENT_ML_FLOAT32
        /* Quantize data before feeding model */
        MTB_ML_DATA_T data_feed_int[SENSOR_BATCH_SIZE][SENSOR_NUM_AXIS];
        mtb_ml_utils_model_quantize(magic_wand_obj, &data_feed[0][0], &data_feed_int[0][0]);

        /* Feed the Model */
        input_reference = (MTB_ML_DATA_T *) data_feed_int;
        mtb_ml_model_run(magic_wand_obj, input_reference);
        control(result_buffer, model_output_size);
    #else
        input_reference = (MTB_ML_DATA_T *) data_feed;
        mtb_ml_model_run(magic_wand_obj, input_reference);
        control(result_buffer, model_output_size);
    #endif
#endif /* #if GESTURE_DATA_COLLECTION_MODE */
    }
}
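In the snippet above, magic_wand_obj, result_buffer, and model_output_size come from the model initialization, which is not shown. The model output is handed to a control() function; since the project only shows the switch body, the surrounding logic below is a sketch of one possible implementation that simply picks the class with the highest score:

/* Sketch only: determine the most likely gesture class before the switch.
 * The real control() may differ, e.g. it could use the mtb_ml_utils helpers. */
void control(MTB_ML_DATA_T *result_buffer, int model_output_size)
{
    int class_index = 0;

    /* Pick the output with the highest score */
    for (int i = 1; i < model_output_size; i++)
    {
        if (result_buffer[i] > result_buffer[class_index])
        {
            class_index = i;
        }
    }

    /* ... the switch on class_index shown below goes here ... */
}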
Inside control(), a switch on class_index then selects the detected gesture and drives the motors:
switch (class_index)
{
    case 0:
        printf("%s\r\n", gesture_one);
        //const char gesture_one[] = "Circle";
        cyhal_gpio_write(CYBSP_D9, CYBSP_LED_STATE_ON);
        motor(1);
        break;
    case 1:
        printf("%s\r\n", gesture_two);
        cyhal_gpio_write(CYBSP_D10, CYBSP_LED_STATE_ON);
        //const char gesture_two[] = "Side-to-Side";
        motor(2);
        break;
    case 2:
        printf("%s\r\n", gesture_three);
        cyhal_gpio_write(CYBSP_D11, CYBSP_LED_STATE_ON);
        //const char gesture_three[] = "Square";
        motor(3);
        break;
    case 3:
        printf("%s\r\n", gesture_four);
        //cyhal_gpio_write(CYBSP_D3, CYBSP_LED_STATE_ON);
        //const char gesture_four[] = "negative";
        motor(4);
        break;
}
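The motor() helper is not included in the snippet above. Below is a minimal sketch of what it could look like, assuming the four cases map to forward, backward, pick, and stop, and using hypothetical pin aliases for the BT7002 driver inputs; the real wiring and motion mapping are defined by the project hardware:

/* Hypothetical pin aliases: one direction pin and one enable pin for the
 * drive motors, plus one enable pin for the arm motor */
#define DRIVE_DIR_PIN  CYBSP_D2
#define DRIVE_EN_PIN   CYBSP_D3
#define ARM_EN_PIN     CYBSP_D4

/* Sketch only: react to the detected gesture class.
 * Assumed mapping: 1 = forward, 2 = backward, 3 = pick, anything else = stop. */
static void motor(int action)
{
    switch (action)
    {
        case 1:  /* drive forward */
            cyhal_gpio_write(DRIVE_DIR_PIN, true);
            cyhal_gpio_write(DRIVE_EN_PIN, true);
            break;
        case 2:  /* drive backward */
            cyhal_gpio_write(DRIVE_DIR_PIN, false);
            cyhal_gpio_write(DRIVE_EN_PIN, true);
            break;
        case 3:  /* run the arm to pick up a ball */
            cyhal_gpio_write(ARM_EN_PIN, true);
            break;
        default: /* negative gesture: stop everything */
            cyhal_gpio_write(DRIVE_EN_PIN, false);
            cyhal_gpio_write(ARM_EN_PIN, false);
            break;
    }
}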
4 Put it to work
Stack the motor driver board on the PSoC™ 62S2 Wi-Fi BT Pioneer Kit, then build the project and flash the binary to the board.
The code builds, flashes, and works as expected.
The project is a simple demo of how ML can be used on the PSoC™ 62S2 Wi-Fi BT Pioneer Kit; a more complex project with a larger model remains a challenging task for further exploration.