Due to the spread of the new coronavirus (COVID-19), people have become reluctant to touch things that an unspecified number of other people have touched.
From the viewpoint of preventing the spread of infection, the need for non-contact operation will grow compared with touch panels. Non-contact operation can also be used in place of a keyboard and mouse.
My solution is to use a motion sensor to detect the movement of human hands, and to interpret what that movement means using AI / Machine Learning (ML) processing.
2. How I Am Trying to Solve It
In the gesture recognition system, the PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit reads data from the motion sensor on the IoT Sense Expansion Kit and runs a gesture recognition AI model.
The SensiML Analytics Toolkit captures the motion sensor (BMX160) data from the IoT Sense Expansion Kit and, after the data is labeled, trains a gesture recognition model on the labeled datasets.
ModusToolbox™ Machine Learning imports the gesture recognition model from the SensiML Analytics Toolkit, optimizes it for the PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit, and generates the optimized model code.
The PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit runs the generated gesture recognition AI model to recognize hand movements, for example up/down or left/right, and sends the recognition results to an Android smartphone using BLE (Bluetooth Low Energy).
The Android smartphone receives the recognition results and scrolls its screen following the movement of the user's hands. For example, when the hands move up and down, the smartphone screen scrolls vertically; when the hands move left and right, the screen scrolls horizontally.
3. PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit Set Up
The PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit mounts on the IoT Sense Expansion Kit as follows.
The gesture recognition program code for the PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit is based on the ‘Bluetooth_LE_CAPSENSE_Buttons_and_Slider’ and ‘SensiML_Template_Firmware’ sample code in ModusToolbox. The full source code is available at https://github.com/tomosoft-jp/Gesture-Recognition.
4.1 BLE / IoT Sense Expansion Kit Set Up
Confirm the UUID of the BLE service characteristic. Double-click the ‘design.cybt’ file to display the BLE settings as follows, and confirm that the UUID is ‘0003CAB5-0000-1000-8000-00805F9B0131.’
To include the program code for the IoT Sense Expansion Kit, check ‘CY8CKIT-028-SENSE’ in the Library Manager as follows. ModusToolbox adds the ‘CY8CKIT-028-SENSE’ library to the project.
4.2 Sample Code Modification
Modify the ‘Bluetooth_LE_CAPSENSE_Buttons_and_Slider’ and ‘SensiML_Template_Firmware’ sample code as described in the steps below.
The ‘sml_output_results’ function in ‘sml_recognition_run.c’ prints the recognition results as text containing the key ‘Classification.’ The ‘sml_getclassification’ function parses this text and converts the classification result into an integer.
‘source/lib/sml_recognition_run.c’
…
extern char classification; // tomo Stationary:2 Vertical:3 Horizontal:1
void sml_getclassification(char *pbuf) {
    char *adr1 = strstr(pbuf, "Classification");
    if (adr1 == NULL) {
        return; /* keyword not found; keep the previous classification */
    }
    /* The class digit sits 16 characters after the start of "Classification" */
    classification = *(adr1 + 16) - '0';
}
void sml_output_results(int model_index, int model_result) {
    /* Format the model result as text into str_buffer, then parse it */
    kb_print_model_result(model_index, model_result, str_buffer, 1, fv_arr);
    sml_getclassification(str_buffer); // tomo
}
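The offset of 16 characters assumes that ‘kb_print_model_result’ places the class digit right after the ‘Classification’ key. The exact text format may differ between SensiML SDK versions, so the buffer below is only a hypothetical example; this standalone sketch checks the parsing logic on a host PC:
#include <stdio.h>
#include <string.h>

static char classification;

/* Same parsing logic as sml_getclassification() above */
static void getclassification(char *pbuf) {
    char *adr1 = strstr(pbuf, "Classification");
    if (adr1 == NULL) return;
    classification = *(adr1 + 16) - '0'; /* digit 16 chars after the key */
}

int main(void) {
    /* Hypothetical result text; verify the real format produced by
     * kb_print_model_result() against your SensiML SDK version. */
    char buf[] = "{\"ModelNumber\":0,\"Classification\":3,\"FeatureLength\":6}";
    getclassification(buf);
    printf("classification = %d\n", classification); /* expected: 3 */
    return 0;
}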
The ‘task_ble’ function in ‘ble_task.c’ sends data to the Android smartphone over BLE notifications. When no CapSense data arrives on the queue within 1000 ms, it calls the ‘ble_app_send_notification_classification’ function to send the recognition results instead.
‘source/ble_task.c’
…
bool button1_flg; // tomo button1 status 1:True
…
void task_ble(void *param) {
    BaseType_t rtos_api_result = pdFAIL;
    wiced_result_t result = WICED_BT_SUCCESS;

    /* Suppress warning for unused parameter */
    (void) param;

    /* Configure platform specific settings for the BT device */
    cybt_platform_config_init(&cybsp_bt_platform_cfg);

    /* Register call back and configuration with stack */
    result = wiced_bt_stack_init(ble_app_management_cb, &wiced_bt_cfg_settings);

    /* Check if stack initialization was successful */
    if (CY_RSLT_SUCCESS == result) {
        printf("Bluetooth stack initialization successful!\r\n");
    } else {
        printf("Bluetooth stack initialization failed!\r\n");
        CY_ASSERT(0);
    }

    /* Repeatedly running part of the task */
    for (;;) {
        /* Block until CapSense data is received over the queue,
         * or time out after 1000 ms */
        rtos_api_result = xQueueReceive(ble_capsense_data_q, &ble_capsense_data,
                1000 / portTICK_PERIOD_MS); // tomo
        if (pdPASS == rtos_api_result) {
            /* CapSense data received: forward it */
            ble_app_send_notification();
        } else {
            /* Timeout: send the latest gesture classification instead */
            ble_app_send_notification_classification();
        }
    }
} …
The ‘ble_app_send_notification_classification’ function in ‘ble_task.c’ sends a direction value to the Android smartphone. The direction is determined by the recognition result together with the CapSense button state.
‘source/ble_task.c’
…
// tomo stay:0 up:1 down:2 left:3 right:4
// tomo Stationary:2 Vertical:3 Horizontal:1
void ble_app_send_notification_classification(void) { // tomo
    wiced_bt_gatt_status_t status = WICED_BT_GATT_ERROR;
    char direction;
    if ((GATT_CLIENT_CONFIG_NOTIFICATION
            == app_capsense_button_client_char_config[0])
            && (0 != ble_connection_id)) {
        if (classification == 2) {
            direction = 0;
        } else if ((classification == 3) && !(button1_flg)) {
            direction = 1;
        } else if ((classification == 3) && (button1_flg)) {
            direction = 2;
        } else if ((classification == 1) && !(button1_flg)) {
            direction = 3;
        } else {
            direction = 4;
        }
        printf("direction: %d\n", direction);
        app_capsense_button[0] = direction;
        app_capsense_button[1] = 0;
        app_capsense_button[2] = 0;
        status = wiced_bt_gatt_server_send_notification(ble_connection_id,
                HDLC_CAPSENSE_BUTTON_VALUE, app_gatt_db_ext_attr_tbl[2].cur_len,
                app_gatt_db_ext_attr_tbl[2].p_data, NULL);
        if (WICED_BT_GATT_SUCCESS != status) {
            printf("Sending CapSense button notification failed\r\n");
        }
    }
}
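To restate the branching above, the mapping from the model's classification and the CapSense button state to the direction byte is equivalent to the following pure function (an illustrative sketch, not part of the project code):
#include <stdbool.h>

/* classification: Horizontal:1 Stationary:2 Vertical:3
 * return value:   stay:0 up:1 down:2 left:3 right:4    */
static char direction_from(char classification, bool button1_flg) {
    if (classification == 2) return 0;                   /* stationary -> stay          */
    if (classification == 3) return button1_flg ? 2 : 1; /* vertical -> down / up       */
    if (classification == 1 && !button1_flg) return 3;   /* horizontal, button 0 -> left */
    return 4;                                            /* otherwise -> right          */
}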
4.3 Data Collection Code
Switch the ModusToolbox project to data collection mode in ‘app_config.h.’
‘source/app_config.h’
/******************************************************************************
* Constants
*****************************************************************************/
// Running Modes
// 1 = DATA CAPTURE => Use this mode for collecting data and use the Data Capture Lab
// 2 = RECOGNITION => Use this mode for running a Knowledge pack from the sensor
#define DATA_CAPTURE_RUNNING_MODE 1
#define RECOGNITION_RUNNING_MODE 2
// Change the below to either DATA_CAPTURE_RUNNING_MODE (or) RECOGNITION_RUNNING_MODE
#define APPLICATION_RUNNING_MODE DATA_CAPTURE_RUNNING_MODE
// Type of Datacapture -
// 1 = SENSOR_MOTION
// 2 = SENSOR_AUDIO
#define SENSOR_MOTION 1
#define SENSOR_AUDIO 2
// Change the below to either SENSOR_MOTION (or) SENSOR_AUDIO
#define SENSOR_SELECT_MODE SENSOR_MOTION
// Motion sensor valid sample rates
#define MOTION_SAMPLE_RATE_400HZ 400
4.4 Data Recognition Code with Knowledge Pack
Extract the downloaded file; it contains the ‘knowledgepack_project’ and ‘sensiml’ folders as follows.
C:.
│ model.json
│
├─knowledgepack_project
│ app_config.h
│
└─sensiml
├─inc
│ kb.h
│ kb_debug.h
│ kb_defines.h
│ kb_typedefs.h
│ model_json.h
│ testdata.h
│
├─lib
│ libsensiml.a
│
└─src
Copy all of the files in ‘sensiml/lib’ and ‘sensiml/inc’ to the ‘source/lib’ folder of the ModusToolbox project. The ‘knowledgepack_project’ folder contains the ‘app_config.h’ for the ModusToolbox project; copy it to the ‘source’ folder of the ModusToolbox project.
‘source/app_config.h’
…
/******************************************************************************
* Constants
*****************************************************************************/
// Running Modes
// 1 = DATA CAPTURE => Use this mode for collecting data and use the Data Capture Lab
// 2 = RECOGNITION => Use this mode for running a Knowledge pack from the sensor
#define DATA_CAPTURE_RUNNING_MODE 1
#define RECOGNITION_RUNNING_MODE 2
// Change the below to either DATA_CAPTURE_RUNNING_MODE (or) RECOGNITION_RUNNING_MODE
#define APPLICATION_RUNNING_MODE RECOGNITION_RUNNING_MODE
// Type of Datacapture -
// 1 = SENSOR_MOTION
// 2 = SENSOR_AUDIO
#define SENSOR_MOTION 1
#define SENSOR_AUDIO 2
// Change the below to either SENSOR_MOTION (or) SENSOR_AUDIO
#define SENSOR_SELECT_MODE SENSOR_MOTION
// Motion sensor valid sample rates
#define MOTION_SAMPLE_RATE_400HZ 400
…
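For context, the knowledge pack exposes its recognition API through ‘sensiml/inc/kb.h.’ The sketch below shows a typical call flow; the function names, signatures, and argument meanings are assumptions based on common SensiML knowledge packs, so verify them against the downloaded headers:
#include "kb.h" /* knowledge pack API from sensiml/inc (assumed names) */

extern void sml_output_results(int model_index, int model_result);

#define MODEL_INDEX 0 /* index of the single model in this knowledge pack */

/* Feed one motion sample into the model. kb_run_model() is assumed to
 * buffer samples internally and return a class id once a whole segment
 * has been classified, or a negative value while still collecting data.
 * kb_model_init() is assumed to be called once at startup. */
static void gesture_recognize(SENSOR_DATA_T *sample, int num_sensors) {
    int result = kb_run_model(sample, num_sensors, MODEL_INDEX);
    if (result >= 0) {
        sml_output_results(MODEL_INDEX, result); /* print and parse (see ‘sml_recognition_run.c’ above) */
        kb_reset_model(MODEL_INDEX);             /* start collecting the next segment */
    }
}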
4.5 Android Smartphone Program Code
The Android smartphone receives the recognition results and scrolls its screen following the movement of the user's hands.
When it detects BLE advertising from the PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit, Android calls back the ‘onScanResult’ method of the ‘MyScancallback’ class. The ‘onScanResult’ method checks that the BLE device name equals ‘CapSense Button Slider.’
‘BleReceive.java’
…
class MyScancallback extends ScanCallback {
    @Override
    public void onScanResult(int callbackType, ScanResult result) {
        Log.d("myApp", "scanResult/start");
        try {
            if (mScanned) return;
            if (result.getDevice() == null) return;
            if (result.getDevice().getName() == null) return;
            Log.d("myApp", "onScanResult/" + result.getDevice().getName());
            if (result.getDevice().getName().contains("CapSense Button Slider")) {
                device = result.getDevice();
                mScanned = true;
                gattCallback = new MyGattcallback();
                device.connectGatt(appcontext, false, gattCallback);
                scanner.stopScan(scancallback);
            }
        } catch (SecurityException e) {
            Log.d("myApp", "SecurityException20");
        }
    }
}
…
When it receives the recognition results from the PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit, Android calls back the ‘onCharacteristicChanged’ method, which decides the scroll direction of the smartphone screen according to the recognition results. (‘TEMP_DATA’ is assumed to hold the characteristic UUID string confirmed in section 4.1.)
‘BleReceive.java’
…
public void onCharacteristicChanged(BluetoothGatt gatt,
        BluetoothGattCharacteristic characteristic) {
    Log.d("myApp", "onCharacteristic/change");
    if (characteristic.getUuid().toString().equals(TEMP_DATA)) {
        Log.d("myApp", "onCharacteristic" + characteristic.getUuid().toString());
        final byte[] t = characteristic.getValue();
        Log.d("myApp", "length:" + t.length);
        Log.d("myApp", String.format("value %x %x %x", t[0], t[1], t[2]));
        // tomo stay:0 up:1 down:2 left:3 right:4
        switch (t[0]) {
            case 0:
                mListener.onClickDirection(MainActivity.BTN_NOMOVE);
                break;
            case 1:
                mListener.onClickDirection(MainActivity.BTN_UP);
                break;
            case 2:
                mListener.onClickDirection(MainActivity.BTN_DOWN);
                break;
            case 3:
                mListener.onClickDirection(MainActivity.BTN_LEFT);
                break;
            case 4:
                mListener.onClickDirection(MainActivity.BTN_RIGHT);
                break;
        }
        appactivity.runOnUiThread(new Runnable() {
            @Override
            public void run() {
            }
        });
    }
}
…
5. Capturing Motion Data
SensiML Data Capture Lab captures the motion sensor (BMX160) data of hand movements over the COM port.
(1) After opening the project, click ‘Switch Modes’ in the center of the screen, then enter capture mode by clicking the ‘Capture’ button.
(2) Set ‘Sample Rate’ to ‘200’ in ‘Sensor Properties’ and click the ‘Next’ button.
(3) After clicking the ‘Scan’ button under ‘Connection Settings,’ select the COM port of the PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit and click the ‘Done’ button.
(4) The motion sensor (BMX160) data of the IoT Sense Expansion Kit arrives over the COM port and is displayed as follows.
(5) Click the ‘Project Explorer’ button at the top left of the screen, and SensiML Data Capture Lab displays the capture file as follows.
6. Labeling Data
SensiML Data Capture Lab labels the captured motion sensor (BMX160) data.
(1) After opening the project, click ‘Switch Modes’ and select the ‘Label Explorer’ button to open Label Explorer mode.
(2) Double-click the capture file, and SensiML Data Capture Lab displays its contents as follows.
(3) Right-click on the captured data, and SensiML Data Capture Lab displays a blue and an orange line. Drag these lines to adjust the selected area. The selected area appears under ‘Segments’ in ‘File Properties’ at the top right of the screen as ‘Start’ and ‘Length.’
(4) Click the ‘Edit’ button in ‘File Properties,’ and SensiML Data Capture Lab displays ‘Select Labels.’ Select the label for this segment from ‘Horizontal,’ ‘Stationary,’ or ‘Vertical.’
7. Building a Model
SensiML Analytics Studio builds a machine learning model for gesture recognition.
(1) Click ‘Prepare Data’ in the left-side menu, set the following data, and click the ‘SAVE’ button.
- Query: myquery
- Session: labeling
- Label: Label
- Metadata: segment_uuid
- Plot: Segment
(2) Click ‘Build Model’ in the left-side menu, set the following data, and click the ‘Optimize’ button.
- Pipeline: mypipeline
8. Generating a Knowledge Pack
SensiML Analytics Studio generates a knowledge pack from the created model.
(1) Click ‘Download Model’ in the left-side menu, set the following data, and click the ‘DOWNLOAD’ button.
- HW Platform: Infineon CY8CKIT-62S2…
- Target OS: FreeRTOS
- Format: Library
- Data Source: Sensor
- Output: Serial
Use the flash programming procedure (for example, the ‘make program’ target in ModusToolbox) to flash the binary to the PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit, and reset the board to start the program.
Here is a video of the gesture recognition system running on the PSoC™ 6 62S2 Wi-Fi BT Pioneer Kit. The Android smartphone receives the recognition results from the kit, and its screen scrolls according to the movement of the hands holding the kit, as follows:
- When you press CapSense button 0 and move your hands up and down, the smartphone screen scrolls up.
- When you press CapSense button 1 and move your hands up and down, the smartphone screen scrolls down.
- When you press CapSense button 0 and move your hands left and right, the smartphone screen scrolls left.
- When you press CapSense button 1 and move your hands left and right, the smartphone screen scrolls right.