Some everyday items need to be sorted according to visual appearance. Examples include waste items or clothing. For those with visual impairments ranging from colour blindness to reduced vision or complete blindness, these sorting tasks can be challenging and time-consuming or even impossible without assistance.
Tactile references such as Braille tags can mitigate this issue but may not exist for the item that needs to be sorted or may be impractical to fit. For example, Braille or textured tags may be attached to items of clothing but these could be difficult to find if the item is large or folded, and the tag could break off.
This project proposes an autonomous sorting bin. The bin uses a camera to identify the item that needs to be sorted and deposits it into the appropriate bin container (e.g. paper, compost, other recycling and general waste).
The device could also be adapted to work in reverse, i.e. the user requests an item and the relevant container is presented. For example, this could be used to sort clean clothes by colour, then return clothes of a requested colour.
Overview of operations
The voice recognition sensor is always on. After the user speaks the wake-up command "Hello Robot" (or a custom wake-up command, if one has been set), the device responds "Yes, I'm here", allowing the user to locate the device by sound. This can be repeated as many times as required to find the device. The voice recognition sensor's volume can also be adjusted using voice commands ("Volume Up" / "Volume Down").
The AI camera scans the area above the funnel for the presence of a human hand. When a closed fist is detected in a centred position above the funnel, the device says "OK", indicating that the user should drop the item they are holding. The colour sensor then scans the item.
In this demo, socks are sorted between "light" and "dark" based on the luminance value. The corresponding bin is rotated into position below the funnel and the trapdoor is opened, dropping the sock into the bin.
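For reference, here is a minimal sketch of that light/dark decision, assuming an Adafruit TCS34725-based colour sensor and its Adafruit_TCS34725 library (the sensor, library and threshold used in the attached script may differ):

#include <Wire.h>
#include "Adafruit_TCS34725.h"

// Assumed TCS34725-style colour/luminance sensor on the I2C bus
Adafruit_TCS34725 tcs = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_50MS, TCS34725_GAIN_4X);

const uint16_t LUX_THRESHOLD = 200;  // hypothetical light/dark cut-off; tune for your lighting

void setup() {
  Serial.begin(115200);
  if (!tcs.begin()) {
    Serial.println("Colour sensor not found");
    while (1) delay(10);
  }
}

void loop() {
  uint16_t r, g, b, c;
  tcs.getRawData(&r, &g, &b, &c);            // raw red/green/blue/clear counts
  uint16_t lux = tcs.calculateLux(r, g, b);  // approximate luminance
  if (lux > LUX_THRESHOLD) {
    Serial.println("Light sock");            // in the full script: rotate the "light" bin under the funnel
  } else {
    Serial.println("Dark sock");             // in the full script: rotate the "dark" bin under the funnel
  }
  delay(500);
}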
The battery voltage is monitored by the Notecard WiFi and uploaded to the Notehub dashboard every hour.
Bin tags
I live in the UK, so I referenced the UKAAF Standards for the braille labels on the bin tags: https://www.ukaaf.org/wp-content/uploads/2020/03/Braille-Standard-Dimensions.pdf These vary depending on where you live, so be sure to check your local standards.
As the tags used single-word labels, I used Grade 1 uncontracted braille to avoid making translation mistakes. However, Grade 2 contracted braille or single-letter labels may be more appropriate for some use cases.
There's an excellent instructable on how to 3D print braille here: https://www.instructables.com/Easy-3D-Printed-Braille-to-Add-to-Everything/
I also added tactile patterns to the front of the label to facilitate even more rapid identification of the bins. I used white PLA filament to print my tags but some visually impaired people can see colours to at least some degree, so using different filament colours for each bin might make sense for certain use cases.
Notecard Setup
- Connect the Notecard WiFi to the Notecarrier-A as shown in the photo.
- Open https://dev.blues.io/quickstart/notecard-quickstart/notecard-and-notecarrier-a/ on your computer and connect the Notecarrier to the computer's USB port using a MicroUSB cable.
- Using the In-Browser Terminal, click the USB Notecard button to connect to the Notecard.
- Give the Notecard access to your WiFi network by sending the following request in the Terminal, replacing "YourNetworkName" and "YourPassword" with the appropriate details for your network.
{"req": "card.wifi", "ssid": "YourNetworkName", "password": "YourPassword"}
- Create a new Notehub project at https://www.notehub.io/projects and make a note of the ProductUID.
- Return to the In-Browser Terminal and assign the ProductUID to your Notecard using this request, replacing "com.your-company.your-name:your_product" with your ProductUID from the previous step:
{"req":"hub.set", "product":"com.your-company.your-name:your_product"}
- Check that the connection is working using:
{"req":"hub.sync"}
- The device should now appear within the project you created on Notehub.
- Set the upload rate to Notehub using the request below, adjusting the "outbound" value to the desired number of minutes between uploads:
{"req":"hub.set","mode":"periodic","outbound":60,"inbound":240}
- The voltage of the power supply connected to the Notecard can now be viewed on the device's status dashboard on Notehub.
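- The voltage can also be queried directly from the In-Browser Terminal using the request below (optional; the exact fields returned depend on your Notecard firmware version):
{"req":"card.voltage"}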
Additional guidance can be found on the Blues website: https://dev.blues.io/quickstart/notecard-quickstart/notecard-and-notecarrier-a/
Grove Vision AI Module Setup
- Connect the camera to the Grove Vision AI V2 module using the CSI connection cable.
- Connect the Vision AI module to your computer using a USB-C cable.
- Deploy the Gesture Detection model to the Vision AI module: https://sensecraft.seeed.cc/ai/#/model/detail?id=60111&tab=public
- Check that the live preview is working as expected before disconnecting the Vision AI module.
Additional guidance can be found on the Seeed Studio wiki: https://wiki.seeedstudio.com/grove_vision_ai_v2/
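If you want to double-check that the XIAO can read detections from the Vision AI module over I2C, the minimal sketch below uses Seeed's Seeed_Arduino_SSCMA library. The class index for a closed fist depends on the deployed model, so treat GESTURE_FIST as a placeholder and check your model's class list:

#include <Seeed_Arduino_SSCMA.h>

SSCMA AI;  // Grove Vision AI V2 on the I2C bus

const int GESTURE_FIST = 0;  // hypothetical class index; check the deployed model's class list

void setup() {
  Serial.begin(115200);
  AI.begin();
}

void loop() {
  if (!AI.invoke()) {  // run one inference on the module; returns 0 on success
    for (size_t i = 0; i < AI.boxes().size(); i++) {
      if (AI.boxes()[i].target == GESTURE_FIST && AI.boxes()[i].score > 70) {
        Serial.println("Closed fist detected");  // in the full script: say "OK" and start the drop sequence
      }
    }
  }
  delay(100);
}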
XIAO ESP32-C3 Setup
- Solder the pin headers onto the microcontroller if they are not already fitted.
- In the Arduino IDE, under File -> Preferences, add the following URL to the Additional Boards Manager URLs field: https://raw.githubusercontent.com/espressif/arduino-esp32/gh-pages/package_esp32_index.json
- Under Tools -> Board -> Boards Manager, search for the "esp32" package and install it.
- Before uploading scripts to the Xiao, under Tools -> Board -> ESP32 Arduino, select "XIAO_ESP32C3" from the dropdown list, then choose the appropriate serial port under Tools -> Port.
- Upload the attached Arduino script to the Xiao.
Additional guidance can be found on the Seeed Studio wiki: https://wiki.seeedstudio.com/XIAO_ESP32C3_Getting_Started/#software-setup
Electronics Assembly
Each of the sensors is connected via I2C to the Xiao Grove Shield. Some of the sensors will require a double-ended Grove cable; others use a Grove to female pin header cable.
Plug the Xiao ESP32-C3 microcontroller into the Grove shield with the USB port pointing outwards.
Connect the Xiao, 16-channel servo driver and the Notecarrier to your selected battery. In this demo, all three are connected to a USB power bank.
Connect the trapdoor servo to the "servo 0" pins of the 16-channel servo driver and the turntable servo to the "servo 2" pins.
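For reference, a minimal sketch of the servo driver setup is shown below, assuming the 16-channel driver is a PCA9685-based board used with the Adafruit PWM Servo Driver library (the pulse value is illustrative and will need tuning for your servos):

#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();  // default I2C address 0x40

const uint16_t pulse_90_deg = 307;  // approximate mid position at 50 Hz (illustrative value)

void setup() {
  pwm.begin();
  pwm.setPWMFreq(50);              // standard 50 Hz update rate for hobby servos
  pwm.setPWM(0, 0, pulse_90_deg);  // trapdoor servo on the "servo 0" pins (channel 0)
  pwm.setPWM(2, 0, pulse_90_deg);  // turntable servo on the "servo 2" pins (channel 2)
}

void loop() {}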
- Print all of the 3D-printed parts with around 15% infill in PLA.
- Bolt the sorting bins onto the turntable bearing.
- Screw the turntable bearing down onto a suitable base.
- Bolt the pillar onto the base.
- Assemble the trapdoor subassembly with the camera and colour sensor. Lightly glue the window onto the ledge.
- Screw or glue the trapdoor and turntable pusher onto their respective servo horns. I recommend keeping the holes in any 3D-printed parts that attach to servo horns slightly undersized, so they grip well onto the screw threads.
- Assemble the funnel, trapdoor and pillar. Use a thin metal or plastic rod as the hinge for the trapdoor.
- Bolt the servos in place and connect the servo horns.
- Slot the tactile tags onto the front edge of each bin.
This project is intended to demonstrate a base platform that can be used to build a variety of sorting devices for visually impaired users. Below is a range of suggestions to get you started on reconfiguring the device:
Alternative colour bins
To extend the colour categories, the if statement on lines 120 to 139 should be edited to evaluate the other colour variables reported by the colour and luminance sensor. For example, differentiating between red and blue socks could be achieved by evaluating the "red" variable, as shown below:
if (red > 100) {
  Serial.println("Red sock");
  // Rotate turntable servo to 45 deg
  for (uint16_t pulselen = pulse_90_deg; pulselen > pulse_45_deg; pulselen--) {
    pwm.setPWM(2, 0, pulselen);
    delay(5);
  }
  delay(500);
}
else {
  Serial.println("Blue sock");
  // Rotate turntable servo to 135 deg
  for (uint16_t pulselen = pulse_90_deg; pulselen < pulse_135_deg; pulselen++) {
    pwm.setPWM(2, 0, pulselen);
    delay(5);
  }
  delay(500);
}
If more colour categories are added, adjustments will also need to be made to the turntable servo positions (lines 27 to 29 and lines 120 to 139).
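As a starting point, extra positions can be defined alongside the existing constants. The values below are a rough sketch assuming a PCA9685 running at 50 Hz and a typical hobby servo with a 1 ms to 2 ms pulse range; measure and tune them for your own servo (pulse_0_deg and pulse_180_deg are hypothetical additions):

// Approximate PCA9685 tick counts at 50 Hz (4096 ticks per 20 ms period):
// 1.0 ms ≈ 205 ticks, 1.5 ms ≈ 307 ticks, 2.0 ms ≈ 410 ticks.
const uint16_t pulse_0_deg   = 205;  // hypothetical extra position
const uint16_t pulse_45_deg  = 256;
const uint16_t pulse_90_deg  = 307;
const uint16_t pulse_135_deg = 358;
const uint16_t pulse_180_deg = 410;  // hypothetical extra position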
Smaller coloured objects
In this demo, it is reasonable to assume that a sock will completely cover the clear window in the trapdoor, allowing the colour sensor to analyse its colour. To sort smaller objects, such as pills, a smaller window and narrower funnel base would be required. It would also be sensible to use the illumination LEDs on the colour sensor, as a smaller window is unlikely to allow enough light to reach the object for accurate identification.
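If the colour sensor happens to be an Adafruit TCS34725 breakout (an assumption; check your own board and library), its onboard illumination LED is tied to the interrupt pin and can be toggled around each reading, as in this short sketch:

#include <Wire.h>
#include "Adafruit_TCS34725.h"

Adafruit_TCS34725 tcs = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_50MS, TCS34725_GAIN_16X);

void setup() {
  Serial.begin(115200);
  tcs.begin();
}

void loop() {
  uint16_t r, g, b, c;
  tcs.setInterrupt(false);         // LED on while reading the small object
  delay(60);                       // allow one integration cycle under illumination
  tcs.getRawData(&r, &g, &b, &c);
  tcs.setInterrupt(true);          // LED off again
  Serial.println(c);               // clear-channel (brightness) reading
  delay(1000);
}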
Shape-based object identification
Some objects will not be identifiable by colour alone. For these applications, the AI Vision module can be retrained to categorise objects: https://wiki.seeedstudio.com/grove_vision_ai_v2_software_support/#-no-code-getting-started-with-sensecraft-ai-
Item retrieval
The voice recognition sensor can be programmed with up to 17 customised commands. To enable retrieval of the objects in each bin, a custom "Retrieve" command, along with a command for each bin category, could be defined, as sketched below.
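Here is a minimal sketch of how that could look, assuming a DFRobot DF2301Q-style offline voice recognition sensor and its DFRobot_DF2301Q Arduino library (the library reports a numeric ID for each recognised command; the command IDs and turntable pulse values below are placeholders to replace with the ones assigned when you train your custom commands and calibrate your servo):

#include <Wire.h>
#include "DFRobot_DF2301Q.h"
#include <Adafruit_PWMServoDriver.h>

DFRobot_DF2301Q_I2C asr;                                  // assumed voice recognition sensor
Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();  // 16-channel servo driver

// Hypothetical IDs assigned when the custom commands are learned
const uint8_t CMD_RETRIEVE_LIGHT = 5;    // e.g. "Retrieve light socks"
const uint8_t CMD_RETRIEVE_DARK  = 6;    // e.g. "Retrieve dark socks"

const uint16_t pulse_45_deg  = 256;      // illustrative turntable positions
const uint16_t pulse_135_deg = 358;

void setup() {
  Serial.begin(115200);
  while (!asr.begin()) {                 // wait until the voice sensor responds on I2C
    Serial.println("Voice sensor not found");
    delay(1000);
  }
  pwm.begin();
  pwm.setPWMFreq(50);                    // 50 Hz update rate for hobby servos
}

void loop() {
  uint8_t id = asr.getCMDID();           // 0 when no command has been heard
  if (id == CMD_RETRIEVE_LIGHT) {
    pwm.setPWM(2, 0, pulse_45_deg);      // present the "light" bin to the user
  } else if (id == CMD_RETRIEVE_DARK) {
    pwm.setPWM(2, 0, pulse_135_deg);     // present the "dark" bin to the user
  }
  delay(300);
}

For brevity this jumps the servo straight to the target position; the main script ramps the pulse length gradually, as in the colour example above.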