Hey! I am an 18-year-old student from Latvia. I made this project for the Build2Gether V2 Contest - sports & hobbies category. Here's the original idea proposal:
Problem Identified: Difficulty controlling traditional drone controllers for people with hand impairments.

Underlying Causes:

* Limited Hand Dexterity: Conditions like arthritis, tremors, or paralysis can significantly limit the ability to manipulate the joysticks and buttons on a traditional drone controller. This makes it difficult or impossible for people with hand impairments to control a drone effectively.
* Pain and Discomfort: Using a traditional controller can be painful or uncomfortable for people with certain hand conditions. This discourages them from participating in the engaging hobby of drone flying.
* Exclusion from a Growing Activity: Drone use for photography, videography, and recreation is becoming increasingly popular. However, traditional controllers create a barrier for people with hand impairments, excluding them from enjoying this activity.

These limitations prevent individuals with hand impairments from experiencing the joys of drone flying, hindering their ability to participate in a potentially rewarding hobby.

What are you going to build to solve this problem?
The proposed solution is a smart DJI drone controller equipped with an AI camera that tracks facial movements. This controller would address the limitations of traditional controls for people with hand impairments. The AI camera would interpret facial expressions and head movements, translating them into control signals for the drone's movement. This would allow users to fly the drone by tilting their head, raising their eyebrows, or pursing their lips, for example. This innovative controller would eliminate the need for precise hand movements, making drone flying accessible and enjoyable for people with hand impairments and fostering their inclusion in this exciting hobby.

So what did I come up with?
The gadget turned out quite amazing. Of course, there are some improvements over the original proposal. The contest masters suggested a few things I could improve, so I took their suggestions into account. First of all, I added a voice control option so that the system is more versatile and can be used by a wider group of people with disabilities. These are the voice commands you can use (a sketch of how they could map to drone actions follows after the list):
* Stop
* Drone up
* Drone down
* Drone left
* Drone right
* Drone rotate left
* Drone rotate right
* Face control on
* Face control off
* Takeoff
* Land
* Come home
* Face mode
* Button mode
* Voice mode
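To give an idea of how these phrases could be mapped to drone actions on the Raspberry Pi, here is a minimal sketch. The action names are placeholders I made up for illustration; the real RaspberryPi_mainCode.py may use different identifiers.

```python
# Hypothetical lookup table from recognized voice phrases to internal actions.
# The action names below are illustrative placeholders, not the real code's names.
VOICE_COMMANDS = {
    "stop": "emergency_stop",             # also triggers return to home (RTH)
    "drone up": "move_up",
    "drone down": "move_down",
    "drone left": "move_left",
    "drone right": "move_right",
    "drone rotate left": "rotate_left",
    "drone rotate right": "rotate_right",
    "face control on": "face_control_on",
    "face control off": "face_control_off",
    "takeoff": "takeoff",
    "land": "land",
    "come home": "return_to_home",
    "face mode": "set_mode_face",
    "button mode": "set_mode_button",
    "voice mode": "set_mode_voice",
}

def lookup_command(phrase):
    """Return the action name for a recognized phrase, or None if it is unknown."""
    return VOICE_COMMANDS.get(phrase.strip().lower())
```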
The system works like this: the Raspberry Pi is the main "brain" of the device. It handles all the flight modes, inputs, outputs, and the connection with the drone. The Raspberry Pi is wirelessly connected to the remote of the DJI drone: the remote is connected to an Android phone (as it normally would be), and the phone is then wirelessly connected to the Raspberry Pi through an open-source app. The Seeed Studio XIAO Sense translates the user's speech into commands, which are then sent to the Raspberry Pi over a serial connection.
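To make the serial link a bit more concrete, here is a minimal sketch of how the Raspberry Pi side could read the command strings coming from the XIAO board with pyserial. The device path and baud rate are assumptions (they depend on the wiring), and the real RaspberryPi_mainCode.py may handle this differently.

```python
# Sketch: read voice-command strings sent by the XIAO over a serial link.
# /dev/serial0 assumes a GPIO UART connection; a USB link would show up as
# something like /dev/ttyACM0. The baud rate must match the XIAO's code.
import serial  # install with: pip install pyserial

SERIAL_PORT = "/dev/serial0"
BAUD_RATE = 115200

with serial.Serial(SERIAL_PORT, BAUD_RATE, timeout=1) as link:
    while True:
        line = link.readline().decode("utf-8", errors="ignore").strip().lower()
        if not line:
            continue
        print("Received voice command:", line)
        # The real code would now dispatch the command to the matching drone action.
```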
I also added some safety features. First, there is an E-stop that stops all actions and puts the drone into return-to-home (RTH) mode. There are two ways to trigger the E-stop: 1) by saying "Stop!", or 2) by pressing the E-stop button located between the two joystick buttons. Second, I added physical buttons to control the drone, so that it can also be operated manually with those buttons. When RTH is activated, the drone automatically flies back to its takeoff position and lands.
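As a rough sketch of the E-stop idea (not the exact implementation, and the GPIO pin number is just a placeholder), a physical button and the "Stop!" voice command can both call the same handler, which would request RTH:

```python
# Sketch of an E-stop handler shared by the button and the "Stop!" voice command.
# GPIO pin 17 is a placeholder, not the pin used on the actual PCB.
from gpiozero import Button
from signal import pause

E_STOP_PIN = 17

def emergency_stop():
    print("E-stop! Cancelling all actions and requesting return to home (RTH).")
    # The real code would send the RTH request to the drone here.

e_stop_button = Button(E_STOP_PIN)
e_stop_button.when_pressed = emergency_stop  # the voice path would call emergency_stop() too

pause()  # keep the script alive so the button callback can fire
```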
As the contest masters pointed out, it might be unclear which mode is currently active. To fix this, I added a few LEDs to the PCB that work as control-mode indicators: when a mode is active, the corresponding LED lights up.
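Here is a small sketch of the mode-indicator idea, assuming three LEDs on separate GPIO pins (the pin numbers are placeholders, not the PCB's actual routing):

```python
# Sketch: light exactly one LED to show the active control mode.
# Pin numbers are placeholders and will differ on the real PCB.
from gpiozero import LED

MODE_LEDS = {
    "face": LED(5),
    "button": LED(6),
    "voice": LED(13),
}

def show_mode(active_mode):
    """Turn on the LED for the active mode and turn the others off."""
    for mode, led in MODE_LEDS.items():
        if mode == active_mode:
            led.on()
        else:
            led.off()

show_mode("voice")  # e.g. after the "Voice mode" command is recognized
```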
One important thing: when using voice or button control, each command (pressing, for example, the fly-right button or saying "drone right") makes the drone fly about one meter in that direction. So if you press the fly-right button 3 times, the drone will fly about 3 meters to the right.
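As a simple illustration of this one-command-one-meter behaviour (not the actual implementation; send_move() is a made-up placeholder for the real drone call):

```python
# Illustration: in voice/button mode each command moves the drone a fixed step.
STEP_METRES = 1.0  # roughly one meter per command

def send_move(direction, metres):
    # Placeholder for whatever call the real code uses to move the drone.
    print(f"Moving about {metres:.0f} m {direction}")

def on_command(direction, presses=1):
    for _ in range(presses):
        send_move(direction, STEP_METRES)

on_command("right", presses=3)  # pressing "fly right" three times -> about 3 m right
```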
How to use head controls

* Tilt your head up or down to fly up or down.
* Rotate your head left or right to rotate the drone left or right.
* Point your head to the left or right to fly left or right.
* Open your mouth slightly to fly forward; if your mouth is closed, the drone stops.
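Below is a simplified sketch of how head gestures can be detected with MediaPipe Face Mesh on the Raspberry Pi. The landmark indices, thresholds, and signs are rough guesses that need tuning, and this is not the exact logic from RaspberryPi_mainCode.py; it only covers a few of the gestures.

```python
# Simplified head-gesture detection with MediaPipe Face Mesh.
# Landmark indices and thresholds are approximations for illustration only.
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
cap = cv2.VideoCapture(0)  # the USB camera

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        continue
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        continue
    lm = results.multi_face_landmarks[0].landmark

    nose, forehead, chin = lm[1], lm[10], lm[152]       # commonly used landmark indices
    left_cheek, right_cheek = lm[234], lm[454]
    mouth_gap = abs(lm[13].y - lm[14].y)                # inner lips: open mouth -> fly forward

    pitch = nose.y - (forehead.y + chin.y) / 2           # head tilted up/down -> fly up/down
    yaw = (nose.x - left_cheek.x) - (right_cheek.x - nose.x)  # head turned -> rotate

    if mouth_gap > 0.03:
        print("fly forward")
    elif pitch < -0.05:
        print("fly up")
    elif pitch > 0.05:
        print("fly down")
    elif yaw > 0.08:
        print("rotate right")
    elif yaw < -0.08:
        print("rotate left")
    else:
        print("stop / hover")
```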
Demo video

Sadly, I broke one of my Mavic's motors while testing the new AI controller, so I'm unable to make a demo video at this time. But I promise that I will update this page shortly and add a video. Replacement parts are already on the way!! :)
Instructions

Okay, let's get started! To make things easier, I made a custom PCB. It has buttons that control the drone, indicator lights, a Raspberry Pi connector, and a port for the Seeed Studio XIAO ESP32C3 Sense chip.
First, you will need to go to the pcbway.com website and order the PCB and the 3D-printed parts for the enclosure. I've attached all the necessary files at the bottom of this page. To order the PCB, press Quote Now -> Quick-Order PCB, upload my Gerber file, and set the settings as shown below:
When you have received your PCB, solder all the components. If this is your first time using a soldering iron, I would recommend watching this video to see how to do it correctly and safely:
Solder them like this (all are 220 ohm resistors):
When that's done, solder the XIAO Sense chip with the USB-C port facing up. For the large Raspberry Pi connector, you can either solder on those stand-offs or cut the ribbon cable and solder each wire separately. The wires should go from the Raspberry Pi to the PCB like this.
When that's done, glue all the parts together: the PCB case and the Raspberry Pi case, plus the lid for the PCB case.
The PCB goes into its enclosure like this:
Also, as you can see, you need to attach a USB camera to the case. I found an old camera that I stripped down to just the PCB and camera sensor. You can use any camera that has a USB connection.
Coding

First, connect the Seeed Studio XIAO Sense to a computer with a USB-C cable. There are a few ways to upload the code to it. Start by installing the required libraries:
- Using the Arduino IDE:
  * Install the ESP32 board package: Open the Arduino IDE and go to File > Preferences. In the Additional Boards Manager URLs field, add https://dl.espressif.com/package/esp32/index.json. Then go to Tools > Board > Boards Manager, search for "ESP32", and install the latest version.
  * Install the required libraries: Go to Sketch > Include Library > Manage Libraries. Search for "PocketSphinx" or your preferred voice recognition library and install it.
- Using PlatformIO:
  * Install PlatformIO: Download and install the PlatformIO extension for your IDE (Visual Studio Code, Atom, etc.).
  * Configure PlatformIO: Open your project in PlatformIO, go to PlatformIO > Preferences, and configure the ESP32 platform.
  * Install libraries: Use PlatformIO's built-in library manager to install the required libraries.
- Manual installation:
  * Download the libraries: Download the library files from their respective repositories or websites.
  * Place the libraries: Place the downloaded library files in the correct directory (usually Documents/Arduino/libraries or a similar path).
  * Include the libraries: In your Arduino sketch, include the necessary header files using #include <library_name.h>.
Then, once you have the libraries, you need to flash the code, which you can find at the bottom of this page on my GitHub, named "SeeedStudioXIAO_mainCode.cpp".
Now the chip is ready. Let's turn our attention to the DJI control phone. When you connect it to the controller, you will need to run a host app on it so that the Raspberry Pi can connect to it and control the drone. Here's the link for that app: https://github.com/dkapur17/DJIControlServer It's a public open-source project created by GitHub user "dkapur17". Download it and set it up as described in that repository. Be ready to write down the IP and PORT numbers; yours will vary! The IP comes first and is separated from the port by a ":" (in this case the port is 8080).
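Once the app is running, it helps to check that the Raspberry Pi can actually reach the phone. Here is a minimal connectivity test; the IP below is just an example, and the root path is only used to see whether the server answers at all - check the repository's README for the actual command endpoints.

```python
# Quick check that the phone running DJIControlServer is reachable over WiFi.
import requests

IP = "192.168.1.42"   # example only - use the IP shown by the app
PORT = 8080

try:
    response = requests.get(f"http://{IP}:{PORT}/", timeout=5)
    print("Server reachable, HTTP status:", response.status_code)
except requests.ConnectionError:
    print("Could not reach the phone - check the WiFi connection and the IP/PORT values")
```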
Now, let's set up the Raspberry Pi! You can connect to it via VNC or SSH, or just plug in an HDMI display and a USB mouse and keyboard. First, connect it to the internet, then update and upgrade it from the command line. Then install these libraries:
pip install mediapipe opencv-python numpy
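After the installation finishes, a quick way to confirm that OpenCV can see the USB camera (assuming it shows up as the first video device) is:

```python
# Grab one frame from the USB camera and save it, just to confirm it works.
import cv2

cap = cv2.VideoCapture(0)  # index 0 is usually the first USB camera
ok, frame = cap.read()
cap.release()

if ok:
    cv2.imwrite("camera_test.jpg", frame)
    print("Camera works - saved camera_test.jpg")
else:
    print("No frame received - check the camera's USB connection")
```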
When that's done, get the RaspberryPi_mainCode.py code from my GitHub page. Remember to change the IP and PORT on line 10:
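For illustration only, the settings on that line might look something like the snippet below; check the actual RaspberryPi_mainCode.py for the real variable names and replace the values with the ones your phone shows.

```python
# Hypothetical example of the connection settings - your values will differ.
IP = "192.168.1.42"   # the address shown by the DJIControlServer app
PORT = 8080
```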
After that, you can use the device; just remember to turn on the host Android app. You can simply connect the Raspberry Pi to a power bank, and it will power everything. To make things even easier, you can set the Python code to start automatically when the Raspberry Pi boots (one possible way is sketched below). Then it's super simple: just plug it in, and it will automatically power up, connect to the Seeed Studio board and your phone, and you'll be ready to go!
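One possible way to do the autostart (my suggestion, not something from the original build) is a small systemd service. The paths and user below are typical Raspberry Pi OS defaults and will probably need adjusting to wherever you saved RaspberryPi_mainCode.py.

```ini
# /etc/systemd/system/drone-controller.service
[Unit]
Description=Face/voice drone controller
After=network-online.target

[Service]
ExecStart=/usr/bin/python3 /home/pi/RaspberryPi_mainCode.py
Restart=on-failure
User=pi

[Install]
WantedBy=multi-user.target
```

Enable it with sudo systemctl enable --now drone-controller.service and the script will start on every boot.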
Enjoy!!! :))