AI assistants are among the most powerful inventions of our time, helping us every day with more and more tasks, such as audio and video processing. So I decided to give my AI assistant the ability to help me in the physical world, beyond the limits of a chat trapped behind a screen!
I built the structure of the Keyestudio robot arm kit for Arduino following its tutorial, then wrote some custom functions in an Arduino sketch:
shake_hands();    /* lets the robot arm introduce itself by shaking your hand with the clamp */
agree();          /* makes the robot nod */
disagree();       /* makes the robot move left and right to disagree */
grab_straight();  /* makes the robot grab an object in front of it */
Then I wrote a Python script that chats with Gemini 1.5 Flash via the API. I instructed the model to reply with one command from
['SAY YES', 'SAY NO', 'SHAKE HANDS', 'GRAB STRAIGHT']
which the script sends over serial to the board controlling the robot arm, followed by a full answer that goes to an Arduino Uno, which prints it on a connected LCD display.
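The glue between the model and the boards can be sketched roughly as below. This is a minimal sketch, not my exact script: the `"COMMAND|answer"` reply format, the prompt wording, the serial port name, and the idea of sending both lines over one serial link are all assumptions for illustration.

```python
# Assumed reply format from the model: "COMMAND|full answer for the LCD"
ALLOWED = ['SAY YES', 'SAY NO', 'SHAKE HANDS', 'GRAB STRAIGHT']

def parse_reply(reply: str):
    """Split the model's reply into a robot command and the text for the LCD."""
    command, _, answer = reply.partition('|')
    command = command.strip().upper()
    if command not in ALLOWED:
        command = 'SAY NO'  # fall back to a safe default if the model improvises
    return command, answer.strip()

if __name__ == '__main__':
    # Third-party pieces kept out of the pure logic above:
    import serial                        # pyserial
    import google.generativeai as genai  # Google's Gemini SDK

    genai.configure(api_key='YOUR_API_KEY')
    model = genai.GenerativeModel('gemini-1.5-flash')
    port = serial.Serial('/dev/ttyACM0', 9600, timeout=1)  # port name is an assumption

    reply = model.generate_content(
        'Answer the user. Prefix your reply with exactly one command from '
        f'{ALLOWED}, then a "|" separator, then the full answer. '
        'User: can you shake my hand?'
    ).text
    command, answer = parse_reply(reply)
    port.write((command + '\n').encode())  # consumed by the robot arm sketch
    port.write((answer + '\n').encode())   # forwarded to the Uno driving the LCD
```

Validating the command against a fixed whitelist means a creative model reply can never make the arm do something the sketch doesn't implement.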
Finally, I added a frontend: a Flask app serving an HTML page with CSS animations.
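A frontend like this can be as small as the sketch below. The route names, the placeholder HTML, and the JSON shape are assumptions; in the real app the `/ask` handler would forward the message to Gemini and the serial link instead of echoing it.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/')
def index():
    # Stand-in for the real HTML page with its CSS animations.
    return '<h1>Robot arm assistant</h1>'

@app.route('/ask', methods=['POST'])
def ask():
    # Placeholder handler: a real version would call the chat script,
    # send the command to the arm, and return the model's answer.
    user_message = request.json.get('message', '')
    return jsonify({'answer': f'You said: {user_message}'})

if __name__ == '__main__':
    app.run(debug=True)
```

The HTML page then just needs a bit of JavaScript that POSTs the user's message to `/ask` and animates the reply.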
The final result is a captivating UI and a physically interactive system: