Over the last two years, during my school holidays, I have been working on R3-14. It has been a great learning experience, and watching my idea come to life gave me an immense sense of satisfaction.
When the idea of making a small robot assistant popped into my mind, it was just a stab in the dark. I wanted to incorporate multiple features to learn more about electronics and 3D design and manufacturing, and to improve my Python programming.
Before diving into the details, here is a breakdown of my project:
- Google Assistant
- SiriControl Integration
- Web App
- Face Tracking
- 433MHz RF Transmitter
- 3D Printed PLA Body
- RGB LED eyes
In the beginning, I created an aluminium prototype after designing my initial idea in Tinkercad. And then the fun began…
At first, I implemented the Jasper voice platform, but because of its limitations I switched to the Google Assistant SDK when it was released, which offered far better voice recognition and responses.
I also integrated SiriControl with the Google Assistant, so that both Siri and Google can trigger all of the robot's physical actions and commands. SiriControl is a Python framework that retrieves user-initiated commands from Siri.
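To give a feel for how commands plug in, here is a rough sketch of a SiriControl-style command module. The attribute names (`moduleName`, `commandWords`) and the module shape are illustrative assumptions, not the framework's guaranteed API; the idea is that the framework matches trigger words against the spoken command and calls `execute`.

```python
# Sketch of a SiriControl-style command module (names are illustrative --
# check the framework's own docs for the exact convention).

moduleName = "lights"        # hypothetical name shown in the module list
commandWords = ["lights"]    # trigger words matched against the Siri command

def execute(command):
    """Called by the framework with the full spoken command."""
    words = command.split()
    if "on" in words:
        return "Turning the lights on"
    elif "off" in words:
        return "Turning the lights off"
    return "Sorry, I didn't catch that"
```

In this pattern, adding a new physical action is just a matter of dropping in another small module file.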
To add a human touch, I used RGB LED eyes which change colour according to Google Assistant events, so the user can tell the robot's current state, such as speaking, listening or loading. This is handled by my RGB LED library, which allows smooth colour transitions. I created this library because at the time I couldn't find one online.
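The core of a smooth transition is just linear interpolation between two colours in small steps. Below is a minimal sketch of that idea; the event-to-colour mapping is made up for illustration, and in the real robot each intermediate colour would update the PWM duty cycles driving the LED pins, whereas here the fade is only computed.

```python
# Illustrative mapping of Assistant states to eye colours (not the
# actual values used by the robot).
ASSISTANT_COLOURS = {
    "listening": (0, 0, 255),   # blue while the mic is open
    "speaking":  (0, 255, 0),   # green while responding
    "loading":   (255, 165, 0), # orange while thinking
}

def fade(start, end, steps=50):
    """Yield intermediate (r, g, b) tuples from start to end."""
    for i in range(1, steps + 1):
        t = i / steps  # progress from just-after-start (t>0) to end (t=1)
        yield tuple(round(s + (e - s) * t) for s, e in zip(start, end))
```

Stepping through these tuples with a short delay between each gives the eyes a gentle glow rather than an abrupt colour change.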
For the home automation aspect, I used 433MHz transmitters and receivers to record and replay remote codes. Combined with the web app, this works out much cheaper than buying smart bulbs and sockets, with the Raspberry Pi acting as a hub.
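To replay a code, the transmitter's data pin is driven high and low in a precisely timed pulse train. As a simplified sketch (not the robot's actual code), here is how a 24-bit code could be encoded using rc-switch-style "protocol 1" timings, which many cheap remote sockets understand; on the Pi, a library such as rpi-rf does this kind of encoding internally.

```python
# Encode a remote code as a train of (level, duration) pulses using
# rc-switch "protocol 1" timings: a 350 us base pulse, bit "1" = long
# high + short low, bit "0" = short high + long low, then a sync gap.

PULSE_US = 350  # base pulse length in microseconds

def encode(code, bits=24):
    """Return a list of (level, duration_us) pairs for one transmission."""
    pulses = []
    for i in range(bits - 1, -1, -1):  # most significant bit first
        if (code >> i) & 1:
            pulses += [(1, 3 * PULSE_US), (0, 1 * PULSE_US)]  # bit "1"
        else:
            pulses += [(1, 1 * PULSE_US), (0, 3 * PULSE_US)]  # bit "0"
    pulses += [(1, 1 * PULSE_US), (0, 31 * PULSE_US)]         # sync gap
    return pulses
```

Recording works the other way round: the receiver's pulse timings are captured once from the original remote, decoded back into a code, and stored for later replay.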
Over time, as my idea became clearer, I wanted a simpler, friendlier design, which led to the following:
I also created a web app using the lightweight web server Lighttpd, which is shown in the video, with a range of features including:
- Automatic/manual mode for remote control and face tracking
- Home automation with devices that can be toggled on and off
- Live streaming of the webcam feed for remotely controlling the robot
- Speaking text with various (inaccurate) accents, using eSpeak
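The home automation part of the web app boils down to mapping a device name and desired state to the RF code the Pi should transmit. The sketch below shows that lookup logic under stated assumptions: the device names and codes are invented for illustration, and in the real setup the function would sit behind Lighttpd (for example as a CGI script) with the returned code handed to the 433MHz transmitter.

```python
# Hedged sketch of the toggle endpoint's core logic: translate a
# device/state request into a previously recorded RF code to replay.
# Device names and code values here are made up for illustration.

DEVICE_CODES = {
    "lamp":   {"on": 1381717, "off": 1381716},
    "socket": {"on": 4543795, "off": 4543796},
}

def handle_toggle(device, state):
    """Return the RF code to replay, or None if the request is invalid."""
    codes = DEVICE_CODES.get(device)
    if codes is None or state not in codes:
        return None
    return codes[state]
```

Rejecting unknown devices and states up front keeps a malformed web request from triggering a stray transmission.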
This has been an incredible journey for me, through which I have learned a variety of skills such as the importance of design and prototyping, electronic basics and 3D design and printing. On the way, I developed the RGB LED library and the SiriControl framework, which was great fun.
It all started when I got a Raspberry Pi for my birthday a few years ago. I wanted to create a personal assistant with a fun, friendly design and a human touch. Although it would not be a viable consumer product, I feel people become more emotionally attached to devices with human-like features, such as the RGB LED eyes.
On an ending note, if you haven't worked it out, the name R3-14 stands for Raspberry Pi. Any feedback would be great and I would be happy to answer questions. Please like this project and see my personal blog for more cool stuff.