Hi there,
It's been two years since we launched the Bittle robot. Last weekend, I made a horror-comedy clip with Bittle to celebrate Halloween. You may check out the final cut and read the following tutorial to learn how to make your Petoi robots perform like pros.
Idea
The idea came from the many cute animal videos of dogs and cats dressed up in costumes.
I have both the Bittle dog and Nybble cat robots handy. Since their leg structures are very similar to those of real animals, I thought it would be fun to dress them up. However, the robots are only about 15 cm in body length, so it took a lot of adjusting to fit them into even the tiniest costumes made for real pets.
I had printed a pumpkin cat head before, but the original Thingiverse design file seems to have been deleted. You may find some variants or design your own.
I ODMed some ultrasonic sensors with RGB LEDs. Each sensor has three WS2812 RGB LEDs in each column, and they can be programmed as blinking eyes using the Adafruit NeoPixel library.
I inserted the ultrasonic sensor into the pumpkin head. I also glued a small plastic block between the pumpkin and Bittle's head to raise the pumpkin above the tall collar.
Now the main character is ready to go.
I needed to add a dramatic storyline to make the simple movements less boring. I keep some anatomy models for reference when designing bionic robots, and they happen to fit the Halloween theme perfectly.
They also look pretty creepy. The assassin (Bittle) should get shocked when facing his mighty victim. I glued a small magnet to the assassin's hand to make the dagger detachable. The magnet's grip is tuned with a thin layer of hot glue so that the dagger drops at a moderate jolt.
I would also use the built-in behaviors "check around" and "backflip" to make the assassin survey the surroundings and jump backward when shocked.
Software
Over the past two years, I've optimized the OpenCat software to make it user-friendly. I only need to uncomment a macro definition to activate the LEDs on the ultrasonic sensor. I also disabled the distance-measuring function so that the robot wouldn't react automatically to nearby objects.
The basic Arduino code defines the instinctive motions and reactions of the robot. It's well encapsulated, so users don't need to worry about the tedious hardware details of a complex robot. The robot can be controlled through the serial port with a set of top-level string commands.
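For example, the commands can be sent from any host with a serial connection. Below is a minimal sketch using the pyserial package; the port name, the 115200 baud rate, and the timing are assumptions for a typical setup, and the command tokens ('ksit', 'd') are among those that appear in the session log later in this post.

# A minimal sketch (not the official OpenCat host code): sending top-level
# string commands to the robot over a serial port with pyserial.
import time
import serial

PORT = "/dev/cu.BittleSPP-353392"  # assumption: replace with your robot's port

with serial.Serial(PORT, 115200, timeout=1) as ser:  # 115200 baud is assumed
    time.sleep(5)                  # give the robot time to boot up
    ser.write(b"ksit\n")           # 'k' + skill name: sit
    time.sleep(2)
    ser.write(b"d\n")              # 'd': rest and relax the servos
    print(ser.read_all().decode(errors="ignore"))  # print the robot's reply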
I used the Skill Composer to create and preview a new jump behavior. Below are two demos showing how the Skill Composer works.
On a Mac, I wrote a Python script that lines up all the events in order. You can read it like a regular play script.
The script is run in the terminal with python3 hlw.py. It queries all the existing serial ports in parallel, decides whether each port is connected to a Petoi robot, and then sends the tasks over that port. The serial connection can be wired or wireless; for the video, I used the Bluetooth dongle to eliminate the wires. A stripped-down sketch of such a play script follows the session log below.
(base) user@Macbook serialMaster % python3 hlw.py
2022-11-02 15:25:28,205 ardSerial - INFO - port[0] is /dev/cu.Bluetooth-Incoming-Port
2022-11-02 15:25:28,205 ardSerial - INFO - port[1] is /dev/cu.BittleSPP-353392
Waiting for the robot to boot up
Elapsed time: 1 seconds
Elapsed time: 1 seconds
-1
* Port cu.Bluetooth-Incoming-Port is not connected to a Petoi device!
Elapsed time: 1 seconds
['b', '\n* Start *\nBittle\nReady!\np\n']
Adding cu.BittleSPP-353392
2022-11-02 15:25:34,461 ardSerial - INFO - Connect to serial port:
2022-11-02 15:25:34,465 ardSerial - INFO - cu.BittleSPP-353392
['g', 0]
['C', [0, 50, 0, 0, 0], 0]
['ksit', 1]
['kck', 1]
Elapsed time: 4 seconds
['C', [0, 0, 127, 2, 1], 0]
['kbalance', 1]
['i', [0, -10, 8, 30, 12, 30], 0]
['C', [127, 0, 0, 0, 2], 0]
['C', [127, 0, 0, 0, 2], 1]
['K', [-4, 0, 0, 1, 1, 2, 2, 0, 0, 0, 0, 0, 0, 0, 0, 30, 30, 30, 30, 30, 30, 30, 30, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 42, 42, 42, 42, 6, 6, 6, 6, 48, 0, 0, 0, 37, 0, 0, 0, 0, 0, 0, 0, -10, -10, -10, -10, 110, 110, 110, 110, 0, 0, 0, 0, 26, 11, -11, 0, 11, 11, -22, -22, 6, 6, 8, 8, 8, 8, 52, 52, 0, 0, 0, 0], 0]
['K', [14, 0, 0, 1, 10, 29, 51, 46, 33, 30, 4, 15, 10, 32, 43, 48, 40, 29, 3, 16, 15, 35, 34, 49, 35, 30, 13, 18, 19, 44, 36, 52, 33, 17, 14, 19, 22, 38, 39, 53, 31, 16, 14, 22, 25, 30, 42, 61, 30, 17, 14, 11, 28, 21, 44, 56, 30, 21, 15, 6, 31, 8, 47, 49, 29, 40, 16, 3, 33, 13, 48, 41, 30, 37, 17, 3, 38, 16, 51, 33, 28, 34, 18, 17, 42, 20, 52, 37, 16, 32, 20, 14, 35, 23, 59, 40, 16, 31, 16, 14, 26, 26, 59, 42, 19, 30, 8, 14, 15, 28, 52, 45, 27, 30, 5, 15], 4]
['d', 5]
['I', [14, 0, 15, 0.5]]
['C', [127, 0, 0, 0, 2], 0]
['C', [127, 0, 0, 0, 2], 0]
['C', [127, 112, 0, 0, 2], 0]
['kbk', 0.5]
['kbf', 0]
['ktrF', 2]
['C', [127, 0, 0, 0, 3], 0]
2022-11-02 15:26:03,825 ardSerial - INFO - close the serial port.
2022-11-02 15:26:03,826 ardSerial - INFO - finish!
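For reference, here is a stripped-down sketch of what such a play script can look like. It assumes the ardSerial module in Petoi's serialMaster folder exposes the connectPort, send, and closeAllSerial helpers the way its bundled examples do; the skill tokens and trailing delays are borrowed from the log above, while the overall structure is only illustrative, not the actual hlw.py.

# hlw_sketch.py -- an illustrative play script, not the original hlw.py.
# Assumption: ardSerial (from Petoi's serialMaster folder) provides
# connectPort, send, and closeAllSerial as in its bundled examples.
from ardSerial import *

# Each entry is one "line" of the play: a command token (with optional
# parameters) followed by the number of seconds to wait afterward.
script = [
    ['ksit', 1],        # built-in skill: sit
    ['kck', 1],         # built-in skill: check around
    ['kbalance', 1],    # stand up and balance
    ['kbk', 0.5],       # walk backward briefly
    ['kbf', 0],         # backflip when the assassin gets shocked
    ['ktrF', 2],        # trot forward
    ['d', 2],           # rest the servos at the end
]

if __name__ == '__main__':
    goodPorts = {}
    connectPort(goodPorts)        # probe every serial port for a Petoi robot
    for task in script:
        send(goodPorts, task)     # send the token, then wait the trailing delay
    closeAllSerial(goodPorts)     # close the ports when the play is over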
Once the workflow was well tuned, I could have Bittle replay the sequences repeatedly and shoot videos from different perspectives with a single camera. Bittle is the most patient actor and only complained about a low battery once!
I shot for about 4 hours to collect all the media resources. It took about 20 tries to get the dagger to drop in the best direction. The post-editing took even more time: searching for properly licensed background music, clipping the appropriate time windows, aligning the soundtrack with the motion, and making other tedious adjustments to achieve the best result.
The final video (at the beginning of this post) turned out quite cinematic. My friends loved it, and I even got likes from a few professional film directors who want to use Nybble or Bittle as the hero in their movies.
Below is another story between the Nybble cat and Sox.
Making a short video may look like a waste of time. However, I think it's a good demonstration of how much we have improved the user experience of OpenCat robots.
- We have made the Arduino Uno-based NyBoard V1_2 and the ESP32-based BiBoard V0 fully functional and open-sourced their code on GitHub.
- We have improved the code structure and unified the Arduino motion code for Nybble, Bittle, and an upcoming 12-DoF model. The Uno and ESP32 versions share the same code structure.
- On top of the motion base, we have implemented high-level controllers as a Python script and a Tkinter GUI for Mac, Windows, and Linux. They can upload firmware, calibrate the joints, and design skills. New skills developed with the Skill Composer or in simulation can be sent to the robot in real time to test their performance in reality. Gymnastic motions, such as climbing upside down and rolling on a high bar, can be developed with a few clicks in less than an hour.
- We developed a smartphone app for iOS and Android. It offers programmable buttons to configure and control the robot for competitions.
- We also implemented a MicroPython package (to be translated) that turns the WiFi dongle into a minimal standalone master computer. It also allows group control of multiple robots.
- We have made brief tutorials on Raspberry Pi and ROS integration with the Petoi robots.
- We keep revising the documentation center and adding tutorial videos to improve the user experience.
- We developed an introductory C++ curriculum based on Bittle.
- We have also improved the components and packaging a few times since our initial launch. We now offer various kits with pre-assembled modules, as well as fully assembled and tuned robots, to simplify the building and coding experience.
So far, we have sold about 10,000 Petoi robots worldwide through our online store and Amazon. User feedback is vital for iterating on product quality and features. Technology is only valuable when people other than its inventor can benefit from it. That's why I've been spending a lot of time polishing the boring bits of the released products. With the fundamental toolchain built over these two years, I hope more users can make full use of their Petoi robots. I look forward to seeing more creative projects on our forum!
~~
Rz and the Petoi Team