Today, we will introduce an innovative exoskeleton designed for controlling the Mercury X1 and Mercury B1 robotic arms. This exoskeleton is modeled after the structure of the human arm, allowing for intuitive and precise control.
The motivation behind developing this exoskeleton stems from the growing interest in fields such as deep learning and machine learning. These technologies enable robots to autonomously learn and perform human tasks, such as folding clothes or cooking. To achieve these capabilities, vast amounts of data must be collected, and a convenient way to control robotic arms is essential. The primary goal of designing this exoskeleton was to meet this need.
Project Overview
The project aimed to bridge human expertise with robotic precision by creating a wearable exoskeleton interface. The exoskeleton was specifically developed to control the Mercury X1 humanoid robot, allowing for intuitive manipulation of the robotic arm. This facilitated seamless teleoperation, mimicking the operator’s arm movements in real-time and capturing critical data for further analysis.
Product
Mercury X1
Developed by Elephant Robotics, Mercury X1 is an advanced humanoid robot designed to handle a variety of automated tasks. With 19 DOF (7-DOF per arm), it provides exceptional flexibility and adaptability during operations. Mercury X1 features a wheeled mobile base driven by high-performance direct-drive motors, ensuring stable movement in complex environments with up to 8 hours of battery life. Equipped with a high-performance main controller system powered by NVIDIA Jetson Xavier, Mercury X1 supports complex algorithms for visual ranging, sensor fusion, localization and mapping, obstacle detection, and path planning. The mobile base is outfitted with LiDAR, ultrasonic sensors, and a 2D vision system, enabling highly perceptive environmental interaction.
This innovative exoskeleton is specifically designed for remote control and data collection. Its structure consists of two arm components modeled after human anatomy and aligned with the dual-arm motion structure of the Mercury X1, allowing for precise imitation of human arm movements. Embedded in the center is an M5Stack Basic module, serving as the main control unit and providing robust computational power and flexible interfacing. The exoskeleton aims to offer a comfortable wearing experience while ensuring high-precision motion tracking and data collection to support robotic operations and machine learning training.
Initial Design Limitations
The first-generation exoskeleton was mounted on the operator’s shoulders. While this design appeared effective, it proved cumbersome during actual operation. Users had to raise their arms and elbows significantly to control the Mercury X1 properly, leading to discomfort and inefficiency.
Power and Connectivity Issues
The initial version required a direct connection to a charger, resulting in three or four cables connected to the device. This setup created a tangled and cluttered workspace.
Operator Fatigue
The physical strain of using the exoskeleton became apparent after just five minutes of operation, causing rapid fatigue among users. These drawbacks highlighted the need for substantial upgrades and led to the development of an improved version.
Our current design has resolved most of the previous pain points. The new exoskeleton features arms that are suspended in front of the chest and connected by a central rod, ensuring balance between both sides. The previous reliance on an external power supply has been replaced with a built-in rechargeable battery housed in the central M5Stack Basic module. This eliminates the need for multiple trailing chargers, significantly reducing clutter. Now, only a single cable connects the M5Stack Basic to the Mercury X1, transmitting data efficiently and simplifying the setup.
● Operating System: Ubuntu 20.04
● Programming Language: Python
● Libraries Used: pymycobot, threading, time, serial
This program works by reading data from the magnetic encoders on the exoskeleton and converting the raw encoder readings into joint angles that the robotic arm can interpret and execute. Despite being under 100 lines of code, this program achieves the essential functionality needed to control the robotic arm.
import time
import serial

# Serial connection and query frame for the left arm. The port, baud rate,
# and frame bytes below are placeholders -- adjust them to match your device.
ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)
hex_array_l = bytes([0xD5, 0x01, 0x5D])  # example query frame; device-specific

l_angle_list = [0.0] * 7  # left-arm joint angles in degrees
l_atom_list = [0] * 4     # button states and end-effector status bytes

def read_data():
    while True:
        try:
            # Request and read the left-arm data frame
            ser.write(hex_array_l)
            time.sleep(0.01)
            count = ser.in_waiting
            data = ser.read(count).hex()
            # A valid frame is 42 bytes (84 hex chars), framed by 0xD5 ... 0x5D
            if len(data) == 84 and data[0:2] == "d5" and data[-2:] == "5d":
                for i in range(7):
                    data_h = data[8 + i * 10: 10 + i * 10]
                    data_l = data[10 + i * 10: 12 + i * 10]
                    encode = int(data_h + data_l, 16)
                    # Map the encoder value (0-4095, centered at 2048)
                    # onto an angle in the range [-180, 180] degrees
                    if encode == 2048:
                        angle = 0
                    elif encode < 2048:
                        angle = -180 * (2048 - encode) / 2048
                    else:
                        angle = 180 * (encode - 2048) / 2048
                    l_angle_list[i] = angle
                # Decode the button bits and status bytes at the frame tail
                # (note: the status byte is hex, so base 16 is required here)
                button = bin(int(data[-10:-8], 16))[2:].rjust(4, "0")
                l_atom_list[0] = int(button[1])
                l_atom_list[1] = int(button[2])
                l_atom_list[2] = int(data[-6:-4], 16)
                l_atom_list[3] = int(data[-4:-2], 16)
                print("Left Arm Angles:", l_angle_list)
        except Exception as e:
            print("read error:", e)
            time.sleep(0.1)
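The encoder-to-angle mapping at the heart of this loop can be factored into a small standalone helper, which makes the conversion easy to verify in isolation. This is a sketch based on the conversion shown in the listing; the function name is ours, not part of the original program:

```python
def encoder_to_angle(encode: int) -> float:
    """Map a magnetic-encoder reading (0-4095, centered at 2048)
    to a joint angle in degrees within [-180, 180]."""
    if encode == 2048:
        return 0.0
    if encode < 2048:
        return -180.0 * (2048 - encode) / 2048
    return 180.0 * (encode - 2048) / 2048

print(encoder_to_angle(2048))  # 0.0 (neutral position)
print(encoder_to_angle(3072))  # 90.0
print(encoder_to_angle(1024))  # -90.0
```

Because 2048 counts span 180 degrees, each encoder step corresponds to roughly 0.088 degrees, which sets the angular resolution of the motion tracking.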
Building on this basic setup, joint parameters during specific movements can be recorded and used to train machine learning models, facilitating the collection of valuable data for subsequent algorithm development.
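One simple way to capture such joint data is to timestamp each sampled set of angles and append it to a CSV log that can later be fed into a training pipeline. The sketch below illustrates this idea; the file layout, function name, and sample values are our own illustration, not part of the original program:

```python
import csv
import time

def record_sample(writer, angle_list):
    """Append one timestamped row of the seven left-arm joint angles."""
    writer.writerow([time.time()] + list(angle_list))

# Example: log a header row plus two (synthetic) angle samples
with open("left_arm_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp"] + [f"joint_{i}" for i in range(7)])
    for sample in ([0.0] * 7, [10.5, -3.2, 0.0, 45.0, -90.0, 12.0, 5.0]):
        record_sample(writer, sample)
```

In a real session, `record_sample` would be called inside the read loop each time a valid frame is decoded, yielding a time series of joint trajectories for each demonstrated task.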
Looking ahead, we envision enhancing the exoskeleton with advanced features such as haptic feedback and adaptive motion algorithms to further bridge the gap between human control and robotic precision. These improvements would provide operators with a more immersive experience, offering real-time sensory feedback and allowing for more intuitive adjustments during operation. Additionally, we plan to expand the compatibility of the exoskeleton to work with a wider range of robotic platforms, enabling broader applications across various industries including manufacturing, healthcare, and research. This progression will support more extensive data collection and promote collaborative human-robot systems that can tackle increasingly complex tasks.