In most people's minds, robotic arms are typically used in industrial fields to perform repetitive tasks, assisting and replacing humans. However, robotic arms are not limited to just this; they can also become companion robots, providing us with more diverse interactive experiences.
Today, I want to explore some unique functions of robotic arms. By integrating facial expression recognition technology, we can enable robotic arms to perceive our emotional changes. When we are happy, the robotic arm can dance joyfully with us; when we are sad, it can come over to comfort us with a warm touch. This interaction based on facial feedback can make the robotic arm a better companion for us.
In the following sections, we will detail the working principles, technical implementation, and application scenarios of this system, showcasing what the robotic arm can do. The article is divided into three parts: an introduction to the robotic arm, a discussion of the technical points, and the project implementation. First, I will introduce the robotic arm I used.
Robotic Arm
myCobot 320 M5Stack
The myCobot 320 M5Stack/Raspberry Pi is a 6-DOF collaborative robotic arm that has become a highlight in the field due to its unique design and high-precision servo motors. This robotic arm boasts a maximum working radius of 350 mm and a maximum payload capacity of 1000 g at the end effector, making it suitable for a wide range of applications. The myCobot 320 M5Stack/Raspberry Pi not only supports flexible visual development applications but also offers in-depth mechanical motion analysis, providing users with 12 standard 24V industrial IO interfaces to meet various development needs.
It is highly open and compatible with most mainstream operating systems and programming languages, including Python and ROS, providing developers with great flexibility and freedom. Whether in education, research and development, or industrial applications, the myCobot 320 can offer strong support, making innovation and application development more convenient and efficient.
Camera Module
The camera module compatible with the myCobot 320 M5Stack can be mounted at the end effector of the robotic arm. Communicating over a USB data cable, the camera captures real-time images from the robotic arm's end effector, enabling the recognition of facial expressions.
pymycobot is a Python API for serial communication with and control of the myCobot robotic arm. The library makes it easy for developers to control the arm from Python: it provides a set of functions and commands for programmatically driving the arm's movements and behaviors. For instance, users can retrieve the arm's joint angles, send angle commands to move it, or read and send coordinate information. The only requirement is that it be used with the myCobot series of robotic arms, as it is specifically adapted for them.
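As a quick, minimal sketch of how the library is used (the serial port name and baud rate below are assumptions; adjust them for your machine), you can connect to the arm, read back its state, and send a simple angle command:
import time
from pymycobot.mycobot import MyCobot

# Connect to the arm over serial (port name is an assumption; adjust for your system)
mc = MyCobot("COM10", 115200)

# Read back the current state of the arm
print(mc.get_angles())   # joint angles of all six joints, in degrees
print(mc.get_coords())   # end-effector pose [x, y, z, rx, ry, rz]

# Send all joints back to zero at 50% speed
mc.send_angles([0, 0, 0, 0, 0, 0], 50)
time.sleep(3)  # give the arm time to finish the motion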
DeepFace is a powerful Python library for facial recognition and facial attribute analysis. It is based on multiple deep learning models such as VGG-Face, Google FaceNet, OpenFace, Facebook DeepFace, DeepID, and Dlib, providing functions like facial verification, facial detection, and facial attribute analysis (such as gender, age, race, and emotion). DeepFace simplifies complex facial recognition and analysis tasks through a straightforward interface, making it widely used in security systems, user authentication, and intelligent interactions.
I divided the project into two main functionalities:
1. Emotion Detection and Recognition: This primarily handles facial emotion recognition, capable of returning information about the current emotional state of a face—whether it is neutral, happy, sad, etc.
2. Robotic Arm Control: The main function of this part is to set the robotic arm's movements, such as coordinate control and angle control.
Implementation of Emotion Recognition Functionality
There are many sophisticated methods for facial emotion recognition available on GitHub. However, if one wants to develop a facial emotion recognition feature from scratch, the work can be divided into four steps:
1. Data Collection and Preprocessing
2. Model Selection and Training
3. Model Optimization and Testing
4. Deployment and Application
Starting from zero involves many steps. If we simply want to use an existing solution and do not need to tailor it to a specific application scenario, we can choose a pre-trained detection model and use it directly. Currently available options include OpenCV, FER (Facial Expression Recognition), DeepFace, and the Microsoft Azure Face API.
For this project, we will use DeepFace for emotion recognition.
Environment Setup
First, set up the environment. Make sure your OpenCV version is not too old, as an outdated version can cause problems.
pip install deepface
pip install opencv-python
DeepFace offers many functionalities such as age detection, gender detection, and emotion detection. For this project, we will use emotion detection, specifically the "facial_expression_model_weights.h5" model, which will be automatically downloaded during the process.
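If you prefer the weights to be downloaded before the first analysis call, DeepFace lets you build the emotion model explicitly. A minimal sketch (the cache location mentioned in the comment is the library's usual default and may differ on your system):
from deepface import DeepFace

# Building the "Emotion" model ahead of time triggers the download of
# facial_expression_model_weights.h5 (typically cached under ~/.deepface/weights)
emotion_model = DeepFace.build_model("Emotion")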
Example Usage
Here is a simple example of how to use it on a single image.
import cv2
from deepface import DeepFace

# Path to the image to analyze (placeholder; replace with your own file)
image_path = "face.jpg"

# Read the image
image = cv2.imread(image_path)

# Analyze facial expressions in the image
results = DeepFace.analyze(image, actions=['emotion'], enforce_detection=False)
print(results)
Example Output:
[{'emotion': {'angry': 81.24255537986755, 'disgust': 16.530486941337585, 'fear': 1.6193315386772156, 'happy': 6.932554015293135e-05, 'sad': 0.4116043448448181, 'surprise': 0.1861470052972436, 'neutral': 0.009808379400055856}, 'dominant_emotion': 'angry', 'region': {'x': 136, 'y': 65, 'w': 124, 'h': 124, 'left_eye': None, 'right_eye': None}, 'face_confidence': 0.9}]
This output shows the probabilities of various emotions and identifies the dominant emotion, along with the region of the detected face.
From the returned data, we can see that the 'angry' emotion accounts for 81%, indicating that the current expression is angry. This is only the detection of a single image; to verify its accuracy, we should test it with several more images, as in the sketch below.
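As a quick sanity check, here is a minimal sketch that runs the same analysis over every image in a folder and prints the dominant emotion for each. The folder name "test_faces" is a placeholder:
import os
import cv2
from deepface import DeepFace

# Folder of test images (placeholder name; replace with your own folder)
folder = "test_faces"

for filename in os.listdir(folder):
    image = cv2.imread(os.path.join(folder, filename))
    if image is None:
        continue  # skip files that are not readable images
    results = DeepFace.analyze(image, actions=['emotion'], enforce_detection=False)
    dominant = results[0]['dominant_emotion']
    score = results[0]['emotion'][dominant]
    print(f"{filename}: {dominant} ({score:.1f}%)")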
For continuous detection, we need to activate the camera to continuously analyze the video feed. This means processing a series of images as a video stream.
Here's how you can achieve real-time emotion detection using a webcam:
import cv2
from deepface import DeepFace

# Open the default webcam
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    print("Error: Could not open webcam.")
    exit()

while True:
    ret, frame = cap.read()
    if not ret:
        break
    try:
        # Analyze the current frame and overlay the dominant emotion
        result = DeepFace.analyze(frame, actions=['emotion'], enforce_detection=False)
        emotion_info = result[0]['emotion']
        dominant_emotion = result[0]['dominant_emotion']
        emotion_probability = emotion_info[dominant_emotion]
        text = f'{dominant_emotion}: {emotion_probability:.2f}%'
        cv2.putText(frame, text, (50, 50), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2, cv2.LINE_AA)
        cv2.imshow('Emotion Detection', frame)
    except Exception as e:
        print(f"Error analyzing frame: {e}")
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
Implementation of Robotic Arm Control
In this section, I will introduce how to control the myCobot 320 robotic arm using the pymycobot library. First, let's set up the environment:
pip install pymycobot
Below are some common control methods for the mycobot robotic arm:
Angle Control
Method: `send_angles(degrees, speed)`
Function: Send angles to all joints of the robotic arm.
Parameters:
● `degrees` (List[float]): Contains the angles for all joints. For a six-axis robot, the list length is 6, and for a four-axis robot, the length is 4. Example: `[20, 20, 20, 20, 20, 20]`
● `speed` (int): Represents the speed of the robotic arm's movement, ranging from 0 to 100.
Example:
from pymycobot.mycobot import MyCobot
mc = MyCobot("com10", 115200)
mc.send_angles([0, 0, 0, 0, 0, 0], 100)
Angle control is suitable for adjusting each joint's angle for fixed position control, but it has limited applications. Next, we will look at coordinate control, which is more precise and commonly used.
Coordinate Control
Method: `send_coords(coords, speed, mode)`
Function: Send an overall coordinate and posture, moving the robotic arm's end-effector from its current position to the specified point.
Parameters:
● `coords` (List[float]):
● Six-axis: Coordinates in the form `[x, y, z, rx, ry, rz]`, length of 6.
● Four-axis: Coordinates in the form `[x, y, z, rx]`, length of 4.
● `speed` (int): Represents the speed of the robotic arm's movement, ranging from 0 to 100.
● `mode` (int): Values can be 0 or 1.
● `0`: The end-effector follows a non-linear path; the controller plans the trajectory freely while moving to the specified point and maintaining the prescribed posture.
● `1`: The end-effector follows a linear path; the controller plans a straight-line move to the specified point.
Example:
from pymycobot.mycobot import MyCobot
mc = MyCobot("com10", 115200)
mc.send_coords([100, 20, 30, -50, 60, -100], 100, 1)
Creating a Robot Arm Controller Class
To enhance readability and maintainability, create a `RobotArmController` class for easy invocation and modification, and predefine the corresponding actions.
Class Definition:
from pymycobot.mycobot import MyCobot

class RobotArmController:
    def __init__(self, port):
        # Initialize the serial connection
        self.mc = MyCobot(port, 115200)
        self.init_pose = [0.96, 86.22, -98.26, 10.54, 86.92, -2.37]
        self.coords = [-40, -92.5, 392.7, -92.19, -1.91, -94.14]
        self.speed = 60
        self.mode = 0

    def SadAction(self):
        # Define the actions for when the detected emotion is sad
        self.mc.send_angles([10, 20, 30, 40, 50, 60], self.speed)
        # Add more actions as needed

    def HappyAction(self):
        # Define the actions for when the detected emotion is happy
        self.mc.send_coords([100, 20, 30, -50, 60, -100], self.speed, self.mode)
        # Add more actions as needed
Example Usage:
# Initialize the controller
robot_arm = RobotArmController("com10")

# Perform actions based on detected emotions
emotion = 'happy'  # This would come from the emotion detection logic
if emotion == 'happy':
    robot_arm.HappyAction()
elif emotion == 'sad':
    robot_arm.SadAction()
This setup keeps control of the myCobot robotic arm clear, organized, and easy to modify, so the arm can respond appropriately to detected emotions. Putting the two parts together, the detection loop simply passes the dominant emotion to the controller, as sketched below.
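To show how the two parts fit together, here is a minimal sketch of the full loop, assuming the RobotArmController class above is defined in the same file; the serial port name and the 60% confidence threshold are my own assumptions:
import cv2
from deepface import DeepFace

# Assumes RobotArmController from the section above is defined in this file
robot_arm = RobotArmController("com10")  # port name is an assumption

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    try:
        result = DeepFace.analyze(frame, actions=['emotion'], enforce_detection=False)
        dominant = result[0]['dominant_emotion']
        confidence = result[0]['emotion'][dominant]
        # Only react when the model is reasonably sure (threshold is an arbitrary choice)
        if confidence > 60:
            if dominant == 'happy':
                robot_arm.HappyAction()
            elif dominant == 'sad':
                robot_arm.SadAction()
        cv2.imshow('Emotion Detection', frame)
    except Exception as e:
        print(f"Error analyzing frame: {e}")
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()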
When I'm angry
When I'm happy
As technology advances at an ever-increasing pace, we can expect to see intelligent humanoid robots in the future. Coupled with AI models like ChatGPT, these robots might one day assist people with their troubles and even serve as therapists for those with psychological issues. The future of technology holds exciting possibilities, and it will be fascinating to see how these innovations evolve and improve our lives.