I built a cloud-controlled delta robot to draw coded shapes. Since my drawing skills are poor, I used servo motors, an Arduino Uno, and computer vision to track the robot’s movements.
Why Hybrid?
Because it uses solid links as well as flexible cables.
Circuit Schematic / Diagram
The circuit has a left servo motor and a right servo motor. Each servo's ground, power, and data (PWM) wires connect to the microcontroller on the Arduino Uno prototyping board.
CAD Model
In the CAD model I added two servo motors, attached a solid link to each, and ran a cable (thread) between the links. Both servo motors are controlled by the Arduino Uno R3 prototyping board.
My Old Delta Robot
I previously built a delta robot with all solid links that operated horizontally; this one is vertical and uses flexible cables.
Use of Colored Balls
Colored balls are attached at the joints of the robot so the computer vision software can easily distinguish and track each joint. The software detects each ball and draws a bounding box around it. This makes it possible to follow the movement of the end effector and to detect the shape the robot draws. Later, the same vision pipeline could also be used to control the robot.
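The project uses a computer-vision library for the actual color detection; as a minimal sketch of just the bounding-box idea, here it is in pure Python, assuming we already have the pixel coordinates classified as belonging to one colored ball (the function name and data layout are my own, for illustration):

```python
# Sketch: given the (x, y) pixel coordinates detected as one colored ball,
# compute the axis-aligned bounding box and its center. The center is what
# we would track frame-to-frame to follow one joint of the robot.
def bounding_box(pixels):
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    center = ((x_min + x_max) / 2, (y_min + y_max) / 2)
    return (x_min, y_min, x_max, y_max), center

# Example: a small blob of detected pixels
box, center = bounding_box([(10, 20), (14, 22), (12, 25)])
print(box, center)  # (10, 20, 14, 25) (12.0, 22.5)
```

In the real pipeline a library such as OpenCV would produce the detected pixels (e.g. by color thresholding) and draw the box on the video frame.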
Arduino IoT Cloud
I also integrated IoT control, allowing me to move the robot remotely from an Arduino Cloud dashboard. The dashboard has two sliders, one for each motor; changing a slider's value moves the corresponding motor by a proportional angle. This lets me control the robot from anywhere.
Total Operational Area of the Robot (Workspace)
This is not an articulated robot arm, so it has a limited area of operation, or workspace: the total area within which the robot's end effector can reach. With two motors, the workspace boundary is set by the positions where either motor (or both) reaches its extreme angle, so everything we draw must stay within those limits.
The green line marks the robot's workspace. Within this area the end effector can reach any point; what we can actually draw comes down to software. After mapping out the workspace, I used inverse kinematics to convert shape coordinates into motor angles.
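As a sketch of the inverse-kinematics step under an assumed geometry (two motors at fixed anchor points winding cables to the end effector; the anchor spacing and spool radius below are illustrative values, not the real robot's dimensions):

```python
import math

# Assumed geometry for illustration: each motor sits at a fixed anchor and
# winds a cable to the end effector, so the cable length determines how far
# the motor must rotate.
LEFT_ANCHOR = (0.0, 0.0)
RIGHT_ANCHOR = (30.0, 0.0)   # anchors 30 cm apart (assumed)
SPOOL_RADIUS = 1.0           # cm of cable wound per radian of rotation (assumed)

def cable_lengths(x, y):
    """Distance from each anchor to the end-effector position (x, y)."""
    left = math.hypot(x - LEFT_ANCHOR[0], y - LEFT_ANCHOR[1])
    right = math.hypot(x - RIGHT_ANCHOR[0], y - RIGHT_ANCHOR[1])
    return left, right

def motor_angles_deg(x, y):
    """Map a target position to the two motor angles, in degrees."""
    left, right = cable_lengths(x, y)
    return (math.degrees(left / SPOOL_RADIUS),
            math.degrees(right / SPOOL_RADIUS))
```

A point midway between the anchors, e.g. `(15, 20)`, gives equal cable lengths and therefore equal motor angles; points nearer one anchor shorten that motor's cable.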
Controlling With a Custom-Built Web Server
I developed a web interface where a user draws a shape on a phone, which the robot then replicates in real life.
The workflow: I open the webpage on a mobile phone and draw a shape. The shape's coordinates are sent to a computer running a Python web server, which forwards the commands serially to the Arduino Uno prototyping board. The microcontroller calculates the proper servo angles and drives the servos, which move to draw the same shape.
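As a sketch of the computer side, assuming a simple one-line-per-point text protocol of my own invention (the real command format the Arduino sketch expects may differ), the web server could pack each coordinate into a line before writing it to the serial port:

```python
# Sketch: turn a list of (x, y) points from the webpage into serial commands.
# The "X..,Y..\n" line protocol is an assumption for illustration; the
# Arduino sketch would parse each line and run its inverse kinematics.
def to_serial_commands(points):
    return ["X{:.1f},Y{:.1f}\n".format(x, y) for x, y in points]

commands = to_serial_commands([(10, 5), (12.3, 7.5)])
print(commands)  # ['X10.0,Y5.0\n', 'X12.3,Y7.5\n']

# On the real system the Python web server would write these out over the
# serial port, e.g. with pyserial:
#   ser = serial.Serial("/dev/ttyACM0", 9600)
#   for cmd in commands:
#       ser.write(cmd.encode())
```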
Drawing Pre-Programmed Coded Shapes
I have written code that generates the coordinates for these shapes.
These coordinates are then converted into angles for both servo motors; this conversion is the inverse kinematics step.
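As a minimal sketch of the coordinate-generation step (the center, radius, and point count are made-up values, not the robot's real workspace), a circle can be sampled like this, with each point then handed to the inverse-kinematics routine:

```python
import math

# Sketch: sample n points around a circle; each (x, y) point would then be
# converted to servo angles by the inverse-kinematics step.
def circle_points(cx, cy, radius, n=36):
    return [(cx + radius * math.cos(2 * math.pi * i / n),
             cy + radius * math.sin(2 * math.pi * i / n))
            for i in range(n)]

pts = circle_points(15.0, 20.0, 5.0, n=4)
# n=4 yields the four cardinal points of the circle:
# (20, 20), (15, 25), (10, 20), (15, 15), up to floating-point error
```

Other shapes work the same way: generate a list of (x, y) waypoints, then feed each one through the inverse kinematics to get the two servo angles.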
Future Applications
By adding a third motor, it can be transformed into a 3D delta robot capable of industrial applications like pick-and-place tasks and 3D printing.