Automated Guided Vehicles (AGVs) are being used extensively for tasks which are too hazardous or repetitive for humans to accomplish. Imagine managing inventory in a huge warehouse or individually taking sensor readings in an oil rig.
When these repetitive (and sometimes dangerous) tasks are assigned to people, it becomes difficult for them to acquire the vital skills that would help them advance their careers. Automating tasks like these with robots creates opportunities for skilled labor such as engineers, designers, maintenance workers and many more.
As a matter of fact, this aligns with the 9th sustainable development goal of the United Nations, which is to promote sustainable industrialization and foster innovation.
In this project, two line following robots that use Particle's new mesh networking technology showcase the role of IoT in the field of industrial automation. These robots collect environmental data along their path and transmit it to the cloud, where it can then be visualized.
Check the end of this article for the video!
Particle Mesh devices create local wireless mesh networks that other devices can join. They help collect sensor data, exchange local messages, and share their connection to the cloud. Currently there are three mesh compatible boards.
In summary, you can create local networks using the Argon or Boron boards, which act as gateways to the cloud. Xenon devices can be used as mesh endpoints or repeaters (to increase the range of the mesh network). Read more on Particle's official page.
For this project, I used two Argon boards (ideally you'd use an Argon and a Xenon, but I already had a second Argon with me). To use it in a Particle mesh, I had to configure one Argon as a mesh endpoint using the Particle CLI (more on that later).
The automated guided vehicle used in this project is a 25 cm x 25 cm line following robot. To be honest, it's not an AGV that you'd see in a huge factory, but the principles of operation are similar. Instead of the magnetic tape commonly used as a guide for industrial grade AGVs, we'll be using a white line on a black surface.
I have built the two rovers in slightly different ways (such as using motors without encoders for Rover 2). You can pick the code that most closely matches the hardware configuration of your robot. Refer to the pin definitions in the code to get an idea of the wiring.
Here are some basic tips to follow when picking the required hardware for your robot. I am assuming that you have the required skills to assemble these components and get the robot working.
Motors
Your motors directly affect the performance of your robot. Pick one with too little torque, and you'll be treated to the whining sound of stalled motors while your robot desperately tries to get its wheels spinning. If you're planning to use these components for future competitions/projects, dish out some extra cash and buy a motor with a quadrature encoder. An encoder provides positional information, which you can use to see how far your wheels have turned.
Motors used for this project : DC 6V 210RPM Encoder Motor Set DC Gear Motor with Mounting Bracket and Wheel. (What a mouthful) - $19 for 2 pcs.
Pros and Cons : Good gear motors with okay-ish torque. Comes with encoders, wheels, brass couplings and mounting brackets. Unbeatable for the price. (Same motors used in this tutorial). Only downside is that you'll have to use at least 70% of your PWM range to get these moving at a decent speed (while carrying the weight of your robot). This leaves a limited range for control.
Tip : If your motors appear to be stuck, check whether the magnetic disc of the encoder is rubbing against the hall effect sensor.
Motor Driver
I used a VNH2SP30 motor driver from eBay, which is capable of handling 14 A of continuous current. This is well above the actual stall current of the motors I picked.
If you're looking for other options, the popular L293D is a good place to start. Pololu also has some good quality motor drivers that you can use for future projects as well.
Microcontroller
An Arduino Mega is used to drive the robot. It offers plenty of pins and is relatively cheap. The Particle device connects to the robot as an I2C slave.
Although the Argon could be used to control the robot itself, I decided to let it handle the message sending part. My hope was to show how Particle devices could be integrated into existing robotic platforms to collect data.
Line Following Sensor Array
If you're up to it, you can create your own sensor array, or buy a cheap one off of eBay, but nothing comes close to the Pololu QTR-8RC. The Pololu library provides plenty of useful functions to calibrate the array and easily get readings.
For $9, this thing will save you a lot of time.
Miscellaneous Parts
Aside from the basic components, I also used a buzzer, LEDs, a buck converter (to step down the voltage from the battery), a 3S LiPo battery, switches, prototyping board and wires.
Designing the Robot
The frame of the robot was drawn in SolidWorks and cut from a piece of 3 mm acrylic. You can use any CAD tool (such as SketchUp or Fusion 360) and hand the designs (usually in DXF or PDF format) over to a local laser cutting business. (I noticed that businesses which create sign boards and panels usually have a laser cutter and spare acrylic.)
I strongly recommend that you design your own base, since it depends on the hardware you're using. (However, I have attached my designs in the GitHub repo.)
Design Considerations :
- If possible, place two IR sensor modules at the far ends of the robot. This will help you to identify line intersections easily. I have used two Pololu QTR-1RC modules for this.
- Use castor wheels to balance the robot.
- Make sure your wheels have good traction with the floor to prevent them from slipping.
In order to create the guided path for the robots to follow, I first applied a large matte black sticker to a piece of MDF board. I then used 3 mm strips of white stickers to create the guides. (You can make some small changes in the code and use a white arena with black lines made with insulation tape.)
The code is written so that the robot takes readings at every dead end. This, however, can be changed to your liking.
Here, the blue numbers indicate the data collection points for Rover 1, and the green numbers indicate the data collection points for Rover 2.
S1 and S2 are the starting points of Rovers 1 and 2.
Both rovers reach end point E, and turn around to reach their starting points again.
Design Considerations :
- It is important that you design a maze with NO LOOPS, since we will be using the left-hand rule to traverse it.
- The white squares are used for starting and ending points.
The Pololu QTRC library contains a neat little function that allows you to calibrate your sensor panel for your arena. The included code performs this calibration, which is done by placing the robot on each surface and waiting for two beeps.
Line Following
The robots use the QTR-8RC sensor panel to detect the white line. The helper functions written for the sensor panel read each IR sensor and update a value with the current line position (ranging from 0-5000). They also update an array called sensorRead, which allows us to compare the readings of each sensor. A PID loop is used to keep the robot centered on the line when travelling. In the code, I have selected the left motor to be the master and the right motor as the slave. I have also written some helper functions that allow changing the direction/speed of the motors easily.
Below is a part-by-part explanation of the MaintainLineMovement() function, which contains the PID loop.
char MaintainLineMovement() { //Only run inside nodetravel. Left only and Right only turns are not considered as Nodes
boolean found_left = 0; //these variables are used to keep track of intersections
boolean found_straight = 0;
boolean found_right = 0;
SetMasterPower(MasterPower); SetSlavePower(SlavePower); //apply speed to motors
QTRCRead(); //read the sensor panel
error = 2500 - (int)position; //calculate our error
int delta = error - lasterror;
int change = (error * Kp) + (delta * Kd); //+ (sumerr * 0.0028); // (error * 1/7) + (delta * 4)
lasterror = error;
sumerr += error;
MasterPower = MaxPower + change;
MasterPower = constrain(MasterPower, 0, 255); //calculate and constrain the left motor speed
SlavePower = MaxPower - change;
SlavePower = constrain(SlavePower, 0, 255); //calculate and constrain the right motor speed
The P, I and D gains will have to be tuned for your robot.
The next part of this function deals with intersection detection.
Intersection Detection
Given a maze with no loops, there are only 8 possible situations that the robot can encounter.
The remaining part of the MaintainLineMovement() function is responsible for detecting these.
if (sensorRead[6]) { // 90 degree line on left
found_left = 1;
}
if (sensorRead[7]) { // 90 degree line on right
found_right = 1;
}
if (found_left || found_right) { // if we have found a left OR right line, move a little bit more
//run for a wee bit
SetMasterPower(150); SetSlavePower(150);
MasterENC.write(0); SlaveENC.write(0);
while (MasterENC.read() < OVERSHOOT_TICKS && SlaveENC.read() < OVERSHOOT_TICKS) {
QTRCRead();
if (sensorRead[6]) {
found_left = 1;
}
if (sensorRead[7]) {
found_right = 1;
}
}
setBrakes(255, 255);
QTRCRead();
if (SensorPanelVal == 255) { //If the entire panel reads a white floor after crossing the line, we have encountered a white box
nodeflag = 1;
return 'E';
} else if (sensorRead[0] || sensorRead[1] || sensorRead[2] || sensorRead[3] || sensorRead[4] || sensorRead[5]) { //Front line
found_straight = 1;
}
if (found_left && !found_straight && !found_right) { //intersections with just a single left or right turn are not considered nodes
setBrakes(255, 255);
delay(200);
linedegreetravel('L', TURNSPEED);
delay(200);
MasterPower = MaxPower; SlavePower = MaxPower;
} else if (found_right && !found_straight && !found_left) {
setBrakes(255, 255);
delay(200);
linedegreetravel('R', TURNSPEED);
delay(200);
MasterPower = MaxPower; SlavePower = MaxPower;
} else if (found_left && (found_straight || found_right)) {
setBrakes(255, 255);
nodeflag = 1;
return 'L';
} else if (found_straight && !found_left) {
setBrakes(255, 255);
nodeflag = 1;
return 'F';
}
} else if (SensorPanelVal == 0) { //If we are at a dead end, ask particle device to send data
setBrakes(255, 255);
nodeflag = 1;
Wire.beginTransmission(9); // transmit to Particle Device
Wire.write(node); // sends node information
Wire.endTransmission(); // stop transmitting
node++;
delay(200);
return 'B';
} else {
return 'W';
}
}
Setting Up Particle Devices
Now that we have two working rovers, we can set up our Particle boards. You can use the excellent Android/iOS app provided by Particle to register your new board, and even update its firmware.
Link (Android) : https://play.google.com/store/apps/details?id=io.particle.android.app&hl=en
(iOS) : https://apps.apple.com/us/app/particle-iot/id991459054
Setup is a very straightforward process, and once done, you can begin programming using the Particle Web IDE.
Since I had two Argons, I had to use the Particle CLI to configure one as a mesh endpoint. The Particle CLI can be downloaded from the link below.
https://community.particle.io/t/particle-cli-installer-windows/23741
After installing, simply plug in the Argon you wish to use as a mesh endpoint, and type:
particle mesh add <new_device> <assisting_device>
In my case, I set up one Argon as rover1 and the other as rover2. I created a network called RoverNet for rover1, then added rover2 to that network with:
particle mesh add rover2 rover1
Using the Web IDE
The Particle Web IDE is hands down my favorite feature. You can easily write your code, add libraries and then wirelessly upload the code to the selected board. This was a major time saver, since I could easily make changes to my code without having to take anything apart!
I connected a DHT22 sensor to one board and a gas sensor to the other. (Gas sensors run on 5 V, so make sure to add a voltage divider before connecting the output to a 3.3 V analog pin.)
Rover1 Code:
#include <Wire.h>
int node = -1; //initialize the node variable. We're using -1 since the robot hasn't started yet
int prev_node = -1; //Readings are taken when node != prev_node
#include "Adafruit_DHT.h"
#define DHTPIN 2 // what pin we're connected to
#define DHTTYPE DHT22 // DHT 22 (AM2302)
DHT dht(DHTPIN, DHTTYPE);
void setup() {
Wire.begin(9); //Enable i2c with address 9
Wire.onReceive(receiveEvent); // Attach a function to trigger when something is received.
dht.begin();
}
void receiveEvent(int bytes) {
node = Wire.read(); // get node value from master upon receive event
}
void loop() {
if (node!=prev_node){
float t = dht.getTempCelcius(); //get Celsius reading
Particle.publish("node1", String(node) , 60, PUBLIC); //publish node value to cloud
char eventData[10];
sprintf(eventData, "%.2f", t); //convert float value to string so we can send it
Particle.publish("temperature", eventData, 60, PUBLIC); // publish temperature to cloud
prev_node = node;
}
}
Rover2 Code:
#include <Wire.h>
int node = -1;
int prev_node = -1;
byte started = 0;
void setup() {
Wire.begin(9);
Wire.onReceive(receiveEvent); // Attach a function to trigger when something is received.
}
void receiveEvent(int bytes) {
node = Wire.read(); // read one character from the I2C
}
void loop() {
if (node!=prev_node){
int g = analogRead(A5);
Particle.publish("node2", String(node) , 60, PRIVATE);
Particle.publish("gas", String(g), 60, PRIVATE);
prev_node = node;
}
}
After doing some test runs, I was able to view the received data from the event log.
In order to visualize the data from my rovers, I first tried the Webhooks integration with ThingSpeak. However, ThingSpeak has a 15 second interval between messages for the free version, so I decided to use Node-Red from my computer.
Download the latest 8.x LTS version of Node.js from the official Node.js home page. It will offer you the best version for your system.
After installing Node.js, you should be able to install Node-RED using npm:
npm install -g --unsafe-perm node-red
For more information, refer to the official getting started guide.
https://nodered.org/docs/getting-started/windows
After installing node-red, you will have to install two additional packages. One is node-red-dashboard and the other is node-red-contrib-particle, which contains the flows for integration with Particle Cloud.
npm install node-red-contrib-particle
npm install node-red-dashboard
Now, from your cmd, you should be able to launch Node-RED using:
node-red
Configuring Flows for Particle Cloud
The SSE node can be used to connect to the Particle cloud and subscribe to events. Drag and drop the SSE node into your main flow, and double-click on it to edit its settings.
Click on the edit button next to the Cloud field.
The access token can be found in your Particle Web IDE > Settings (gear icon).
If everything is set up okay, the SSE node will indicate that it has connected successfully.
The flow for my dashboard is as follows.
I have used two charts for the temperature and gas sensor outputs, and two text fields to indicate the number of nodes (dead ends).
The set msg.payload node extracts the data field from the message payload received from the Particle cloud.
After configuring all the flows properly, the data can be visualized. (Don't forget to deploy the flows!)
Watch the video below to see the Particle powered Rovers in action!
Conclusion
This project is meant to simulate two AGVs working in a hazardous environment. The dead ends of the white lines are the vital data collection points. Users are able to observe the gathered data from a safe distance.
I also wanted to test the integration of Particle devices with existing hardware. The same setup could be integrated with complex sensors in factories (using protocols like MODBUS) to collect data and publish it to the cloud.
Particle Mesh is a great technology that allows you to create interconnected nodes that talk to each other and the cloud very easily. I should also mention the ability to remotely upload firmware, which is a major time saver! (Especially when you have a large number of devices.)
As a final note, I would like to stress the importance of robotics and automation for sustainable industrialization. By automating dangerous and repetitive tasks, we are creating more opportunities for people to move from low-skilled jobs to high-skilled ones. Hurrah for the robot revolution!