Quarantine, lockdown and isolation pose a big challenge for apartment residents: common access areas, doorways and lifts are the only paths residents have to the outside world. We can stay indoors all the time, but we will run out of food, and collecting deliveries of necessities still requires residents to use the common facilities. Residents worry that they may have shared the building, a walkway or a lift with an unsuspecting Covid-19 carrier. Frequent cleaning and disinfection of surfaces likely to be contaminated with pathogens and viruses, including those in close proximity to residents, is part of the new-norm procedures for building management. This adds resource and cost constraints, and poses a risk to the cleaner too. Property management, with services covering cleaning, security, lift maintenance and utility supply, helps in the effort to battle the spread of Covid-19 in stratified buildings. However, the frequency and coverage of the cleaning schedule is questionable: a lift button or a door knob needs just one touch from an infected person to pass the virus unknowingly to the next person in between cleaning rounds. This robot addresses that gap between cleanings, and even the absence of sanitization.
Click for Infographic:
Truth about Scheduled Cleaning and the New Norm during and Post Pandemic
Another aspect of protecting high-rise residents is delivery activity, which risks exposing either the resident or the item to infection during handover. What if we could continue to stay at home and yet access the common area to retrieve our favorite pizza delivery safely and securely? An avatar robot to the rescue: autonomous self-sanitization and self-cleaning for the gaps between scheduled cleanings is essential. Navigation is done using low-cost color detection, color sequencing and QR codes for precise localization. This is where SanitizationOnDemand (S.o.D) helps, and it eliminates risk to the cleaner too.
Click Infographic for Robot S.o.D Flow Stages
The building materials for this S.o.D robot are sourced from general stores and easily accessible recyclables: an old bubble jet printer, packing materials, CD-ROM casings, a USB humidifier atomizer and more. The items that need to be purchased are mainly the electronics, for example a Raspberry Pi, Arduino Unos, jumper wires, battery holders, aluminium T-slot, DC motors, servos, a webcam and mecanum wheels. During development, care is taken to use the Arduino's quick-connect jumper-wire environment as much as possible and to limit soldering, as this makes the robot easier for others to duplicate and build. Some wire connections may come loose from movement vibration, so they are mechanically supported with good casings. The few connections with no ready-made wiring, like the I2C bus connector, are built and demonstrated step by step.
Let's dive in and see how to build this S.o.D robot that will help protect high-rise building residents during this pandemic! But before that, check out the videos below... Enjoy!
Overview of Robot Sanitizer On Demand with Delivery Secured

Application 1 - Autonomous Navigation and S.O.D
Application 2 - Autonomous Navigation on QR Code, Lift Button Actuator and Sanitization
Application 3 - Radio Remote Control with FPV Mode
Application 4 - Safe and Secure Delivery

Part 1. Robot Frame, Electronics Housing & Compartment Design

The robot frame is made entirely from 20x20 T-slot aluminium: 1.2 m tall, 0.56 m wide and 0.4 m long. Four DC motors (110 RPM at 14 kgf-cm) on mecanum wheels provide mobility. The robot has an adequate payload of up to 50 kg; the current robot weight is kept to about 10-15 kg for longer battery life. Battery life ranges from about 5-7 hours on both the power bank and the LiPo battery before the next charge.
The robot has a modular design that allows add-on compartments of any height, varied shelving, and additional contraptions needed for the sanitization application or other functions. The compartment shelves are built from packaging materials like polystyrene, which is lightweight with minimal rigidity. A more rigid material such as acrylic can be used, but that would add cost to the project.
High-density polystyrene packaging with a 1-inch wall thickness is preferable. You will need a hot wire cutter and polystyrene glue. Once a suitable piece is identified and cut, the box is mounted onto the T-slot via a groove-slot method: the middle of the polystyrene side wall is cut with an 18 mm wide groove for slotting onto the straight T-slot aluminium pillar. The slightly undersized groove fixes securely onto the 20 mm wide T-slot.
Another shelf is also adjustable; however, its groove was not quite a tight fit and would creep down. Note the L-angle bracket added to hold it in position.
Two CD-ROM metal casings (with all the contents removed and kept for other uses) house the Arduino, Raspberry Pi, motor drivers and all other electronics. Layered foam, also from TV packaging, is carved out with board indentations so the boards fit tightly and do not move around when the robot is in motion.
This is the main propulsion for the robot. 360-degree mecanum wheels are used for high-mobility movement. Their orientation must be set properly to achieve all-direction mobility.
I used two pairs of mecanum wheels with four DC motors hooked up to 30 A DC motor drivers. The wheels are mechanically rated for a maximum load of 28 kg.
The MDD10A is a low-cost and powerful DC motor driver capable of dual DC motor control with PWM input from the Arduino. Two of these motor drivers are used.
Below is the full schematic of the two motor drivers' connection to the Arduino. A FlySky FS-iA6B radio receiver is used to receive commands from a FlySky radio transmitter. This allows manual control on top of the autonomous control from the Raspberry Pi 3B+.
The radio receiver's PWM outputs are connected to Arduino input pins for pulse measurement. The pulse duration is measured, which lets the receiver's control signals be used for speed, turning left or right, strafing right or left, and many other degrees of movement.
Click for Github Code
if (rf_enable == 1) {
  // Fly Sky radio receiver
  duration1 = pulseIn(pinPWM1, HIGH, timeout); // 1st joystick up-down
  chn1 = map(duration1, lowPWM, highPWM, 0, 100);
  duration2 = pulseIn(pinPWM2, HIGH, timeout); // 2nd joystick left-right
  chn2 = map(duration2, lowPWM, highPWM, 0, 100);
  duration3 = pulseIn(pinPWM3, HIGH, timeout); // 3rd joystick up-down
  chn3 = map(duration3, lowPWM, highPWM, 0, 200);
  duration4 = pulseIn(pinPWM4, HIGH, timeout); // 4th joystick left-right
  chn4 = map(duration4, lowPWM, highPWM, 0, 100);
  duration5 = pulseIn(pinPWM5, HIGH, timeout); // 5th channel, variable trim B
  chn5 = map(duration5, lowPWM, highPWM, 0, 100);
  duration6 = pulseIn(pinPWM6, HIGH, timeout); // 6th channel, variable trim C
  chn6 = map(duration6, lowPWM, highPWM, 0, 100);
}
The radio signal pulses are then used to generate the appropriate movement commands to the motor driver. For example, below is joystick 1, which produces a symmetrical signal around the 50 mark of the mapped pulse value. A hysteresis of +-2 is added to avoid jittering the motor driver when the joystick returns to its neutral position.
// Move forward / backward radio control
if (chn2 > 52) {
  Speed1 = Speed + chn3 + chn2 - 52; // Chn3 & Chn2 radio signal to speed
  Speed2 = Speed + chn3 + chn2 - 52;
  Speed3 = Speed + chn3 + chn2 - 52;
  Speed4 = Speed + chn3 + chn2 - 52;
  //Serial.println("Forward");
  usMotor_Status = CW;
  motorGo(MOTOR_1, usMotor_Status, Speed1);
  motorGo(MOTOR_2, usMotor_Status, Speed2);
  motorGo(MOTOR_3, usMotor_Status, Speed3);
  motorGo(MOTOR_4, usMotor_Status, Speed4);
}
else if (chn2 < 48) {
  Speed1 = Speed + chn3 + 48 - chn2;
  Speed2 = Speed + chn3 + 48 - chn2;
  Speed3 = Speed + chn3 + 48 - chn2;
  Speed4 = Speed + chn3 + 48 - chn2;
  //Serial.println("Backward");
  usMotor_Status = CCW;
  motorGo(MOTOR_1, usMotor_Status, Speed1);
  motorGo(MOTOR_2, usMotor_Status, Speed2);
  motorGo(MOTOR_3, usMotor_Status, Speed3);
  motorGo(MOTOR_4, usMotor_Status, Speed4);
}
In the same code there is also the I2C function to control the motor drivers. Radio and I2C work in parallel; most of the time, the radio signal for manual control is disabled to allow autonomous control by the Raspberry Pi.
if (i2c_enable == 1) {
  Serial.println("I2c enable");
  // i2c data read from Raspberry Pi
  if (i2cdata[1] == 5) {
    Stopi(i2cdata[2]);        // i2c command to stop the motors
  } else if (i2cdata[1] == 2) {
    Forwardi(i2cdata[2]);
  } else if (i2cdata[1] == 8) {
    Reversei(i2cdata[2]);
  } else if (i2cdata[1] == 10) {
    Lefti(i2cdata[2]);
  } else if (i2cdata[1] == 11) {
    Righti(i2cdata[2]);
  } else if (i2cdata[1] == 4) {
    RotateCWi(i2cdata[2]);
  } else if (i2cdata[1] == 6) {
    RotateCCWi(i2cdata[2]);
  } else if (i2cdata[1] == 7) {
    AngleForwardLeft();
  } else if (i2cdata[1] == 9) {
    AngleForwardRight();
  } else if (i2cdata[1] == 1) {
    AngleReverseLeft();
  } else if (i2cdata[1] == 3) {
    AngleReverseRight();
  } else if (i2cdata[1] == 13) {
    IncreaseSpeed();
  } else if (i2cdata[1] == 12) {
    DecreaseSpeed();
  }
}
Each specific movement, like forward, reverse or spin, has a similar function format as below. Speed1/2/3/4 control each of the four motor speeds, while motorGo(MOTOR_1..4, ...) sets the direction from usMotor_Status = CW (clockwise) or CCW (counter-clockwise).
void Forward()
{
  Speed1 = Speed;
  Speed2 = Speed;
  Speed3 = Speed;
  Speed4 = Speed;
  Serial.println("Forward");
  usMotor_Status = CW;
  motorGo(MOTOR_1, usMotor_Status, Speed1);
  motorGo(MOTOR_2, usMotor_Status, Speed2);
  motorGo(MOTOR_3, usMotor_Status, Speed3);
  motorGo(MOTOR_4, usMotor_Status, Speed4);
}
An I2C receiveEvent() interrupt function gets the signal from the Raspberry Pi and sends the appropriate control signal to the motor driver for the specific movement.
// function that executes whenever data is received from the master;
// the slave device receives a transmission from a master.
void receiveEvent(int howMany) {
  int i;
  i2cnumbytes = howMany;
  for (i = 0; i < howMany; i++) {
    i2cdata[i] = Wire.read();   // read the offset register first
  }
  if (howMany > 1) {            // i2c command read from Raspberry Pi smbus
    if (i2cdata[0] == 1) {
      i2c_enable = 1;
    } else if (i2cdata[0] == 0) {
      i2c_enable = 0;
    } else if (i2cdata[0] == 2) {
      rf_enable = 0;            // i2c command to override manual radio control
    } else if (i2cdata[0] == 3) {
      rf_enable = 1;            // i2c command to enable manual radio control
    } else if (i2cdata[0] == 4) {
      readRF_enable = 1;
    }
  }
}
Crossing from the Arduino to the Raspberry Pi, a Python script generates the I2C signals to the Arduino. There is another chapter later discussing the I2C bus setup. Each Python function, like motor_forward(), generates a block of I2C data containing direction and speed for the Arduino, which pre-processes it in receiveEvent().
Click for Github Code
def motor_forward(speed=10, waitime=0.1, write_reg=1):
    with SMBus(1) as bus:
        bus.write_i2c_block_data(address, write_reg, [8, speed])
        time.sleep(waitime)

def motor_reverse(speed=10, waitime=0.1, write_reg=1):
    with SMBus(1) as bus:
        bus.write_i2c_block_data(address, write_reg, [2, speed])
        time.sleep(waitime)

def motor_stop(deceleration=0, write_reg=1):
    with SMBus(1) as bus:
        bus.write_i2c_block_data(address, write_reg, [5, deceleration])
    if deceleration > 100:
        time.sleep(2)
    elif (deceleration > 10) & (deceleration < 100):
        time.sleep(1)
    else:
        time.sleep(0)
The motor drivers, Arduino and radio receiver are housed inside an old CD-ROM metal casing. Padded foam is cut out and used to secure the boards without shorting them against the metal casing, and without any standoffs or drilling. This is a quick way of securing the boards without much metalwork. Another low-cost solution; another green effort.
Setting up RC manual control is pretty straightforward. Since there are plenty of websites detailing how to set up an FPV camera, I will not cover much here. The video feed is received via the FPV video receiver and displayed on a smartphone as a UVC USB camera. RC control enables manual operation and has an override function over the robot's autonomous capability - well, in case anything goes wrong, haha.
An I2C system bus at a 10 kHz baud rate is used for reliable communication between the Raspberry Pi master and all the Arduino slave subsystems. The RPi I2C defaults to a 100 kHz baud rate, but this caused unreliable communication from time to time; it was reduced to 10 kHz to support multiple slave subsystems. There is a forum discussion about the RPi I2C baud rate:
https://www.raspberrypi.org/forums/viewtopic.php?t=219675
The RPi I2C baud rate is changed by editing /boot/config.txt with the line below:
dtparam=i2c_arm=on,i2c_arm_baudrate=10000
There is one master: the RPi, being the main computer, requests or receives all information from the Arduino slave subsystems using Python. Below is the Python code to access the ultrasonic proximity sensor values from the ultrasonic Arduino.
Click for Github Code
from smbus2 import SMBus

address = 0x8  # Ultrasonic slave address
read_on = 1
read_off = 0
write_reg = 0

def ultrasonic_read(regread=read_on, num=4):
    with SMBus(1) as bus:
        block = bus.read_i2c_block_data(address, regread, num)
    return (block, num)

def ultrasonic_write(write_reg=0, data=[0, 1, 2, 3, 4]):
    with SMBus(1) as bus:
        bus.write_i2c_block_data(address, write_reg, data)
Here is Visual Studio running Python 3.7 over Remote SSH to the RPi 3.
All Arduino I2C slaves use <Wire.h>. Below is the standard I2C slave declaration in the Arduino code.
I2C slave declaration:
Wire.begin(0x9); // Start I2C on Address 0x09
Wire.onReceive(receiveEvent); // Receive message from RPI - write to Slave Arduino
Wire.onRequest(requestEvent); // Sending information back to the RPI
I use the same format for the I2C slave ISR routines throughout all the Arduino slaves, as below, making the code easy to reuse throughout the robot development.
// function that executes whenever data is received from the master
void receiveEvent(int howMany) {
  int i;
  i2cnumbytes = howMany;
  for (i = 0; i < howMany; i++) {
    i2cdata[i] = Wire.read(); // read the offset register first and store to array
  }
  if (howMany > 1) {
    if (i2cdata[0] == 1) {
      // task 1; i2c command received from RPI for task 1
    }
    else if (i2cdata[0] == 0) {
      // task 2; i2c command received from RPI for task 2
    }
  }
}
Addresses for different I2C slave subsystems.
- DC motor Arduino - Address 0x0A
- Ultrasonic Arduino - Address 0x08
- Robot arm Linear servo Arduino - Address 0x09
Multiple slave Arduino subsystems require a bus connector, which is easily made with a small PCB and a few DIL pins. The I2C bus just requires SCL, SDA and GND; I added 5 V as this can help power additional I2C sensors.
Ultrasonic proximity obstacle detection provides additional protection for the robot. Four HC-SR04 ultrasonic sensors are used, with a dedicated Arduino Uno servicing them in a daisy-chained configuration. This configuration allows multiple ultrasonics without their sonic ping reflections interfering with one another. The software overhead is manageable and does not slow things down. A WS2812B 8-LED RGB strip is used as a proximity indicator. The detection data is then fed via the I2C bus to the Raspberry Pi, the main computer, for obstacle-avoidance processing of the robot's movement.
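As an illustration of how a proximity reading could drive the 8-LED indicator, here is a minimal Python sketch of a distance-to-LED-count mapping. On the robot this logic runs on the ultrasonic Arduino; the function name, the 100 cm range limit and the linear mapping are my own assumptions, not taken from the build.

```python
def leds_for_distance(distance_cm, max_cm=100, num_leds=8):
    """Map a proximity reading to how many of the 8 WS2812B LEDs to light.
    Closer obstacle -> more LEDs lit; anything at or beyond max_cm lights none.
    (Range limit and linear scaling are illustrative assumptions.)"""
    if distance_cm >= max_cm:
        return 0
    if distance_cm <= 0:
        return num_leds
    return num_leds - int(distance_cm * num_leds / max_cm)
```

For example, an obstacle at half the range would light half the bar, giving a quick visual read of how close the robot is to a wall.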
Click for Github Code
The ultrasonic proximity Arduino board provides dedicated real-time processing and frees the Raspberry Pi from polling the obstacle proximity data directly. This is achieved via a Python script on the Raspberry Pi: the function ultrasonic_read() triggers a requestEvent() on the Arduino and interprets the reply as a distance for each side.
from smbus2 import SMBus

address = 0x8
read_on = 1
read_off = 0
write_reg = 0

def ultrasonic_read(regread=read_on, num=4):
    with SMBus(1) as bus:
        # Read a block of `num` bytes from the ultrasonic slave
        block = bus.read_i2c_block_data(address, regread, num)
    # Returned value is a list of bytes, one per sensor
    return (block, num)
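To show how the returned block might be consumed, here is a hedged sketch that treats the four bytes as per-side distances in centimetres and flags any side that is too close. The byte ordering and the 30 cm stop threshold are assumptions; check them against the ultrasonic Arduino sketch before relying on them.

```python
def obstacle_check(block, stop_cm=30):
    """Interpret the 4-byte I2C block as [front, rear, left, right]
    distances in cm and flag any side closer than stop_cm.
    (Byte order and threshold are illustrative assumptions.)"""
    sides = ["front", "rear", "left", "right"]
    return {side: dist < stop_cm for side, dist in zip(sides, block)}
```

The main loop could then stop or strafe the robot whenever any flag comes back True.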
The robot has a delivery box that is safe and secure from prying eyes and hands, and from the risk of virus or pathogen fallout contaminating items during transport. The enclosure has two compartments built from rigid polystyrene out of TV packing materials: one measuring 28 cm x 20 cm and the other 28 cm x 10 cm. That should be reasonable for retrieving delivery items - well, quite reasonable for me.
Opening and closing of the compartment box is controlled by a NodeMCU ESP8266. Assume this scenario: the delivery man first connects to the open Wi-Fi AP 'DeliverySecured' (no password). The requester sends the delivery man an IP address to access the web server application, and he keys in the IP address given by the customer or requester.
No router is needed, as this is a direct Wi-Fi access point. Only an authorized person is allowed to open and close the box, ensuring secure placement of the delivery item.
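The ESP8266 firmware itself is not listed here, but the access-control behaviour it implements can be sketched as a tiny state machine. Everything below (class name, token check, command strings) is an illustrative assumption of how "only an authorized person may open and close" might be enforced, written in Python for clarity rather than as the actual ESP8266 code.

```python
class DeliveryBox:
    """Sketch of the delivery box open/close logic behind the web app.
    A shared token stands in for whatever authorization the real
    firmware uses (an assumption, not the author's implementation)."""

    def __init__(self, token):
        self.token = token
        self.is_open = False

    def handle(self, command, token):
        # Reject any request that does not carry the authorized token
        if token != self.token:
            return "denied"
        if command == "open":
            self.is_open = True
        elif command == "close":
            self.is_open = False
        return "open" if self.is_open else "closed"
```

A request with the wrong token is simply denied, so the lid state only ever changes for the authorized requester.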
Below is a screenshot phone view of the web control application.
Click for Github Code
The Delivery Secured Box demonstration is successful. Watch the video till the end - you will be surprised.
Part 6. Sanitizer Mist Applicator Using a USB Humidifier

I use a USB humidifier as the sanitizer mist applicator. Earlier I was planning a DC motor pump with sanitization solution injected through a constricting nozzle. However, that spraying method is too harsh and wastes a lot of sanitizing solution: large droplets spray everywhere (mainly falling to the floor) instead of landing on the targeted door knob or handle. A USB humidifier is easily purchased off the shelf from any general store or beauty shop.
According to the CDC, the only way to truly kill viruses is to use a bleach dilution or a 70%-or-higher alcohol-based solution. Disinfecting and cleaning are not the same: cleaning helps remove germs and dirt from a surface, but disinfecting actually kills germs. The solution is easily absorbed by the wick and pushed through the ultrasonic mist plate, which ejects atomized sanitizing liquid 15-20 cm high in a 30-degree cone.
The generic USB humidifier is activated with a finger-touch switch. We could purchase a dedicated ultrasonic mist component, but that adds cost and leaves us without the mechanical housing or the liquid applicator; using a general USB humidifier simplifies development. Now it is time to dissect a $5 USB humidifier! This humidifier atomizes 50-80 ml per hour - suitable for our robot sanitization application.
Carefully remove the screws and gently lift out the circuit board so as not to yank the delicate wires. Normally the circuit board is held in position by the USB connector even with the screws removed.
I modified the circuit board to allow activation from the Arduino. After much tweaking, a small soldering job wired up three pad points - 'GND', '5V' and 'touch' - for connection to the Arduino. Finding '5V' and 'GND' is easy: just trace the path from the USB pinout. For the 'touch' pad, we need to explore a bit from the finger-touch IC.
On the thin circular PCB, we can easily make out the components: two ICs and a couple of capacitors and resistors. One IC carries a marking identifying it as a 2-channel touch control IC, the SGL8022K. The other, unlabeled IC is the MHz pulse generator for the ultrasonic plate; we can skip this one.
The datasheet https://datasheet.lcsc.com/szlcsc/1912111437_Sigma-Micro-SGL8022K_C190038.pdf helps in the identification of the signal pinout.
Here are the simulated finger-touch pulses using Arduino GPIO: changing from high impedance to low impedance, then a series of highs and lows, simulates the finger-touch pulses.
void mist_on() {
  pinMode(mister, INPUT);       // set Mister gpio as high-impedance input
  delay(3000);
  pinMode(mister, OUTPUT);      // set Mister gpio as low-impedance output
  digitalWrite(mister, LOW);
  delay(3000);
  for (int i = 0; i <= 100; i++) {
    digitalWrite(mister, HIGH); // generate simulated finger-touch pulses
    delay(1);
    digitalWrite(mister, LOW);
    delay(1);
  }
  digitalWrite(mister, HIGH);   // stay high to hold the last touch pulse
}
An I2C command is set up to call this function from the Raspberry Pi via Python's smbus2:
if (i2cdata[0] == 7) {  // i2c command to turn on the mist
  mist_on();
  i2cdata[0] = 0;       // clear data
}
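On the Raspberry Pi side, the matching smbus2 call would write register byte 7 so that it arrives as i2cdata[0] in the Arduino check above. This is a hedged sketch: the function name is mine, the slave address is an assumption (use the address of whichever Arduino drives the mister), and the bus object is passed in so the logic can be exercised without hardware (on the robot you would pass an smbus2 SMBus instance).

```python
def mist_on_cmd(bus, address=0x9):
    """Send the 'mist on' command: register byte 7 plus one padding byte,
    so the Arduino sees i2cdata[0] == 7. Address 0x9 is an assumption;
    match it to the Arduino that hosts mist_on()."""
    bus.write_i2c_block_data(address, 7, [0])
```

With real hardware the call is simply `with SMBus(1) as bus: mist_on_cmd(bus)`.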
The connection is successful, and the USB humidifier is now controllable by the Arduino. Next up: adding the mounting to the linear servo.
Part 7. Linear Servo Motor Hacked Up from a Very Old Bubble Jet Printer

To allow lift buttons to be pressed, a linear servo motor actuator is used: a two-axis linear servo actuator built from a very old bubble jet printer's motor and linear shaft assembly. The cost of this recycled printer is zero, minus the wires, Arduino and motor driver.
Do take note that a bubble jet printer carriage is meant for horizontal translational movement, so this hack pushes it to the other extreme: vertical movement against gravity. The power needed going up is higher than going down. There is no gearing on the original motor, so I was initially worried that the carrier with extra load would slowly creep downward.
I found a long ribbon cable (somehow the original printer cable is missing; I guess it was put to good use in some other project) and attached it to the encoder sensor. I also added an L-bracket as a mount for vertical assembly onto the main robot.
To my surprise, once the motor is energized and in brake mode it can hold some load - for example a medium-size servo - even with the motor totally powered off and without any feedback loop.
I continued hacking at the encoder sensor. It carries a neat Agilent part number, 9986, that is no longer manufactured. This printer is very old!
After hours of searching Google and the Arduino forum https://forum.arduino.cc/index.php?topic=17892.0, I did not manage to find the exact part pinout, but I found a galore of encoder information at https://playground.arduino.cc/Main/RotaryEncoders/ that I could use later.
I based my guesstimate on another part number with the same pin configuration, and found this: http://pdf.datasheetcatalog.com/datasheet2/a/0a46za47kk0rg2x0izood71i9apy.pdf
This PDF points to a block diagram, and the clue is that as long as I measure 2.5 kOhm of resistance across two channels, I can identify all the channels. I am getting close!
I soldered four wires onto the optical sensor for pins 1, 2, 3 and 4, and added a 1 kOhm resistor for the LED. And ta-da, I got readings from the neat optical sensor on the first run. The optical plastic slit is slightly worn out but surprisingly still usable without jitter.
Below is the Arduino serial monitor output with the optical sensor attached to interrupt pin 2.
With the optical encoder sensor working, I can now add motor feedback positioning control. With the sensor sorted out, it is time to hook up all the remaining circuits: servo and DC motor driver.
With the feedback loop added, the motor holds the carrier at the target position very well. Even with heavier load disturbances and forcible pushes or pulls on the carrier, the positional feedback control corrects itself and the carrier stays put without budging. After several rounds of testing, both the power (via PWM) and the feedback loop were tweaked to give minimal oscillation with good holding load, making it usable and fast.
Below are charts of the feedback loop, which achieves very precise positioning within +-10 counts of the target encoder value. After some tweaking of the motor PWM drive, it has a good positioning feedback loop. The tests below, from start to mid-session and single steps, show minimal oscillation around the target position. Stepping down shows additional oscillation and is slightly slower due to gravity acting on the actuator, but this is not noticeable.
Here is a quick video of the linear servo.
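The positional feedback described above can be sketched as a simple proportional controller. The gain, the PWM clamp and the toy plant model below are illustrative assumptions, not the values used on the robot; the point is only to show how the carrier converges to within the +-10-count window.

```python
def p_control_step(position, target, kp=0.2, max_pwm=50):
    """One proportional-control step: PWM command proportional to the
    encoder error, clamped to the driver's range. Gain and clamp are
    illustrative assumptions, not the robot's tuned values."""
    error = target - position
    return max(-max_pwm, min(max_pwm, kp * error))

def simulate(target=500, steps=200):
    """Toy plant: the carrier position moves by the PWM command each
    tick. A real carrier has inertia and gravity; this is only a sketch."""
    position = 0.0
    for _ in range(steps):
        position += p_control_step(position, target)
    return position
```

Running the toy loop shows the position settling at the target; on the real actuator the same structure, with tuned gains, gives the +-10-count accuracy reported above.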
This is a swing arm built simply from a servo: attach a servo horn onto the aluminium L-bracket, and voila, a precise swing arm under robotic control.
Once assembly is complete, the next step is to add the I2C slave driver routine so the actuator can be hooked onto the Raspberry Pi's master I2C system bus. For this robot arm linear actuator, I use address 0x9. As with the other subsystems, this I2C routine services interrupts from I2C bus requests, giving the Raspberry Pi control.
I2C slave declaration:
Wire.begin(0x9); // Start I2C on Address 0x09
Wire.onReceive(receiveEvent); // Receive message from RPI - write to Slave Arduino
Wire.onRequest(requestEvent); // Sending information back to the RPI
I use the same I2C slave ISR format as below, catered for the robot arm:
// function that executes whenever data is received from the master;
// the slave device receives a transmission from a master.
void receiveEvent(int howMany) {
  int i;
  i2cnumbytes = howMany;
  for (i = 0; i < howMany; i++) {
    i2cdata[i] = Wire.read();   // read the offset register first
    if (howMany > 1) {
      Serial.print(i2cdata[i]);
      Serial.print(",");
    }
  }
  if (howMany > 1) {
    Serial.print(" : WriteReg=[");
    Serial.print(i2cdata[0]);
    Serial.print("] :Num:");
    Serial.println(howMany);
    if (i2cdata[0] == 4) {
      i2c_arm_enable = 1;       // i2c command to enable the arm from RPI
    }
    else if (i2cdata[0] == 0) {
      i2c_arm_enable = 0;       // i2c command to disable the arm from RPI
    }
  }
}
For the master RPi, the Python script is as below:
from smbus2 import SMBus
import time

address = 0x9
read_on = 1
read_off = 0
write_reg = 4   # 2 = ledpwm accessible for writing
                # 3 = rgbled accessible for writing
                # 4 = robot arm accessible for writing

def robotarm(data=[100]):  # i2c write to the Arduino with the target position
    with SMBus(1) as bus:
        # Write the position block to the robot arm register
        bus.write_i2c_block_data(address, write_reg, data)
    time.sleep(0.01)
Part 8: Color Sequence Detection for Indoor Robotic Navigation

Indoor navigation to a destination can be accomplished using a camera, a humble Raspberry Pi and the powerful OpenCV. Earlier I was thinking about using a LIDAR sensor, but to my surprise it is quite costly and would easily surpass the whole project cost, so I ended up using a webcam as the navigation sensor. A simple method like color detection via image processing sometimes has issues with other colors present in the indoor environment; orange and blue, for instance, are plentiful in a room or walkway. When detecting orange, the table also shows a similar color (shown as blue contours), which makes indoor tracking difficult. I therefore use two stages for navigation: first, color detection with X, Y coordinates; second, color sequence detection, which avoids false triggering in a real-world environment.
By carefully programming a color-coded sequence, navigation becomes more precise and avoids errors along the way. This is accomplished in OpenCV. First, a non-glossy colored paper is characterized for its proper hue, saturation and value. Below are the identified HSV ranges. Since the robot carries its own powerful LED lighting, the HSV stays pretty much within range.
if whatcolor == pink:        # detect pink color
    lowerBound = np.array([112, 164, 19])
    upperBound = np.array([176, 255, 255])
elif whatcolor == yellow:    # detect yellow color
    lowerBound = np.array([24, 192, 44])
    upperBound = np.array([35, 255, 255])
elif whatcolor == green:     # detect green color
    lowerBound = np.array([36, 135, 128])
    upperBound = np.array([61, 255, 255])
elif whatcolor == blue:      # detect blue color
    # lowerBound = np.array([62, 176, 125])  # daytime
    lowerBound = np.array([35, 106, 134])    # nighttime
    upperBound = np.array([109, 255, 255])
elif whatcolor == orange:    # detect orange color
    # lowerBound = np.array([12, 192, 174])  # daytime
    lowerBound = np.array([10, 104, 174])    # nighttime
    upperBound = np.array([24, 255, 255])
Once everything is set, detecting orange then blue in sequence using the Y1 and Y2 coordinates gives a matching code. A range of 60 pixels is used as tolerance between the first detection box of orange at (x1, y1) and the second detection box of blue at (x2, y2), the boxes being separated by the height h1. As long as y2 - y1 is within 60 pixels, it is regarded as a color match - simple cartesian arithmetic.
Example of color sequence detection in OpenCV amid other interfering colors.
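The y2 - y1 tolerance test described above fits in a small helper. This is a hedged sketch; the function name and the (x, y) tuple format are my own, and the 60-pixel tolerance comes from the text:

```python
def sequence_match(box1, box2, tol=60):
    """Return True when the orange box centre (x1, y1) and the blue box
    centre (x2, y2) line up vertically within tol pixels, i.e. the two
    colors form a valid sequence rather than a coincidental detection."""
    (_, y1), (_, y2) = box1, box2
    return abs(y2 - y1) <= tol
```

A stray patch of blue elsewhere in the frame will sit far from the orange box's Y coordinate and be rejected.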
Part 9: QR Code and OCR Detection Enhancement for Indoor Navigation

Color sequencing allows indoor navigation from a distance; however, to precisely identify an exact location - for example door 1, door 2, the exit or the lift - I use QR code detection. QR codes require higher image magnification before a successful readout is possible, so I use the three-stage navigation and localization below. This eliminates the need for a higher-resolution camera or even a telescopic lens.
Color -> Color sequence -> QR code or OCR for lift number
Far -> Near -> Very near
Since a color can be identified easily from a distance, a color plate and a QR code are pasted side by side on the specific door. The robot approaches the intended color using color detection followed by color sequencing. As it gets closer, its visual frame grows and other signage, such as the QR code, becomes distinguishable. Once it is near enough that the image has enough pixels for QR code reading, the robot activates QR detection and recognizes the location as door 1, door 2 or the exit. For specific locations such as the lift, this is further assisted by optical character recognition, which identifies the lift numbers and lets the robot arm select which floor button to press.
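The far -> near -> very near hand-off can be sketched as a small state machine. The stage names and transition conditions below are illustrative assumptions about how the three detectors might be sequenced, not the author's actual control code:

```python
def next_stage(stage, color_found, sequence_found, qr_readable):
    """Advance through the three localization stages described above:
    'color' (far) -> 'sequence' (near) -> 'qr' (very near) -> 'arrived'.
    If the current stage's detector has no hit, the robot keeps
    approaching in the same stage. (Names and flow are assumptions.)"""
    if stage == "color" and color_found:
        return "sequence"
    if stage == "sequence" and sequence_found:
        return "qr"
    if stage == "qr" and qr_readable:
        return "arrived"
    return stage
```

Each camera frame feeds its detector results into this function, so the robot only spends CPU on QR decoding once the cheaper color stages have brought it close enough.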
Using the same camera and the Python library pyzbar, a few lines of code can detect the QR code label. The Raspberry Pi 3B does not have the fast processing of a desktop CPU to train machine learning on indoor scenes, so using readily available libraries helps achieve good localization for indoor navigation at a usable frame rate within its processing power.
A QR code printout is easily generated with an online QR generator, printed at 700 pixels for a clear, sharp distinction. Below is the readqr() function, which can be called by the main code for image analysis.
def readqr():
    global qx, qy, qw, qh, barcodeData
    barcodes = pyzbar.decode(gray)
    for barcode in barcodes:
        (x, y, w, h) = barcode.rect
        qx = x + round(w / 2)
        qy = y + round(h / 2)
        cv2.rectangle(img, (x, y), (x + w, y + h), (0, 0, 255), 2)
        cv2.rectangle(img, (qx - 5, qy - 5), (qx + 5, qy + 5), (0, 255, 0), 2)
        barcodeData = barcode.data.decode("utf-8")
        barcodeType = barcode.type
        text = "{} ({})".format(barcodeData, barcodeType)
        cv2.putText(img, text, (x, y - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 2)
    return (qx, qy, barcodeData)
Below is the code for testing QR code and OCR numeric recognition.
if __name__ == "__main__":
    camera_setup()
    while 1:
        camerasearch(graphic_on)
        readqr()
        showInMovedWindow('QRCode', img, 10, 50, graphic_on)
        k = cv2.waitKey(1)
        if k == ord('x'):
            cv2.destroyWindow("QRCode")
            break
        elif k != -1 and k != ord('s'):
            print(chr(k))
            charsearch(k)
            showInMovedWindow('Char', img, 710, 50, graphic_on)
The robot also attempts autonomous lift travel inside the apartment's lifts, which requires pressing numbers to go to different levels. I use the pytesseract library for this OCR detection. The default pytesseract settings are not able to recognize a single digit; after much experimenting, single digits are recognized using '--psm 13'.
boxes= pytesseract.image_to_boxes(gray,config='--psm 13')
A simulated lift-number OCR detection was carried out using printed lift numbers. This lets the Python code navigate even through the apartment lift: for example, if the robot is asked to move to level 2, it identifies the numeral 2 on the lift panel (after preprocessing to separate single digits from double digits such as 12 or 22) and activates the robot arm.
Here the code is asked to detect only the numeral '2'. Note that the '2' inside 12 or 22 is detected too, which is why the single/double-digit preprocessing is required.
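That preprocessing could look something like the following sketch (my own illustration, not the project's exact code). pytesseract.image_to_boxes() returns one line per character in the form "char x1 y1 x2 y2 page"; merging horizontally adjacent digit boxes into whole numbers keeps the '2' inside '12' or '22' from being mistaken for the target floor. The gap threshold is an assumption:

```python
# Sketch of the single/double-digit preprocessing. Adjacent digit boxes from
# pytesseract.image_to_boxes() are merged into a number, so the target floor
# only matches a number that is exactly equal to it.
def find_floor(boxes_text, target, gap=15):
    """Return (x1, y1) of the number matching `target`, or None."""
    chars = []
    for line in boxes_text.strip().splitlines():
        c, x1, y1, x2, y2, _ = line.split()
        if c.isdigit():
            chars.append((int(x1), int(y1), int(x2), c))
    chars.sort()                                     # left-to-right
    groups = []
    for x1, y1, x2, c in chars:
        if groups and x1 - groups[-1][2] <= gap:
            gx1, gy1, _, gtext = groups[-1]
            groups[-1] = (gx1, gy1, x2, gtext + c)   # extend current number
        else:
            groups.append((x1, y1, x2, c))           # start a new number
    for x1, y1, _, text in groups:
        if text == target:
            return (x1, y1)
    return None

# Boxes for a panel showing "12" and "2" (coordinates are made up):
boxes = "1 10 10 25 40 0\n2 30 10 45 40 0\n2 100 10 115 40 0"
print(find_floor(boxes, "2"))    # -> (100, 10): the lone '2', not the one in '12'
```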
Note that installing pytesseract on the Raspberry Pi pulls in the older tesseract version 3.04.01, which does not work with the tesseract Python settings we used on Windows. Instead, build and install tesseract directly from its GitHub repository; there are instructions for doing so. Also remember to install the training data, or you will hit a language-data error. The downside of running pytesseract on the RPi is the frame rate: it takes about 5 seconds for a text to be recognized.
https://tesseract-ocr.github.io/tessdoc/Compiling-%E2%80%93-GitInstallation
Below is the video of detection based on QR codes and OCR for lift numbers. This Python code is called as functions from the main robot application.
With the three-phase navigation and localization completed, I can now integrate all the components and code into the main robot system to navigate the robot indoors through the apartment building and enable the Sanitization-on-Demand (S.O.D) application. All the code is written in a modular format, allowing easy integration of the various robot functions to create different features and applications.
Code Functionality Overview

Autonomous code:

main_autonomousA.py
Fully autonomous 5-stage Sanitizing on Demand for door knobs. Can be overridden by radio control.

main_autonomousB.py
Fully autonomous 6-stage Sanitizing on Demand for lift buttons. Can be overridden by radio control.
-----------------------------------------------------------------------------------------------------------
Arduino slave modules receiving I2C commands from the Raspberry Pi:

360Wheel_Radio_Light_i2C_ver8.ino
Arduino Uno code for the 4 DC motor driver controls and radio-signal decoding.

robotarm_i2c_mist_pump_10.ino
Arduino Uno code for the linear servo, angular arm servo, and mist driver.

ultrasonic_i2C_rgb_pump.ver5.ino
Arduino Uno code for ultrasonic proximity detection, LED driver, water pump driver, powered over the I2C bus.

DeliveryBox_WiFiAccessPoint6.ino
Arduino NodeMCU code for the SecuredDelivery box with web apps.
------------------------------------------------------------------------------------------------------------
Raspberry Pi Python code to control the Arduino slaves:

i2c_motorcontrol.py
Uses smbus to control motor speed and duration, and reads the radio switches to decide whether to override between autonomous and manual control. Returns the radio switch states.

i2cread_ultrasonic.py
Uses smbus to read ultrasonic proximity-sensing data. Returns proximity data for obstacle avoidance.

i2c_robotarm.py
Uses smbus to control the linear servo, angular servo, and mist activation. Returns the servo position.

i2c_lightpump.py
Uses smbus to control LED brightness and water-pump flow rate. Returns nil.

Raspberry Pi Python code to perform image processing:

motiondetection5.py
Launches the robot when motion detection is triggered. Returns a success flag for detected movement.

navigationlocation5.py
Uses QR codes for near navigation and Optical Character Recognition for lift-button identification and control. Returns the x, y coordinates, plus area, height and width, of the detected QR code and lift-number position.

colorsequencedetect9.py
Uses color and color sequence on far and near images for navigation. Returns the x, y coordinates, plus area, height and width, of the detected image position.
Summary: The Sanitization-on-Demand robot with DeliverySecured capability was built and tested in numerous test areas. It achieves multiple sanitizing applications in the apartment common areas, covering door access, lift buttons, and other spots that need frequent or in-between sanitization, complementing the stratified building management's cleaning resources. Its DeliverySecured function acts as an avatar for residents, reducing external contact during delivery and transportation of items within the building's common areas. The combination of these features makes the S.o.D robot with DeliverySecured a good complement during this pandemic and in the new norm.