Due to the COVID-19 pandemic we are forced to find solutions that help us sanitize affected areas. With no vaccine available so far, sanitizing remains our main defence, and it is risky to employ human beings to perform sanitizing/disinfecting. So here I am building a bi-copter based, 4-wheeled robot which can hover or move like a normal robot as needed. I chose this design for its low cost, high maneuverability and ease of construction. Most existing sanitizing robots are wheeled only, and their wheels can themselves pick up contamination while sanitizing an affected area; in other words, such a robot can become a COVID-19 transmitting machine rather than a disinfecting one. So I combined a bi-copter and a wheeled robot and named it
"BI-COBOT".
The Design: The robot uses a 4-wheeled chassis (it will be 3D printed, making it light-weight and strong) which is also designed to fit the two 1000 KV brushless motors (two more brushless motors can be installed underneath the bi-copter if more thrust is needed). It will be controlled by the flight controller plus a Jetson Nano in Autonomous Mode, and by a radio controller in manual operation. The wheels are only for emergencies; the whole operation will be done in flight mode, since flying lets the robot cover the entire floor area as well as the walls of the room, which is not possible for a ground-based robot.
The working of the robot is divided into two parts:
- Working Of The BI-COBOT - The BI-COBOT will work similarly to a quadcopter. The radio controller has Throttle for upward/downward motion, Yaw for rotating in place, Roll for left/right motion and Pitch for forward/backward motion. In the BI-COBOT, however, the Roll and Pitch movements will be controlled by the orientation/angle of the two brushless motors via two dedicated servo motors. All of this can be set up in the KK 2.1.5 flight controller software (it also has a configuration for 4 BLDC motors in a bi-copter layout if more thrust is needed). A toggle switch on the radio will switch the UV light ON and OFF, and the pilot will move the bi-copter around as required. Six ultrasonic sensors will be mounted on the front, back, top, bottom, left and right sides to keep the bot from colliding with walls and other objects. This is the Manual Mode.
- The BI-COBOT also has linear motion capability. It works in the following way: when both BLDC motors are tilted forward via Servo1 and Servo2, the robot moves forward, and when they are tilted backward it moves backward. A third servo motor steers the two front wheels left and right, giving the BI-COBOT car-like linear motion. An interesting consequence: if we change the orientation of the propellers, the BI-COBOT can also move linearly along vertical walls, in case that is required.
- For Autonomous Mode we have to design a custom flight controller (for this I will use Joop Brokking's work; the link is in the Work Attribution section) which can be programmed to trigger a microcontroller (such as an Arduino, which controls the UV light ON/OFF via relays) and also control the movement of the bi-copter.
- Working Of The UV Light - The UV lights (with aluminium foil as a reflector) are mounted on top, along with a servo motor that changes the angle of reflection and therefore the area being sanitized. This is also helpful when someone accidentally enters the room.
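To make the "Roll and Pitch via motor tilt" idea concrete, here is a minimal Python sketch of how normalized stick inputs could map to the two BLDC throttles and the two tilt servos. The function name and the 0.5 gains are my placeholders, not tuned values; the real mixing happens inside the KK 2.1.5 / custom flight controller:

```python
def mix_bicopter(throttle, yaw, roll, pitch):
    """Map normalized stick inputs to bi-copter outputs (a sketch).

    throttle is 0..1; yaw, roll, pitch are -1..1.
    Roll uses differential thrust between the two motors;
    pitch tilts both rotors the same way (Servo1 and Servo2);
    yaw tilts them in opposite directions.
    """
    def clamp(v, lo, hi):
        return max(lo, min(hi, v))

    left_motor = clamp(throttle + 0.5 * roll, 0.0, 1.0)
    right_motor = clamp(throttle - 0.5 * roll, 0.0, 1.0)
    left_tilt = clamp(pitch + 0.5 * yaw, -1.0, 1.0)   # Servo1 angle
    right_tilt = clamp(pitch - 0.5 * yaw, -1.0, 1.0)  # Servo2 angle
    return left_motor, right_motor, left_tilt, right_tilt

# hover: equal thrust on both motors, rotors upright
print(mix_bicopter(0.5, 0, 0, 0))  # (0.5, 0.5, 0.0, 0.0)
```

Forward flight is then just a positive pitch input, which tilts both rotors forward together, exactly the car-like forward motion described above.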
The reason for using aluminium foil as a reflector is that UV light (commercially available UV lamps use UVC, which has a wavelength of 100-280 nm) is absorbed to an extent by a mirror: because of the amorphous nature of glass, the silica (silicon dioxide) particles are arranged in an irregular pattern at the atomic level, which causes unnecessary scattering and absorption of short-wavelength light, so only about 60-70% gets reflected. Aluminium foil gives better results than a mirror.
Working Of The Safety Feature: We also have to keep in mind that exposure to UVC light is harmful to human beings and other living beings since, as discussed above, it also has the capacity to destroy human cells. So I designed an A.I.-based smart person detector: when it detects a human being, the Jetson Nano sends a HIGH signal to the microcontroller (Arduino Pro Mini), which then shuts down the power to the UV light with the help of a relay. (We assume a person will enter the room through the door, not a window or other opening; custom object tracking, in this case of the door, will be achieved with the NVIDIA Transfer Learning Toolkit.)
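As a sketch of the Jetson-to-Arduino handshake described above, the Jetson could encode the detector's decision as a single byte and push it over USB serial with pyserial. The one-byte protocol, function name, and port name are my assumptions, not a fixed design:

```python
def uv_relay_command(person_detected: bool) -> bytes:
    """Encode the detector result as a one-byte serial command.

    b'0' tells the Arduino Pro Mini to open the relay (UV OFF);
    b'1' allows the UV light to stay ON. Protocol is hypothetical.
    """
    return b'0' if person_detected else b'1'

# On the Jetson this byte would be written to the Arduino, e.g.:
#   import serial
#   link = serial.Serial('/dev/ttyUSB0', 9600)  # port name is an assumption
#   link.write(uv_relay_command(True))          # person seen -> UV OFF
print(uv_relay_command(True))   # b'0'
print(uv_relay_command(False))  # b'1'
```

On the Arduino side, the sketch would simply read the byte and drive the relay pin HIGH or LOW accordingly.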
Steps to track people on camera:
- Install OpenCV
OpenCV is the open-source computer vision library, and it's super powerful. We'll install it through Anaconda; I assume you have already installed Anaconda for Python 3.x. Add the following packages to Anaconda: opencv, numpy, matplotlib.
conda install opencv numpy matplotlib
- Accessing your webcam
Reading from your webcam is easy with OpenCV; just write the following script and run it with Python:
import numpy as np
import cv2

cv2.startWindowThread()
cap = cv2.VideoCapture(0)

while True:
    # read a frame from the webcam
    ret, frame = cap.read()
    # display the frame
    cv2.imshow('frame', frame)
    # quit when 'q' is pressed
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
cv2.waitKey(1)
You should see a window pop up with the image from your webcam.
Now let's try and manipulate the video stream. The video is read frame by frame, so we can edit the frame before displaying it. Add the following lines before displaying the frame:
    # convert to greyscale (note: OpenCV frames are BGR, not RGB)
    frame = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # apply a threshold: pixels brighter than 80 become white, the rest black
    ret, frame = cv2.threshold(frame, 80, 255, cv2.THRESH_BINARY)
- People detection
OpenCV features an implementation of a very fast human detection method called HOG (Histograms of Oriented Gradients). This method is trained to detect pedestrians: humans who are mostly standing up and fully visible.
# import the necessary packages
import numpy as np
import cv2

# initialize the HOG descriptor/person detector
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cv2.startWindowThread()

# open webcam video stream
cap = cv2.VideoCapture(0)

# the output will be written to output.avi
# (the fourcc code must be 4 characters; MJPG suits .avi files)
out = cv2.VideoWriter(
    'output.avi',
    cv2.VideoWriter_fourcc(*'MJPG'),
    1.5,
    (640, 480))

while True:
    # capture frame-by-frame
    ret, frame = cap.read()
    # resize for faster detection
    frame = cv2.resize(frame, (640, 480))
    # use a greyscale picture, also for faster detection (frames are BGR)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # detect people in the greyscale image; boxes are (x, y, w, h)
    boxes, weights = hog.detectMultiScale(gray, winStride=(8, 8))
    boxes = np.array([[x, y, x + w, y + h] for (x, y, w, h) in boxes])

    # draw the bounding boxes
    for (xA, yA, xB, yB) in boxes:
        cv2.rectangle(frame, (xA, yA), (xB, yB), (0, 255, 0), 2)

    # write the output video
    out.write(frame.astype('uint8'))
    # display the resulting frame
    cv2.imshow('frame', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
out.release()
cv2.destroyAllWindows()
cv2.waitKey(1)
With this code the BI-COBOT will be able to turn OFF the UV lights whenever a person is detected while disinfecting the area.
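To connect the detection loop to the safety cut-off, one option (a sketch; the function name and frame threshold are mine) is a small helper with hysteresis, so that a single noisy frame does not flicker the relay. Each iteration of the loop above would append `len(boxes)` to the history before deciding the relay state:

```python
def uv_should_be_on(num_people_per_frame, clear_frames_required=10):
    """Decide the UV state from a history of per-frame person counts.

    The light may only be ON after `clear_frames_required`
    consecutive frames with zero detections -- simple hysteresis so
    one missed or spurious detection doesn't flicker the relay.
    """
    clear = 0
    for count in num_people_per_frame:
        clear = clear + 1 if count == 0 else 0
    return clear >= clear_frames_required

# a person was seen 3 frames ago -> keep UV off
print(uv_should_be_on([0, 0, 1, 0, 0, 0]))  # False
# ten clean frames since the last detection -> UV may turn back on
print(uv_should_be_on([1] + [0] * 10))      # True
```

The `clear_frames_required` value trades responsiveness against safety; erring on the high side is cheap, since the UV light only stays off a fraction of a second longer.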