Problem:
Meet Raghu, an employee of a food processing company. His job requires him to travel to his branch office, 20 km from his house, by bus. After the outbreak of the COVID-19 virus, Raghu and his family had a tough time making ends meet, as Raghu could not work from home due to the nature of his fieldwork. A few days passed, and it was time for Raghu to get back to work, setting aside the insecurities caused by the pandemic. On his way to the office, he could barely maintain social distancing because the buses were crowded. He tended to forget to use a glove or handkerchief before touching surfaces such as doorknobs and ATM buttons. A few days later, a colleague who worked side by side with him in the office tested positive, and eventually, so did Raghu. These observations make it clear that it is high time for a solution that can eliminate these risks.
Solution:
Our Personal Protective Body Camera is an exciting project we would like to introduce, and it will come in handy for anyone during this pandemic. Think of this pen-sized device as your companion: a small, portable camera that hangs on your shirt pocket or collar, always keeping an eye on people, equipment, and the environment. The camera identifies the objects the user interacts with using a PyTorch object recognition model compiled with Neo-DLR (Deep Learning Runtime). The device generates log data that includes the user's location coordinates, tracing the route the user takes so he stays aware of actions that may bring him into contact with an infected person, and it alerts him to reconsider those actions by vibrating. This edge computing approach makes it a unique device capable of tracing all the contacts, including equipment, that the user made along his route.
Let's look at an alternate situation where Raghu is equipped with our Personal Protective Body Camera. After the outbreak of the COVID-19 virus, Raghu and his family had a tough time making ends meet, as Raghu could not work from home due to the nature of his fieldwork. A few days passed, and it was time for Raghu to get back to work. This time, he was well aware of his surroundings: the Personal Protective Body Camera alerted him when he was about to touch a doorknob without any protection. He avoided the routes that increased his chances of coming into contact with an infected person, and he kept his distance from his colleague, Mark. Raghu recommended the Personal Protective Body Camera to Mark. Now Mark is equipped with the device too, and his routes and interactions are traced so that others can keep their distance from him. This solution has protected not only Raghu, but also the family and friends of both Raghu and Mark, and all the strangers they have passed by.
Even if a user of the Personal Protective Body Camera tests positive for COVID-19, the government can still use the log files to control the spread from that user.
Log details from the user's camera:
Each entry contains both the prediction result and the GPS coordinates.
DEBUG:root:Inference result: Timestamp 15/07/2020 10:40:26, Latitude: 12.6621425,longitudes: 77.8171855, laptop, laptop computer
DEBUG:root:Inference result: Timestamp 15/07/2020 10:40:53, Latitude: 12.662142,longitudes: 77.817181, table lamp
DEBUG:root:Inference result: Timestamp 15/07/2020 11:22:40, Latitude: 12.662167833,longitudes: 77.817264167, wing
DEBUG:root:Inference result: Timestamp 15/07/2020 11:31:31, Latitude: 12.662209901,longitudes: 77.817312094, car mirror
DEBUG:root:Inference result: Timestamp 15/07/2020 11:31:41, Latitude: 12.662212462,longitudes: 77.817314872, home theater, home theatre
DEBUG:root:Inference result: Timestamp 15/07/2020 11:35:57, Latitude: 12.662194667,longitudes: 77.817324333, car mirror
DEBUG:root:Inference result: Timestamp 15/07/2020 11:36:16, Latitude: 12.6621955,longitudes: 77.817324667, projector
DEBUG:root:Inference result: Timestamp 15/07/2020 11:38:29, Latitude: 12.662182833,longitudes: 77.817338167, joystick
DEBUG:root:Inference result: Timestamp 15/07/2020 11:38:47, Latitude: 12.662179167,longitudes: 77.8173395, dishwasher, dish washer, dishwashing machine
DEBUG:root:Inference result: Timestamp 15/07/2020 12:55:26, Latitude: 12.662128333,longitudes: 77.817477, jack-o'-lantern
DEBUG:root:Inference result: Timestamp 15/07/2020 12:55:36, Latitude: 12.6621275,longitudes: 77.817473833, jack-o'-lantern
DEBUG:root:Inference result: Timestamp 15/07/2020 13:20:37, Latitude: 12.662152021,longitudes: 77.817301373, toilet seat
DEBUG:root:Inference result: Timestamp 15/07/2020 13:20:56, Latitude: 12.662143454,longitudes: 77.81727728, toilet seat
DEBUG:root:Inference result: Timestamp 15/07/2020 13:30:44, Latitude: 12.662135,longitudes: 77.8174855, sliding door
DEBUG:root:Inference result: Timestamp 15/07/2020 13:31:05, Latitude: 12.6621305,longitudes: 77.817497333, sliding door
DEBUG:root:Inference result: Timestamp 15/07/2020 13:31:23, Latitude: 12.662135667,longitudes: 77.817484, sliding door
DEBUG:root:Inference result: Timestamp 15/07/2020 14:43:08, Latitude: 12.662271973,longitudes: 77.817298504, cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM
DEBUG:root:Inference result: Timestamp 15/07/2020 14:47:05, Latitude: 12.662230108,longitudes: 77.81729993, cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM
DEBUG:root:Inference result: Timestamp 15/07/2020 14:53:31, Latitude: 12.6621635,longitudes: 77.817331667, cab, hack, taxi, taxicab
DEBUG:root:Inference result: Timestamp 15/07/2020 15:03:18, Latitude: 12.662167123,longitudes: 77.817323697, school bus
DEBUG:root:Inference result: Timestamp 15/07/2020 15:10:19, Latitude: 12.662195167,longitudes: 77.817306167, window shade
DEBUG:root:Inference result: Timestamp 15/07/2020 15:10:24, Latitude: 12.662189833,longitudes: 77.817310167, window shade
DEBUG:root:Inference result: Timestamp 15/07/2020 15:10:30, Latitude: 12.662187,longitudes: 77.817309167, window shade
DEBUG:root:Inference result: Timestamp 15/07/2020 15:20:15, Latitude: 12.662201338,longitudes: 77.81728407, window shade
DEBUG:root:Inference result: Timestamp 15/07/2020 15:20:21, Latitude: 12.662205493,longitudes: 77.817279308, window shade
DEBUG:root:Inference result: Timestamp 15/07/2020 15:23:46, Latitude: 12.662190393,longitudes: 77.817313152, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:24:22, Latitude: 12.662193589,longitudes: 77.817314088, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:24:56, Latitude: 12.662181129,longitudes: 77.817317604, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:29:10, Latitude: 12.662172874,longitudes: 77.817336136, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:29:46, Latitude: 12.662174935,longitudes: 77.817339282, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:30:20, Latitude: 12.662169882,longitudes: 77.81735998, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:37:05, Latitude: 12.662178833,longitudes: 77.817332167, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:37:21, Latitude: 12.662179333,longitudes: 77.8173315, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:37:35, Latitude: 12.662179833,longitudes: 77.8173305, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:43:05, Latitude: 12.662137505,longitudes: 77.817406509, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:43:20, Latitude: 12.66214128,longitudes: 77.817379865, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:43:35, Latitude: 12.662150312,longitudes: 77.817359969, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:48:20, Latitude: 12.662165108,longitudes: 77.817357511, washbasin, handbasin, washbowl, lavabo, wash-hand basin
DEBUG:root:Inference result: Timestamp 15/07/2020 15:49:06, Latitude: 12.662169512,longitudes: 77.817346222, washbasin, handbasin, washbowl, lavabo, wash-hand basin
DEBUG:root:Inference result: Timestamp 15/07/2020 15:53:24, Latitude: 12.662167049,longitudes: 77.817410581, toilet tissue, toilet paper, bathroom tissue
DEBUG:root:Inference result: Timestamp 15/07/2020 15:53:49, Latitude: 12.662165091,longitudes: 77.817409529, toilet tissue, toilet paper, bathroom tissue
DEBUG:root:Inference result: Timestamp 15/07/2020 15:54:14, Latitude: 12.662165042,longitudes: 77.817383799, toilet tissue, toilet paper, bathroom tissue
DEBUG:root:Inference result: Timestamp 15/07/2020 15:58:26, Latitude: 12.661977051,longitudes: 77.817537249, pay-phone, pay-station
DEBUG:root:Inference result: Timestamp 15/07/2020 15:58:41, Latitude: 12.661982399,longitudes: 77.817544919, pay-phone, pay-station
DEBUG:root:Inference result: Timestamp 15/07/2020 15:58:57, Latitude: 12.661985346,longitudes: 77.817537843, pay-phone, pay-station
DEBUG:root:Inference result: Timestamp 15/07/2020 16:02:24, Latitude: 12.662165447,longitudes: 77.817300737, dining table, board
DEBUG:root:Inference result: Timestamp 15/07/2020 16:02:29, Latitude: 12.662204127,longitudes: 77.817205863, dining table, board
DEBUG:root:Inference result: Timestamp 15/07/2020 16:02:36, Latitude: 12.662239617,longitudes: 77.817155975, dining table, board
DEBUG:root:Inference result: Timestamp 15/07/2020 16:05:16, Latitude: 12.662130237,longitudes: 77.817389768, plate
DEBUG:root:Inference result: Timestamp 15/07/2020 16:05:22, Latitude: 12.662136548,longitudes: 77.817368135, plate
DEBUG:root:Inference result: Timestamp 15/07/2020 16:05:26, Latitude: 12.662119108,longitudes: 77.817370161, plate
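To make these logs useful for contact tracing, each line can be parsed back into a structured record. Below is a minimal sketch; the regular expression assumes the exact `Timestamp …, Latitude: …,longitudes: …` format shown above, and `parse_log_line` is a hypothetical helper name:

```python
import re
from datetime import datetime

# Matches the log format produced by the device, e.g.
# "DEBUG:root:Inference result: Timestamp 15/07/2020 10:40:26, Latitude: 12.66,longitudes: 77.81, laptop"
LOG_PATTERN = re.compile(
    r"Inference result: Timestamp (?P<ts>\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}), "
    r"Latitude: (?P<lat>[\d.]+),longitudes: (?P<lon>[\d.]+), (?P<labels>.+)"
)

def parse_log_line(line):
    """Turn one log line into (datetime, lat, lon, [labels]), or None if it doesn't match."""
    m = LOG_PATTERN.search(line)
    if m is None:
        return None
    ts = datetime.strptime(m.group("ts"), "%d/%m/%Y %H:%M:%S")
    labels = [s.strip() for s in m.group("labels").split(",")]
    return ts, float(m.group("lat")), float(m.group("lon")), labels

line = ("DEBUG:root:Inference result: Timestamp 15/07/2020 14:53:31, "
        "Latitude: 12.6621635,longitudes: 77.817331667, cab, hack, taxi, taxicab")
print(parse_log_line(line))
```

With records in this shape, the log becomes an ordinary time series of (time, position, object) events that downstream tools can query.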
Project Details:
Step 1: Configuring the Raspberry Pi environment
- Download and install Raspberry Pi OS (32-bit) Lite, and install fswebcam so the webcam can be used.
- Install the Amazon SageMaker Neo Deep Learning Runtime: download the neo-ai-dlr package and install it.
- Configure AWS IoT Greengrass on the Raspberry Pi (optional).
- Configure optimized machine learning inference using the AWS Management Console (optional).
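The package installs in Step 1 can be sketched as follows. This is only a sketch: the neo-ai-dlr project ships prebuilt wheels per board and Python version, so check its releases page for the exact wheel matching your Pi rather than relying on the plain pip install shown here.

```shell
# Update the system and install fswebcam for webcam capture
sudo apt-get update
sudo apt-get install -y fswebcam

# Install the DLR (Deep Learning Runtime) Python package.
# On a Raspberry Pi you may instead need the prebuilt wheel from the
# neo-ai-dlr releases, e.g. pip3 install <downloaded-dlr-wheel>.whl
pip3 install dlr
```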
Step 2: Amazon SageMaker and Neo
Now it's time to prepare your machine learning model. We use torchvision's ResNet18, pretrained on ImageNet, and compile it for the Neo Deep Learning Runtime.
Run the following code to compile the model:
!~/anaconda3/envs/pytorch_p36/bin/pip install torch==1.2.0 torchvision==0.4.0
Import ResNet18 from TorchVision
import torch
import torchvision.models as models
import tarfile
resnet18 = models.resnet18(pretrained=True)
input_shape = [1,3,224,224]
trace = torch.jit.trace(resnet18.float().eval(), torch.zeros(input_shape).float())
trace.save('model.pth')
with tarfile.open('model.tar.gz', 'w:gz') as f:
    f.add('model.pth')
Invoke Neo Compilation API
import boto3
import sagemaker
import time
from sagemaker.utils import name_from_base
role = sagemaker.get_execution_role()
sess = sagemaker.Session()
region = sess.boto_region_name
bucket = sess.default_bucket()
compilation_job_name = name_from_base('TorchVision-ResNet18-Neo')
model_key = '{}/model/model.tar.gz'.format(compilation_job_name)
model_path = 's3://{}/{}'.format(bucket, model_key)
boto3.resource('s3').Bucket(bucket).upload_file('model.tar.gz', model_key)
sm_client = boto3.client('sagemaker')
data_shape = '{"input0":[1,3,224,224]}'
target_device = 'rasp3b'  # compile for the Raspberry Pi 3 target, not a cloud instance
framework = 'PYTORCH'
framework_version = '1.2.0'
compiled_model_path = 's3://{}/{}/output'.format(bucket, compilation_job_name)
response = sm_client.create_compilation_job(
    CompilationJobName=compilation_job_name,
    RoleArn=role,
    InputConfig={
        'S3Uri': model_path,
        'DataInputConfig': data_shape,
        'Framework': framework
    },
    OutputConfig={
        'S3OutputLocation': compiled_model_path,
        'TargetDevice': target_device
    },
    StoppingCondition={
        'MaxRuntimeInSeconds': 300
    }
)
print(response)
# Poll every 30 sec
while True:
    response = sm_client.describe_compilation_job(CompilationJobName=compilation_job_name)
    if response['CompilationJobStatus'] == 'COMPLETED':
        break
    elif response['CompilationJobStatus'] == 'FAILED':
        raise RuntimeError('Compilation failed')
    print('Compiling ...')
    time.sleep(30)
print('Done!')
# Extract compiled model artifact
compiled_model_path = response['ModelArtifacts']['S3ModelArtifacts']
Congratulations, you have successfully compiled your torchvision model for the Neo Deep Learning Runtime. Now export the compiled model from your S3 bucket to the Raspberry Pi.
You should find the following files in the compiled model: .meta, .params, .so, and compiled_model.json.
Step 3: Running the compiled model on the Raspberry Pi
Make sure you have exported your compiled model from the S3 bucket to the Raspberry Pi, then use the following code to run it.
# import libraries
import time
import os
import logging
from datetime import datetime
from time import sleep

import numpy as np
from PIL import Image
from dlr import DLRModel
import gps
import utils
import RPi.GPIO as GPIO  # Raspberry Pi GPIO library

# Listen on port 2947 of gpsd
session = gps.gps("localhost", "2947")
session.stream(gps.WATCH_ENABLE | gps.WATCH_NEWSTYLE)

# Object classes that should trigger the vibration alert
watch_list = ['ATM', 'automated teller machine', 'sliding door', 'dog', 'bicycle',
              'bikini', 'cab', 'hack', 'taxi', 'taxicab', 'cellphone',
              'cellular telephone', 'coffee mug', 'computer keyboard', 'keypad',
              'desktop computer', 'dial telephone', 'school bus', 'gasmask',
              'gas helmet', 'jeep', 'landrover', 'laptop', 'mailbox', 'mask',
              'motor scooter', 'scooter', 'oxygen mask', 'pay-phone', 'restaurant',
              'eating place', 'ski mask', 'toilet seat', 'trailer truck',
              'vending machine', 'washbasin', 'handbasin', 'washbowl',
              'wash-hand basin', 'toilet tissue', 'bathroom tissue', 'mouse']

GPIO.setwarnings(False)                     # Ignore warnings for now
GPIO.setmode(GPIO.BOARD)                    # Use physical pin numbering
GPIO.setup(12, GPIO.OUT, initial=GPIO.LOW)  # Pin 12 drives the vibration motor, initially off

logging.basicConfig(filename='test-dlr.log', level=logging.DEBUG)

# Load the Neo-compiled model once at startup
model_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), '../model-rasp3b')
input_shape = {'input0': [1, 3, 224, 224]}  # batch, channels, height, width
output_shape = [1, 1000]                    # 1000 ImageNet classes
model = DLRModel(model_path, input_shape, output_shape, 'cpu')

synset_path = os.path.join(model_path, 'imagenet1000_clsidx_to_labels.txt')
with open(synset_path, 'r') as f:
    synset = eval(f.read())


def run_inference():
    dt_string = datetime.now().strftime("%d/%m/%Y %H:%M:%S")
    print("date and time =", dt_string)

    # Read a few reports from gpsd and keep the latest position fix
    locate = "Timestamp " + dt_string + ", Latitude: unknown," + "longitudes: unknown"
    for _ in range(4):
        rep = session.next()
        try:
            if rep["class"] == "TPV":
                print(str(rep.lat) + "," + str(rep.lon))
                locate = ("Timestamp " + dt_string + ", Latitude: " + str(rep.lat)
                          + "," + "longitudes: " + str(rep.lon))
        except Exception as e:
            print("Got exception " + str(e))

    # Use fswebcam to take a picture, scaled to the model's 224x224 input size
    os.system('fswebcam -r 1024x768 --no-banner --scale 224x224 output.jpg -S 7 '
              '--save /home/pi/Photos/std.jpg')
    image = Image.open('output.jpg')
    image_data = utils.transform_image(image)
    flattened_data = image_data.astype(np.float32).flatten()

    # Predict
    out = model.run({'input0': flattened_data}).squeeze()
    top1 = np.argmax(out)
    prob = np.max(out)

    # If the predicted label contains a watched class, vibrate for 10 seconds
    for item in watch_list:
        if synset[top1].find(item) != -1:
            print("Contains given substring")
            GPIO.output(12, GPIO.HIGH)  # Turn the vibration motor on
            sleep(10)                   # Vibrate for 10 seconds
            GPIO.output(12, GPIO.LOW)   # Turn it off

    print("Class: %s, probability: %f" % (synset[top1], prob))
    print(locate)
    logging.debug('Inference result: {}, {}'.format(locate, synset[top1]))


if __name__ == '__main__':
    while True:
        run_inference()
Output 1: The output from the Raspberry Pi.
If equipment such as an ATM or a doorknob is detected, the vibration motor starts vibrating.
Output 2: The complete location track, along with the user's contacts with equipment, is saved in a log file.
Log file output:
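The alert decision itself is just a substring match between the predicted label and a watch list, and it can be isolated into a small helper for testing. This is a sketch mirroring the logic in the script above; `should_vibrate` is a hypothetical name, and the short watch list here is illustrative.

```python
# Illustrative subset of the watch list used on the device
WATCH_LIST = ['ATM', 'doorknob', 'sliding door', 'cab', 'taxi',
              'toilet seat', 'vending machine', 'washbasin']

def should_vibrate(predicted_label, watch_list=WATCH_LIST):
    """Return True when the predicted class name contains any watched substring."""
    return any(item in predicted_label for item in watch_list)

print(should_vibrate('cash machine, cash dispenser, automated teller machine, ATM'))  # True
print(should_vibrate('window shade'))  # False
```

Keeping this predicate separate from the GPIO code makes it easy to unit-test the alert rules without hardware.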
DEBUG:root:Inference result: Timestamp 15/07/2020 15:20:21, Latitude: 12.662205493,longitudes: 77.817279308, window shade
DEBUG:root:Inference result: Timestamp 15/07/2020 15:23:46, Latitude: 12.662190393,longitudes: 77.817313152, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:24:22, Latitude: 12.662193589,longitudes: 77.817314088, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:24:56, Latitude: 12.662181129,longitudes: 77.817317604, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:29:10, Latitude: 12.662172874,longitudes: 77.817336136, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:29:46, Latitude: 12.662174935,longitudes: 77.817339282, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:30:20, Latitude: 12.662169882,longitudes: 77.81735998, gasmask, respirator, gas helmet
DEBUG:root:Inference result: Timestamp 15/07/2020 15:37:05, Latitude: 12.662178833,longitudes: 77.817332167, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:37:21, Latitude: 12.662179333,longitudes: 77.8173315, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:37:35, Latitude: 12.662179833,longitudes: 77.8173305, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:43:05, Latitude: 12.662137505,longitudes: 77.817406509, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:43:20, Latitude: 12.66214128,longitudes: 77.817379865, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:43:35, Latitude: 12.662150312,longitudes: 77.817359969, vending machine
DEBUG:root:Inference result: Timestamp 15/07/2020 15:48:20, Latitude: 12.662165108,longitudes: 77.817357511, washbasin, handbasin, washbowl, lavabo, wash-hand basin
DEBUG:root:Inference result: Timestamp 15/07/2020 15:49:06, Latitude: 12.662169512,longitudes: 77.817346222, washbasin, handbasin, washbowl, lavabo, wash-hand basin
DEBUG:root:Inference result: Timestamp 15/07/2020 15:53:24, Latitude: 12.662167049,longitudes: 77.817410581, toilet tissue, toilet paper, bathroom tissue
DEBUG:root:Inference result: Timestamp 15/07/2020 15:53:49, Latitude: 12.662165091,longitudes: 77.817409529, toilet tissue, toilet paper, bathroom tissue
DEBUG:root:Inference result: Timestamp 15/07/2020 15:54:14, Latitude: 12.662165042,longitudes: 77.817383799, toilet tissue, toilet paper, bathroom tissue
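Once two users' logs are parsed into (time, latitude, longitude) records, contact tracing reduces to finding entries that are close in both time and space. Below is a minimal sketch using the haversine formula; the 10 m and 15-minute thresholds and the function names are illustrative assumptions, not part of the device's software.

```python
import math
from datetime import datetime, timedelta

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def possible_contacts(log_a, log_b, max_m=10.0, max_dt=timedelta(minutes=15)):
    """Pairs of timestamps where the two logs are close in both space and time.

    Each log is a list of (datetime, lat, lon) tuples.
    """
    hits = []
    for ts_a, lat_a, lon_a in log_a:
        for ts_b, lat_b, lon_b in log_b:
            if abs(ts_a - ts_b) <= max_dt and haversine_m(lat_a, lon_a, lat_b, lon_b) <= max_m:
                hits.append((ts_a, ts_b))
    return hits
```

For real deployments the pairwise scan would be replaced by a spatial index, but the thresholds and distance function stay the same.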