"While it maybe incredibly tempting to snag a sugary treat from your co-worker's stash, think of the number of hands that dug into the candy dish in search of sugar fix."
Now you can get your candies, with nothing being touched at all.
Introduction

Whenever you walk into an office, there is always candy on the table, but we never really know whether it's meant for us. And if you knew how many hands had been digging into the candy jar, you probably wouldn't take a candy in the first place. To make this a more welcoming experience, we can use a proximity sensor to drive a mechanical arm that delivers the candy to us personally.
At the time of this writing there is no guide on interfacing the Kemet SS-430 with a Raspberry Pi, so this will be one of the first articles to do so. We can integrate it through an ADC, or simply use a GrovePi shield, which already has an ADC built in.
Starting with a clean microSD card, we first need to flash Raspberry Pi OS onto it so the Pi has an operating system to boot from.
https://www.raspberrypi.org/downloads/
With the OS installed, we can now see the Pi running.
Alternatively, Ubuntu MATE would also work for this project, but for this example we will stick with Raspberry Pi OS.
Step 2: Install the Kemet Sensor on the Raspberry Pi

Now that the OS is installed, we need the drivers for the GrovePi shield, since the ADC and the rest of the supporting hardware live on that board. You can install them via:
git clone https://github.com/DexterInd/GrovePi
cd GrovePi
curl -kL dexterindustries.com/update_grovepi | bash
After that we need to flash the latest firmware:
cd Firmware
bash firmware_update.sh
The pin assignment runs from right to left, with pin 1 being power, pin 5 being GND, and pin 4 being Vout.
The wiring to the GrovePi shield should look like the diagram below; the shield already includes the ADC (analog-to-digital converter) that we need.
Or like this, in a photo:
The SS-430's sensing element is ultra sensitive, so we need a lens cap around it to make this work.
The bare sensor is so sensitive that even standing near it makes the signal pulse, reacting like the following:
Fortunately, GeoNomad has built a 3D-printable lens case for the unit, which lets us experiment with different lens caps.
We've tried acrylic and glass, but it turns out simple Scotch tape does the trick and makes the sensor much more stable. If you print the case, make sure you use 100% infill.
After this, the sensor readings become much more accurate, and the spurious pulses we were seeing are completely gone.
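If some jitter still survives the physical lens cap, it can also be tamed in software. Below is a minimal sketch of a moving-average filter over the raw ADC readings; the function name and window size are our own choices, not part of the GrovePi library:

```python
from collections import deque

def smoothed(readings, window=5):
    """Return a moving average of the raw ADC readings.

    A small sliding window knocks down single-sample spikes
    while still letting a real, sustained presence through.
    """
    buf = deque(maxlen=window)
    out = []
    for r in readings:
        buf.append(r)
        out.append(sum(buf) / len(buf))
    return out

# A lone spike of 10 gets flattened toward the baseline:
print(smoothed([0, 0, 10, 0, 0]))
```

You could feed each `grovepi.analogRead(sensor)` value through a buffer like this before comparing it against the threshold used later in the guide.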
Now we can run the following code to read the sensor values on the Raspberry Pi:
import os
import time
import grovepi

# Sensor connected to the A0 port
sensor = 14  # pin 14 is the A0 port
led = 4

grovepi.pinMode(sensor, "INPUT")
grovepi.pinMode(led, "OUTPUT")  # the LED pin is the output (the original set the sensor pin twice)

while True:
    try:
        sensor_value = grovepi.analogRead(sensor)
        print("sensor_value = %d" % sensor_value)
        if sensor_value > 200:
            grovepi.digitalWrite(led, 1)
        else:
            grovepi.digitalWrite(led, 0)
        time.sleep(.5)
    except IOError:
        print("Error")
You should now see the sensor values coming from the Kemet SS-430.
This part is a bit tricky, as Dobot does not officially support the Pi; however, I found that https://github.com/nanusefue/dobotMagician can drive the device. We need to install the following:
git clone https://github.com/nanusefue/dobotMagician.git
sudo apt-get install qt5-default
sudo apt-get install qtcreator
sudo apt-get install libqt5serialport5-dev
After that, we should have basic control over the Dobot arm from the Raspberry Pi by running Script.py --Json data2.json.
Technically we can train the arm to pick up just about anything, but for this guide let's use it to pick up candies. To do that we first have to install the gripper, which is part of Dobot's kit. GP3 connects to the arm's connector #1.
When that's done, the back should look like this.
We can record the robot's movements in DobotStudio on Windows; this captures the exact motion and saves it into a playback file.
Save the file as an XML playback file, transfer it to the Raspberry Pi, and run the following command to convert it into a JSON file:
parserv2.py -Xml candydemo.playback -json candydemo.json
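The converted JSON is an ordered map of moves, keyed by row number, whose fields (MotionStyle, X/Y/Z/R, PauseTime) are the ones consumed by the script in the next step. Here is a minimal sketch of loading and walking such a file; the sample values are purely illustrative, not output from a real conversion:

```python
import json
from collections import OrderedDict

# Illustrative sample only -- real files come out of parserv2.py.
sample = """{
  "0": {"MotionStyle": "MOVJ", "X": "230.0", "Y": "0.0", "Z": "50.0", "R": "0.0", "PauseTime": 0},
  "1": {"MotionStyle": "MOVL", "X": "200.0", "Y": "60.0", "Z": "20.0", "R": "0.0", "PauseTime": 0}
}"""

# object_pairs_hook=OrderedDict preserves move order, just as Script.py does
data = json.loads(sample, object_pairs_hook=OrderedDict)
for key, move in data.items():
    print(key, move["MotionStyle"], move["X"], move["Y"], move["Z"])
```

Preserving the order matters: the arm must replay the moves in the sequence they were recorded.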
Next we can write a Python file on the Raspberry Pi that reproduces the same movement.
import threading
import DobotDllType as dType
import time
import argparse
import json
from pprint import pprint
from collections import OrderedDict

class ScriptRobot():

    global CON_STR
    CON_STR = {
        dType.DobotConnect.DobotConnect_NoError: "DobotConnect_NoError",
        dType.DobotConnect.DobotConnect_NotFound: "DobotConnect_NotFound",
        dType.DobotConnect.DobotConnect_Occupied: "DobotConnect_Occupied"}

    def __init__(self, Json):
        self.Json = Json
        self.api = dType.load()
        self.state = ""

    def Connect(self):
        # Connect to the Dobot over serial
        self.state = dType.ConnectDobot(self.api, "", 115200)[0]
        dType.GetDeviceSN(self.api)
        dType.GetDeviceName(self.api)
        dType.GetDeviceVersion(self.api)
        dType.GetDeviceWithL(self.api)
        dType.GetPoseL(self.api)
        dType.GetKinematics(self.api)
        #dType.GetHOMEParams(self.api)
        print("Connect status:", CON_STR[self.state])
        if self.state == dType.DobotConnect.DobotConnect_NoError:
            dType.SetQueuedCmdClear(self.api)
            return True
        else:
            dType.DisconnectDobot(self.api)
            return False

    """
    def _MOVJ(self, data):
        dType.SetPTPCmd(self.api, dType.PTPMode.PTPMOVJXYZMode, float(data['X']), float(data['Y']), float(data['Z']), float(data['R']), isQueued=1)

    def _MOVL(self, data):
        dType.SetPTPCmd(self.api, dType.PTPMode.PTPMOVLXYZMode, float(data['X']), float(data['Y']), float(data['Z']), float(data['R']), isQueued=1)

    def _ARC(self, value, data):
        dType.SetARCCmd(self.api, [float(data['X']), float(data['Y']), float(data['Z']), 0], [float(data['_X']), float(data['_Y']), float(data['_Z']), 0], isQueued=1)

    def moveTypes(self, value, data):
        if value == "MOVJ":
            return self._MOVJ(data)
        elif value == "MOVL":
            return self._MOVL(data)
        elif value == "ARC":
            return self._ARC(value, data)
    """

    def ParserMove(self):
        dType.SetQueuedCmdClear(self.api)
        json_data = open(self.Json)
        data = json.load(json_data, object_pairs_hook=OrderedDict)
        # def SetPTPCoordinateParams(api, xyzVelocity, xyzAcceleration, rVelocity, rAcceleration, isQueued=0):
        for move in data:
            # print("TEST_:" + data[move]['MotionStyle'], data[move]['Row'])
            if data[move]['PauseTime'] != 0:
                lastIndex = dType.SetWAITCmd(self.api, float(data[move]['PauseTime']), isQueued=1)[0]
            if data[move]['MotionStyle'] == "MOVJ":
                lastIndex = dType.SetPTPCmd(self.api, dType.PTPMode.PTPMOVJXYZMode, float(data[move]['X']), float(data[move]['Y']), float(data[move]['Z']), float(data[move]['R']), isQueued=1)[0]
                dType.SetEndEffectorSuctionCup(self.api, 1, 0, isQueued=1)
            if data[move]['MotionStyle'] == "MOVL":
                lastIndex = dType.SetPTPCmd(self.api, dType.PTPMode.PTPMOVLXYZMode, float(data[move]['X']), float(data[move]['Y']), float(data[move]['Z']), float(data[move]['R']), isQueued=1)[0]
            if data[move]['MotionStyle'] == "MOVJANGLE":
                lastIndex = dType.SetARCCmd(self.api, [float(data[move]['X']), float(data[move]['Y']), float(data[move]['Z']), 0], [float(data[move]['_X']), float(data[move]['_Y']), float(data[move]['_Z']), 0], isQueued=1)[0]
        dType.SetQueuedCmdStartExec(self.api)
        while lastIndex > dType.GetQueuedCmdCurrentIndex(self.api)[0]:
            # dType.GetPose(self.api)  # get the robot's current position
            dType.dSleep(100)
        dType.SetQueuedCmdStopExec(self.api)
        dType.GetKinematics(self.api)
        dType.DisconnectDobot(self.api)

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description='Script Robot')
    parser.add_argument('--Json', required=True, help='File name of the exported JSON')
    args = parser.parse_args()
    R = ScriptRobot(args.Json)
    R.Connect()
    R.ParserMove()
On success, we can see something like this straight from the Raspberry Pi by running the command below. Note: you may have to add
<item_6>0.0</item_6>
<item_7>0.0</item_7>
<item_8>0.0</item_8>
<item_9>0.0</item_9>
into the playback file manually, because the automatically generated files do not include empty fields. After that, copy the file into the PythonMove folder and run the following:
$ python3 Script.py --Json candydemo.json
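If you record many playback files, the manual padding step above gets tedious. It could be automated with the standard library; a minimal sketch, assuming each move is a direct child element of the playback root (the function name and the inline sample are ours, not from the Dobot tooling):

```python
import xml.etree.ElementTree as ET

def pad_playback_items(xml_text, first=6, last=9, fill="0.0"):
    """Ensure every row element carries <item_6>..<item_9>,
    inserting '0.0' placeholders where DobotStudio left them out."""
    root = ET.fromstring(xml_text)
    for row in root:
        present = {child.tag for child in row}
        for i in range(first, last + 1):
            tag = "item_%d" % i
            if tag not in present:
                ET.SubElement(row, tag).text = fill
    return ET.tostring(root, encoding="unicode")

# Tiny illustrative playback fragment, missing item_6..item_9:
sample = "<PlayBack><row0><item_0>1</item_0></row0></PlayBack>"
patched = pad_playback_items(sample)
print(patched)
```

Run this over the .playback file before handing it to parserv2.py, and the empty fields are filled in consistently.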
Now we can use the sensor to trigger the bot; at the same time we can light the LED to indicate when the robotic arm has been activated by the sensor.
count = 0
last_value = 0
while True:
    sensor_value = grovepi.analogRead(sensor)
    print("sensor_value = %d" % sensor_value)
    if sensor_value > 200:
        count += 1
    if sensor_value <= 200 and last_value > 200:
        count = 0  # reading dropped back below the threshold: reset the count
    if count > 2:
        count = 0
        grovepi.digitalWrite(led, 1)  # send HIGH to switch on the LED
        print('called robotic arm')
        os.system('python3 Script.py --Json candybot.json')
        grovepi.digitalWrite(led, 0)  # send LOW to switch off the LED
    last_value = sensor_value
    time.sleep(.5)
Let me explain this code a little. We wait for several consecutive readings above the threshold before activating the robotic arm, to make sure a single false positive cannot trigger it. We then call Script.py via os.system('python3 Script.py --Json candybot.json'), which drives the arm exactly as in the previous step.
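The trigger rule can be checked on its own, without any hardware attached. Here is a hedged sketch that replays a list of recorded ADC values through the same counting logic as the loop above (the function name and the replayed values are ours):

```python
def trigger_decider(readings, threshold=200, needed=3):
    """Replay ADC readings through the candy-bot trigger rule.

    The arm fires only after `needed` consecutive readings above
    `threshold`; the counter resets when the signal drops back below.
    Returns the indices at which the arm would have been triggered.
    """
    count, last, fired = 0, 0, []
    for i, value in enumerate(readings):
        if value > threshold:
            count += 1
        if value <= threshold and last > threshold:
            count = 0  # signal fell below threshold: start over
        if count >= needed:
            count = 0
            fired.append(i)  # this is where os.system(...) would run
        last = value
    return fired

# One spike is ignored; three sustained highs fire once:
print(trigger_decider([0, 250, 0, 250, 250, 250]))
```

Replaying captured traces like this is a cheap way to tune the threshold and the required streak length before letting the code loose on the real arm.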
Step 6: Tracking Your Candies Through Firebase

Since the Pi is already connected to Wi-Fi, we can build a database on Google's Firebase to track how many candies people have been taking from your stash. For this example we will use firebase-admin.
We first need to sign up for Firebase at https://console.firebase.google.com/
When that's done we should see the following message; Google Analytics is optional.
Next, let's create a database for those who take our candy.
Once the database is created, you should see the following.
You can also get your service account key JSON from Service accounts under Project settings; it is needed for the Python SDK.
Next we need to install firebase-admin:
$ pip install firebase-admin
To write a record, add the following Python code to our project:
from datetime import datetime

import firebase_admin
from firebase_admin import credentials
from firebase_admin import db

cred = credentials.Certificate("path/to/serviceAccountKey.json")

# Initialize the app with a service account, granting admin privileges
firebase_admin.initialize_app(cred, {
    'databaseURL': 'https://yourfirebase.firebaseio.com/'
})

ref = db.reference('/candy')
ref.set({'count': 1, 'time': datetime.now().strftime("%d-%b-%Y (%H:%M:%S.%f)")})
When this is done, we should see each candy event stored in Firebase over the Wi-Fi network, and you can track all of your co-workers' candy grabs by their timestamps.
Now that all the hard work is done, it's demo time! You can build this project and make your visitors feel more welcome by offering them candy this way. Make sure your Wi-Fi is connected so the counts get recorded.
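Building the record payload is easy to separate from the network call, which also makes the timestamp format testable offline. A small sketch; the helper name and the injectable `now` parameter are our additions, not part of firebase-admin:

```python
from datetime import datetime

def candy_record(count, now=None):
    """Build the dict pushed to the /candy reference.

    `now` can be injected for testing; by default the current
    time is formatted the same way as in the guide's snippet.
    """
    now = now or datetime.now()
    return {"count": count, "time": now.strftime("%d-%b-%Y (%H:%M:%S.%f)")}

# Example with a fixed timestamp:
print(candy_record(3, datetime(2020, 1, 2, 3, 4, 5)))
```

In the main loop you would then call `ref.set(candy_record(count))` right after the arm runs, keeping the Firebase write in one place.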