I built a proof-of-concept device to improve accessibility in robotics using augmented reality. I used an Arduino kit and Facebook AR Studio to control a littleBits Droid. My goal is to encourage everyone to think about Industry 4.0 as expanding the reach of technology so that more people can participate in this digital transformation.
Story

When I first learned about the Arduino / Distrelec: Automation & Robotics Contest here on Hackster, I started researching Industry 4.0.
How can we advance our robots and manufacturing systems using augmented reality technologies? How can augmented reality help everyone? Then I started to think about the word 'EVERYONE'....
Everyone is you... and me, and that guy, that girl, that person. Everyone means all of us, including minorities and people with disabilities. Globally.
According to disabilitystatistics.org, in 2016 an estimated 12.8 percent (plus or minus 0.05 percentage points) of the non-institutionalized population of the United States, across all ages, genders, races, ethnicities, and education levels, reported a disability.
I like what Satya Nadella, CEO of Microsoft said: "Accessibility is an amazing example. We live in a golden age where technology can enable more people to participate in our technology and in our society."
Droids

I started to look into the robots we have at home. What is available out there that my kids can play with today? Rather than trying to solve the world's problems, I started to look around me. What robotic systems do I have at home?
Last Christmas, my kids got the littleBits Droid Kit as a present. Here's a video of my son building it. It's a really interesting device for teaching kids about electronics and programming. The tutorial is very kid-friendly. I highly recommend it.
When I saw this droid, it got me thinking about how to improve it. There are kids with disabilities who are left out and can't use this droid. How can I, as a parent and as an engineer, improve this device? How can I make it accessible?
My research and the projects I post here on Hackster focus on the convergence of the Internet of Things, augmented reality, and artificial intelligence: the promise of ubiquitous computing. We use IoT to collect data, AI to find patterns in that data, and augmented reality to visualize and interact with it.
Facebook AR Studio

For the past few months, I have been studying Facebook AR Studio, a tool for creating augmented reality experiences on the Facebook platform. It's a way to bring imagination to life with cutting-edge creative tools, and an easy way to share what you build with friends and followers worldwide. It lets you augment the world around you using trackers, data, animation, and more to create interactive, shareable effects that respond to people and objects in their surroundings.
Creativity, that's the key word. In order to advance our robotic systems, we need creative tools around us. We need to be creative.
We need to augment our world to make it accessible. Accessibility for EVERYONE! With ubiquitous computing, even a person with limited cognitive ability can easily understand data and benefit from it.
Then it hit me.
How can augmented reality and artificial intelligence make Industry 4.0 flexible enough to accommodate the needs of a person with a disability?
How can I use augmented reality to control a Droid, a robotic system?
littleBits

Instead of using the Droid Control Hub Bit, I replaced it with the Arduino Bit, a Wireless Transmitter, and a Wireless Receiver. The Wireless Receiver connects to the motors of the Droid.
I uploaded Standard Firmata to the Arduino and used Johnny-Five to communicate with the littleBits Arduino.
Preparing the Arduino

I followed this tutorial from this GitHub link.
Johnny-Five communicates with the Arduino using the Firmata protocol, so you'll need to install Firmata on the Arduino as a one-off step before you can start programming the board:
- Download the Arduino IDE
- Connect the Arduino module to the computer using USB
- The module does not get power via USB, so you'll also need to connect a (blue) power module to any of the 3 inputs on the Arduino module (d0, a0, or a1)
- Open the Arduino IDE and select 'Arduino Leonardo' under the Tools > Board menu
- Select the Serial port for your board under Tools > Serial Port. It will look something like /dev/tty.usb(...) on Mac, /dev/ttyUSB(...) on Linux or COM... on Windows.
- Open File > Examples > Firmata > StandardFirmataPlus
- Click the 'Upload' button to send the Firmata program to the Arduino
- Wait until the status bar at the bottom of the Arduino IDE window says 'Done uploading' then close the Arduino IDE. Your Arduino is ready to go!
Here's a quick introduction to Node-RED:
I installed Node-RED on my machine. I followed the instructions here.
https://nodered.org/docs/getting-started/installation
After installation, I opened the browser, navigated to User Settings -> Palette, and installed the node-red-node-arduino node.
Here's some information about the Arduino node:
https://flows.nodered.org/node/node-red-node-arduino
It will add these nodes to your palette.
I assembled these nodes to create an API in Node-RED. Here's what it looks like.
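As a rough sketch of the flow's shape only: the node types below are real Node-RED/node-red-node-arduino types, but the IDs, wiring, pin numbers, and placeholder function body are my assumptions, not the actual export (which is linked later in this post).

```json
[
  { "id": "api_in",     "type": "http in",      "url": "/command/:name", "method": "get",
    "wires": [["map_cmd", "reply"]] },
  { "id": "map_cmd",    "type": "function",     "name": "map command", "outputs": 3,
    "func": "/* the switch shown below */",
    "wires": [["motor_out"], ["turn_out"], ["scream_out"]] },
  { "id": "motor_out",  "type": "arduino out",  "pin": "9",  "state": "PWM" },
  { "id": "turn_out",   "type": "arduino out",  "pin": "10", "state": "SERVO" },
  { "id": "scream_out", "type": "arduino out",  "pin": "1",  "state": "OUTPUT" },
  { "id": "reply",      "type": "http response" }
]
```

The key idea is the fan-out: one http in node feeds a function node with three outputs (motor, turn, scream), each wired to its own arduino out node, while a separate wire answers the HTTP request.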
The http node looks like this. The URL has the form http://website/command/{name}.
Here's the code in the function node.
The commands are forward, stop, reverse, left, right, straight, and scream. Each one sends the right payload to the motor and the turning wheel.
var command = msg.payload;
var motor = { payload: 0 };
var turn = { payload: 0 };

// set the motor and turn payloads for the requested command
switch (command) {
    case "forward":
        motor.payload = 255;
        turn = null;
        break;
    case "stop":
        motor.payload = 110;
        turn = null;
        break;
    case "reverse":
        motor.payload = 0;
        turn = null;
        break;
    case "left":
        motor.payload = 130;
        turn.payload = 135;
        break;
    case "right":
        motor.payload = 130;
        turn.payload = 45;
        break;
    case "straight":
        motor.payload = 110;
        turn.payload = 90;
        break; // without this break, "straight" would fall through into "scream"
    case "scream":
        // scream only uses the third output (a digital signal)
        return [null, null, { payload: true }];
    default:
        motor.payload = 110;
        turn.payload = 90;
}
return [motor, turn, { payload: false }];
In case you're wondering what the scream command does: it sends a digital signal that turns on the MP3 bit to play an R2-D2 screaming sound.
I exported all the Node-RED nodes; the export is at the link below. To import it, go to the hamburger menu, then Import -> Clipboard, paste the code, and click the Import button.
For me, this is one of the easiest ways to teach someone what a web API means: an http node that serves up commands that make an Arduino do something.
To test this, you can actually go to your command shell and run this command
> curl http://localhost:1880/command/forward
The result would look like this
{ "Action": "forward" }
In order to make my Node-RED API accessible from the web, I use ngrok to expose the port. You can download ngrok from here.
To run ngrok, type
> ./ngrok http 1880
This will open a tunnel that looks something like this
Session Status online
Account Ron Dagdag (Plan: Free)
Version 2.2.8
Region United States (us)
Web Interface http://127.0.0.1:4040
Forwarding http://4a471498.ngrok.io -> localhost:1880
Forwarding https://4a471498.ngrok.io -> localhost:1880
Connections ttl opn rt1 rt5 p50 p90
0 0 0.00 0.00 0.00 0.00
Now you can type
> curl http://4a471498.ngrok.io/command/forward
{ "Action": "forward" }
Facebook AR Studio

Currently, Facebook AR Studio only runs on the Mac; I'm not sure when they will release a Windows version. You can download it here.
https://developers.facebook.com/products/ar-studio
Here's a link to a quick tutorial guide.
https://developers.facebook.com/docs/ar-studio/tutorials/quick-start-guide
Here are some basics of scripting in AR Studio
https://developers.facebook.com/docs/ar-studio/tutorials/basics-of-scripting
I'm going to skip how I created the helmet; follow the video tutorials on how to use Facebook AR Studio.
While reading the AR Studio documentation, I discovered its networking capabilities: the Networking module can fetch from and post to REST endpoints.
https://developers.facebook.com/docs/ar-studio/reference/networking_module
Here's the script I used
//Header
// Created by Ron Dagdag
//Copyright 2018-present
//All rights reserved.
//This source code is licensed under the license found in the
//LICENSE file in the root directory of this source tree.
var Animation = require('Animation');
var FaceTracking = require('FaceTracking');
var Scene = require('Scene');
const Reactive = require('Reactive');
const Networking = require('Networking');
const Diagnostics = require('Diagnostics');
const FaceGestures = require('FaceGestures');
var server = 'http://4a471498.ngrok.io';
var urlForward = server + '/command/forward';
var urlStop = server + '/command/stop';
var urlReverse = server + '/command/reverse';
var urlLeft = server + '/command/left';
var urlRight = server + '/command/right';
var urlStraight = server + '/command/straight';
var face = FaceTracking.face(0);
var ft = Scene.root.child("Device").child("Camera").child("Focal Distance").child("facetracker0");
var lightning = ft.child("lightningForward");
lightning.hidden = FaceGestures.hasMouthOpen(face).not();
var lightningLeft = ft.child("lightningLeft");
lightningLeft.hidden = FaceGestures.isLeanedLeft(face).not();
var lightningRight = ft.child("lightningRight");
lightningRight.hidden = FaceGestures.isLeanedRight(face).not();
var lightningReverse = ft.child("lightningReverse");
lightningReverse.hidden = FaceGestures.hasEyebrowsRaised(face).not();
FaceGestures.isLeanedLeft(face).monitor().subscribe(function(changedValue) {
Diagnostics.log(changedValue);
if (changedValue.newValue) {
Diagnostics.log('Left Face!');
Networking.fetch(urlLeft);
} else{
Networking.fetch(urlStraight);
}
});
FaceGestures.isLeanedRight(face).monitor().subscribe(function(changedValue) {
if (changedValue.newValue) {
Diagnostics.log('Right Face!');
Networking.fetch(urlRight);
} else{
Networking.fetch(urlStraight);
}
});
FaceGestures.isLeanedBack(face).monitor().subscribe(function(changedValue) {
if (changedValue.newValue) {
Diagnostics.log('Forward Face!');
Networking.fetch(urlStraight);
} else{
Networking.fetch(urlStop);
}
});
FaceGestures.hasEyebrowsRaised(face).monitor().subscribe(function(changedValue) {
if (changedValue.newValue) {
Diagnostics.log('Eyebrow Raised!');
Networking.fetch(urlReverse);
} else
{
Networking.fetch(urlStop);
}
});
FaceGestures.hasEyebrowsFrowned(face).monitor().subscribe(function(changedValue) {
if (changedValue.newValue) {
Diagnostics.log('Eyebrow Frowned!');
Networking.fetch(urlForward);
} else
{
Diagnostics.log('Eyebrow Normal!');
Networking.fetch(urlStop);
}
});
FaceGestures.hasMouthOpen(face).monitor().subscribe(function(changedValue) {
if (changedValue.newValue) {
Diagnostics.log('Mouth open!');
Networking.fetch(urlForward);
} else {
Diagnostics.log('Mouth closed!');
Networking.fetch(urlStop);
}
});
The code is written in JavaScript; there's also a way to do this in the Patch Editor.
Each time the mouth opens, the script sends a GET request to urlForward.
To make sure the URL is accepted, go to Project -> Edit Properties... -> Capabilities -> Networking and add the ngrok URL to the whitelisted domains. This allows Facebook AR Studio to call the site.
If you want to test it inside Facebook AR Studio, click the RUN icon to enable the script. You'll see messages in the log, and you'll also see the requests ngrok is receiving.
Here's the video:
What I Learned

Through simple experiments and proofs of concept, we can inspire builders, makers, and technologists to improve our technologies so more people can participate in our society. Industry 4.0 is not just about making industry faster, cheaper, more flexible, and more efficient. It's also about expanding the reach of technology to help people with disabilities participate in this digital revolution.
If this project made you interested in augmented reality, the littleBits Droid, Facebook AR Studio, ngrok, Node-RED, or especially Arduino, click the Thumbs Up button and follow my projects. Ping me if you'd like to see this incorporated with Amazon Echo to make the Droid scream and dance.