This project covers the development of a fully functional mobile prototype, the "Rover Station", responsible for capturing environmental data such as temperature, humidity, and luminosity. The idea is to add other functions/parts to this prototype in the future, to really get a Mars Rover emulator. This prototype is for educational purposes only and was part of my Capstone Project for the Coursera - University of California, Irvine course "An Introduction to Programming the Internet of Things (IoT)".
User & Design considerations:
- The Rover will be remote controlled by an Android device with Bluetooth capability. The intention is that the station will move along a path where data must be captured. Data will be captured and transmitted continuously, whether the Rover is stationary or moving.
- The user should receive visual feedback (a live video stream) from the Rover.
- The captured data will be analyzed through a public website (in this case: thingspeak.com).
- The data will be available to users in graph and table format.
- Pre-defined tweet alarms will be generated locally by the station or by the website.
- The Rover will have some autonomous capability to avoid obstacles, in order to protect itself in case of bad controlling by the user.
- Common "off-the-shelf" components will be used and the total cost should be under USD 150.
Based on the project requirements, 2 options were considered: a single processor responsible for all tasks (in this case, a Raspberry Pi), or dual processors with the requirements "split" between them (Arduino and RPi):
1. Processor 1: RPi-2
- Responsible for data capture
- Web communication
- Streaming Video
- Send social media messages
2. Processor 2: Arduino
- Motor control (movement and camera positioning)
- Obstacle avoidance
- Remote Control communication
In terms of cost, using 2 processors is in fact less expensive than the single-processor option. This is because the Arduino is a very cheap item, less expensive than the RPi HAT that would be necessary to run the servos with the RPi alone. Another difference is the BT module. For the Arduino, a very cheap HC-06 Bluetooth 3.0 slave module can be used, costing half the price of the BT dongle that would have to be added to the RPi. So, the dual-processor option was chosen.
Step 2: BoM - Bill of Material

Step 3: Installing the Video Streaming

Install pip:
sudo apt-get install python-pip
Install the picamera library:
pip install picamera
Install the flask Python library:
sudo pip install flask
Download Miguel’s Flask video streaming project:
git clone https://github.com/miguelgrinberg/flask-video-streaming.git
In the project folder, edit the app.py file and comment out this line:
#from camera import Camera
Uncomment this line:
from camera_pi import Camera
Save the app.py file.
Run ifconfig to find out the local IP address of your Raspberry Pi ("yourLocalIPaddress"). Start the Flask server by running this command:
python app.py
A message will be printed on the monitor:
Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
Open up a web browser and go to this address:
“yourLocalIPaddress”:5000
Step 4: Install DHT11 sensor at RPi

First, get the library from GitHub:
git clone https://github.com/adafruit/Adafruit_Python_DHT.git
Installing the Library:
sudo apt-get update
sudo apt-get install build-essential python-dev python-openssl
cd /home/pi/Adafruit_Python_DHT
sudo python setup.py install
Test the sensor by running the file AdafruitDHT.py from the monitor. Enter as parameters: 11 (sensor DHT11) and 4 (GPIO where the sensor is connected):
sudo python /home/pi/Adafruit_Python_DHT/examples/AdafruitDHT.py 11 4
The result should be the temperature and humidity read by the sensor.
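To use the library from your own Python code rather than the example script, a minimal sketch might look like the one below (assumptions: the Adafruit_DHT package installed above, and the sensor on GPIO 4 as in the test command):

```python
def format_reading(humidity, temperature):
    # The Adafruit library returns None for a failed read; handle that first
    if humidity is None or temperature is None:
        return "Failed to read sensor, try again"
    return "Temp={0:.1f}C  Humidity={1:.1f}%".format(temperature, humidity)

if __name__ == "__main__":
    import Adafruit_DHT
    # read_retry keeps polling until it gets a valid reading (DHT11 on GPIO 4)
    humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT11, 4)
    print(format_reading(humidity, temperature))
```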
Step 5: Sending Data to Web

For the basic setup of the DHT11 sensor with the RPi, and for sending data to the internet, great help came from this tutorial:
Plotting DHT11 sensor data at ThingSpeak.com using Raspberry Pi
From what was learned, the important points are:
- Setting a Channel at ThingSpeak
- Run the Python code below for tests
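The ThingSpeak upload in the test code boils down to a single HTTP GET against the channel's update endpoint. A minimal sketch of that call (the Write API Key is a placeholder for your own channel, and the field1/field2 assignment is an assumption matching a temperature/humidity channel):

```python
from urllib.parse import urlencode
from urllib.request import urlopen

THINGSPEAK_URL = "https://api.thingspeak.com/update"
WRITE_API_KEY = "XXXXXXXXXXXXXXXX"  # placeholder: your channel's Write API Key

def build_update_url(api_key, temp_c, humidity):
    # ThingSpeak's update endpoint takes the write key plus field1..field8
    params = urlencode({"api_key": api_key, "field1": temp_c, "field2": humidity})
    return "%s?%s" % (THINGSPEAK_URL, params)

def send_to_thingspeak(temp_c, humidity):
    # ThingSpeak answers with the new entry id (or "0" if the update was rejected)
    with urlopen(build_update_url(WRITE_API_KEY, temp_c, humidity)) as resp:
        return resp.read().decode()
```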
Step 6: Adding a digital light sensor

The general idea for this step of the project was learned from:
An LDR and a capacitor were connected to GPIO24. If there is light, GPIO24 will return HIGH; without light, LOW. The Python code used in this test:
Step 7: Adding an analog Light Intensity sensor

The next step was to get light intensity data. To add the LDR to the RPi, the best approach is to convert the analog signal from the sensor to a digital value using an external ADC (Analog to Digital Converter); the RPi does not have an internal ADC as the Arduino does. If you do not have an ADC, a good approximation is to use a capacitor charging/discharging technique. The "Raspberry Pi Cookbook" gives the solution (note that instead of the potentiometer, an LDR can be used):
import RPi.GPIO as GPIO
import time

GPIO.setmode(GPIO.BCM)
a_pin = 25
b_pin = 23

def discharge():
    GPIO.setup(a_pin, GPIO.IN)
    GPIO.setup(b_pin, GPIO.OUT)
    GPIO.output(b_pin, False)
    time.sleep(0.005)

def charge_time():
    GPIO.setup(b_pin, GPIO.IN)
    GPIO.setup(a_pin, GPIO.OUT)
    count = 0
    GPIO.output(a_pin, True)
    # count how many loop iterations the capacitor takes to charge
    while not GPIO.input(b_pin):
        count = count + 1
    return count

def analog_read():
    discharge()
    return charge_time()

while True:
    print(analog_read())
    time.sleep(1)
A better approach is to use the Arduino to capture this kind of data and send it to the RPi; the result will be more accurate.
Step 8: Sending all data to the web

The previous Python code was updated to include the new sensors:
Step 9: Sending an alarm tweet

One of the characteristics of IoT is to interact with users automatically. You can program the RPi as a web server to send tweets directly, or use the feature that the ThingSpeak website has (but of course, in the latter case, only messages "triggered" by a condition based on the captured and uploaded data will be sent). Python code:
from twython import Twython
C_KEY = "xxxxxxxxxxxx"
C_SECRET = "yyyyyyyyy"
A_TOKEN = "zzzzzzzzzzzz"
A_SECRET = "wwwwwwwww"
api = Twython(C_KEY, C_SECRET, A_TOKEN, A_SECRET)
api.update_status(status="IoT Capstone Project - Tweet test")
Note that your Twitter account must permit sending tweets from the RPi. Also, you must get the keys from Twitter in order to use the Twython library available for the RPi.
Another simple solution, as explained before, is to send a tweet directly from the website. In this case, the "React" feature of ThingSpeak.com can be used.
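If you tweet locally from the RPi instead of using ThingSpeak's "React", you need the same kind of condition check so the account is not flooded on every sensor cycle. A hypothetical sketch of such a trigger (the 30 °C threshold is only an example; pair it with the Twython call above):

```python
def should_alarm(temp_c, threshold_c=30.0, already_alarmed=False):
    # Fire once when the temperature crosses the threshold, then stay
    # quiet until the caller resets already_alarmed (temp back below it).
    return temp_c >= threshold_c and not already_alarmed
```

In the main loop you would track `already_alarmed` yourself, calling `api.update_status(...)` only when `should_alarm(...)` returns True.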
Step 10: Connecting the RPi and the Arduino using Serial Communication

The Arduino used was the Nano, which is as powerful as the UNO but in a smaller form factor. For test purposes, a potentiometer was connected to Arduino analog port A0 and its value transmitted via serial to the RPi. In the test, the RPi reads the keyboard (connected to it or through VNC) and, depending on the command, the Arduino LED is toggled ON/OFF.
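The RPi side of a test like this can be sketched as follows (assumptions: pySerial is installed, the Nano enumerates as /dev/ttyUSB0, and the one-character "1"/"0" protocol is illustrative, not the exact one used in the attached files):

```python
def key_to_command(key):
    # Hypothetical one-character protocol: 'o' -> LED ON, 'f' -> LED OFF
    mapping = {"o": b"1", "f": b"0"}
    return mapping.get(key.lower())

if __name__ == "__main__":
    import serial  # pySerial
    # Port name is an assumption: a Nano usually shows up as /dev/ttyUSB0 on the RPi
    link = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)
    while True:
        cmd = key_to_command(input("LED (o=on, f=off): "))
        if cmd:
            link.write(cmd)
```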
Below are the Arduino and Python codes used in the tests:
Step 11: Sending Arduino data to the Web

Once the Arduino is connected to the RPi, additional data captured by the Arduino can also be sent to the web, together with the data captured by the DHT11 and the LDR. The Python code used to send data to the website was changed to also include the data captured by the Arduino (the potentiometer value).
Step 12: Testing the Rover motors

At this point, the Rover starts to be assembled. I decided to disassemble all the sensors and start the "Arduino phase" from zero. Once the Rover was properly working, the RPi and the sensors were reassembled.
For motors, 2 continuous-rotation servos (SM-S4303R) were used. These servos run at a speed depending on the pulse width received on their data input.
- For this servo, the pulse width goes from 1.0 ms to 2.0 ms (other servos may work with different pulse widths).
- A pulse of 1.5 ms will position the servo at the neutral position, i.e. "stopped".
- A pulse of 1.0 ms will command the servo to full speed (around 70 RPM) in one direction, and 2.0 ms to full speed in the opposite direction.
- Pulses between 1.0 and 1.5 ms, or between 1.5 and 2.0 ms, will generate proportional speeds.
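The proportional-speed rule in the bullets above fits in one small function. A Python sketch of the arithmetic (the same mapping can be done on the Arduino side before calling writeMicroseconds()):

```python
def speed_to_pulse_us(speed):
    # speed: -1.0 (full reverse) .. 0.0 (stop) .. +1.0 (full forward),
    # mapped linearly onto the SM-S4303R's 1.0 ms .. 2.0 ms pulse range.
    speed = max(-1.0, min(1.0, speed))  # clamp out-of-range commands
    return int(round(1500 + speed * 500))
```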
The first thing that must be done is to send a 1,500 µs (1.5 ms) pulse to verify that the motors are "stopped". If not, the servos must be adjusted to full stop (look for the yellow bolt below the servo). If your servo does not have this adjustment, change the "1500" value in the code until you get a full stop.
#include <Servo.h>

Servo leftServo;
Servo rightServo;

void setup()
{
  leftServo.attach(6);
  rightServo.attach(5);
  leftServo.writeMicroseconds(1500);   // neutral: servo stopped
  rightServo.writeMicroseconds(1500);
}

void loop()
{
}
The code below can be used for a complete Rover motor test (forward, backward, full stop, turn left, turn right). If necessary, adjust the delays for the required turn angle depending on your motors (also, sometimes the left and right pulse values should be slightly different, to compensate for any lack of balance between the motors).
Step 13: Assembling the Rover structure and adding Remote Control

First of all, note that the robot or "Rover" is a prototype for educational purposes and was built with elastic bands, wood, and clips joining the original parts. It is very simple, but works fine for what it is intended.
For the Remote Control, an Android device was chosen, because it is very easy to develop an app using MIT App Inventor 2. For this project, I developed the app shown in the photo.
On the Arduino, an HC-06 Bluetooth module was used. If you need more information about how to use this module, please see my tutorial:
Connecting "stuff" via Bluetooth / Android / Arduino
The complete Arduino code (the previous one + BT) is available in the file below (do not forget that the 3 files must be inside a single folder):
Step 14: The Android app

The app is very simple. It has:
- 5 buttons for direction control (FW, BW, Left, Right, Stop). When one of those buttons is pressed, a character is sent via Bluetooth to the HC-06 and the command is executed by the Arduino.
- 1 slider for camera movement. A numeric value from 0 to 100 is sent. This value is "mapped" by the Arduino to move the camera within the servo angle range (in my case, roughly 20° to 160°).
- PiCam IP address input. Use the button to store it.
- Send/receive text to the Arduino (use the "paper plane" button to send it).
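The slider-to-angle mapping described above mirrors Arduino's map() function. A Python sketch of the same arithmetic (the 20-160 range comes from my servo; adjust it for yours):

```python
def slider_to_angle(value, out_min=20, out_max=160):
    # Same linear interpolation as Arduino's map(value, 0, 100, out_min, out_max)
    value = max(0, min(100, value))  # clamp the 0..100 slider input
    return out_min + (out_max - out_min) * value // 100
```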
Below are the .aia file, if you are familiar with MIT App Inventor 2, and the .apk file, if you only want to install and run the app on your phone.
Step 15: Adding obstacle avoidance

For obstacle avoidance, an ultrasonic sensor (HC-SR04) will be used. The sensor will be mounted on a 180° servo motor, in order to increase the area to be searched. Note that the servo will also be used as a base for the Pi-Camera, so the user can have a bigger view of the area being searched. A slider on the Android app will control the camera angle.
The sensor works by sending a sound pulse from the trigger pin (2 µs LOW; 10 µs HIGH) and registering how many microseconds the reflection of the pulse takes to return to the echo pin (remember that sound travels at around 340 m/s). The function "int distMeter()" is used for this calculation.
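The echo-time-to-distance arithmetic works out as follows (a Python sketch only; the actual distMeter() implementation is in the attached Arduino files):

```python
def echo_to_cm(echo_us):
    # Sound travels ~340 m/s = 0.0340 cm/us; the echo covers the
    # distance twice (out and back), hence the division by 2.
    return echo_us * 0.0340 / 2.0
```

For example, an echo of about 1,176 µs corresponds to the 20 cm stop threshold used in the next step.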
If an obstacle is found at 20 cm (at the front), the Rover will stop, turn ON the LED, and run back a few centimeters. The video shows the tests with the Rover. The complete Arduino code (the previous one + obstacle avoidance and search servo control) is available in the files:
Step 16: Final integration RPi + Arduino

At this stage, all the individual parts were tested and are functional. Now both the RPi and the Arduino must be integrated. Note that I only included the LDR that measures light intensity (not the on/off one).
First, run the Pi-Cam Python program in the flask directory to test the camera: sudo python app.py
Check the sensor values (heat the sensor, cover the light sensor, etc.). See the results on the monitor and on the website:
sudo python /home/pi/Desktop/Iot_capstone/iot_temp_hum_light_pot_ardu.py
Once you can see the camera video, use CTRL+C to free the monitor and enter the main Python code (the same one used at Step 11). Run the Arduino sketch. Move the Rover with the Android app and check the video and the sensor values. Check that the Rover stops at an obstacle. Monitor the website and see the environmental data being continuously displayed. The video shows the complete prototype being controlled by the Android app, capturing sensor data and showing it on the internet. Below are the final Arduino codes:
Step 17: Mars Rover landed in London!

I am very happy that this tutorial caught the attention of The MagPi, the official Raspberry Pi magazine, which published an interview with me about the project in its May issue (my name was quoted wrong: "Marcio" instead of "Marcelo").
The full magazine can be downloaded as a PDF from the link: MagPi edition 45
Step 18: Conclusion

That is it! If everything works properly, your "Mars Rover emulator" is ready!
Congratulations!
In the future, it would be interesting to add some features to this project:
- Guide the Rover over internet.
- Addition of a robotic arm, so the Rover could do some mechanical work, such as obstacle removal, sample collection, etc.
- Solar panel for power supply.
So, we can really have a more realistic "Mars Rover emulation". Thanks, and I hope this project can help others learn about Raspberry Pi, IoT, Arduino, robots, etc. The complete, updated code can be found at GitHub. For more tutorials and tips, please visit my blog.