What happens when you deploy Visionect's stack to the Raspberry Pi? You get a WebKit instance on a server that renders websites and sends the resulting images to simple e-paper devices over the air. The device is just the interface; the server does all the heavy lifting.
Can the Raspberry Pi handle it?
Visionect's frontend wizard, Matevz, decided to find out. He found Raspberry Pi boards inside Cubesensors, mixed two of the most interesting Slovenian hardware startups together, and barely broke a sweat doing it.
Cubesensors
Cubesensors won a Launch award last year because they are small, cute, and awesome: a tiny plastic cube with a Raspberry Pi board that you put on a table to help make the room a better place.
There's a bunch of sensors in there - humidity, noise, air pollution, vibration, everything you can think of. Normally these cubes create a mesh network using ZigBee and talk to an app on your phone to say "Hey, you know what, maybe you should open a window."
Matevz hacked one of them to run Visionect's server software, become a Wi-Fi access point, and report straight to the V tablet without any need for internet or extra devices.
V tablets are especially well suited for this purpose since they use the awesome e-paper display from E Ink (just like your ebook reader). E-paper offers great sunlight readability at very low power consumption, which means you can mount the device on a sunny porch and run it on battery power for weeks.
Result: a self-contained Visionect stack on a Raspberry Pi with a bunch of sensors for environmental data, all displayed on the e-paper V Tablet.
Visionect+Cubesensors+node.js == win
Contrary to what you might expect, shoving a server that usually takes a beefy processor and lots of RAM onto the 700MHz Raspberry Pi was completely uneventful. It just worked.
- install some dependencies
- make deb packages for the Visionect bits
- use gvm to update Go and get the admin panel working
- write a log parser for sensor logs
- make a simple static file server
- write some code to display data
- hook up the tablets
And that's it. That's all it took. Imagine my disappointment when I was expecting a juicy tale of trial and error, of heroic hacking and triumph. Everything just ... worked.
Matevz says the hardest part was creating those deb packages, but you don't have to do that anymore. You can just run sudo apt-get install koala.
If you ever do need to make a deb package, Matevz recommends this seven-step guide. He says it was the least painful.
Parsing log files
Because there's still no official API, Matevz had to do some inventive hacking: parsing ZigBee logs.
When Cubesensors talk to each other, they leave a trail in /var/log/ziggy.stdout.log. I'm not sure why, but it's useful when you want to read the sensors every ten seconds.
Logs look like this:
Parsing those into a useful form is a simple matter of using node's Tail package and applying a regex to every line.
Simple.
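The real ziggy log format isn't reproduced above, so here's a minimal sketch of the tail-and-regex pattern. The line format, field names, and `parseLine` are all assumptions for illustration, not Matevz's actual code:

```javascript
// Hypothetical log line format -- the real ziggy.stdout.log lines will
// differ, so adjust the regex to match what you actually see.
// Example assumed line: "2014-01-20 12:00:01 cube=A1 temp=21.3 hum=45"
const LINE_RE = /^(\S+ \S+) cube=(\w+) temp=([\d.]+) hum=([\d.]+)$/;

function parseLine(line) {
  const m = LINE_RE.exec(line);
  if (!m) return null; // skip lines that aren't sensor readings
  return {
    time: m[1],
    cube: m[2],
    temperature: parseFloat(m[3]),
    humidity: parseFloat(m[4]),
  };
}

// With the `tail` npm package, the same function plugs straight into
// the log stream:
//   const { Tail } = require('tail');
//   new Tail('/var/log/ziggy.stdout.log').on('line', (line) => {
//     const reading = parseLine(line);
//     if (reading) latest[reading.cube] = reading;
//   });
```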
Displaying the data
Once your data is in a friendly JSON format, you have to slap together a simple dashboard-like app for the V tablet.
You'll need a static server first, but you're already using node.js to parse logs, so a server is just a node-static away. Or you can do it the old-fashioned manual way like Matevz did.
But he's a frontend guy, so we forgive him :)
Most of the interface is just a bunch of logic for selecting which cube to look at and going back from a specific display. fetchData, which fetches data and displays it, is far more interesting.
It's called every ten seconds and looks like this:
Mmmmm, maths. And magic numbers. To be honest, I'm not at all certain how those conversions work or what any of the numbers mean, but the end result is a human readable display of temperature, noise, humidity, light, air pressure, and air quality in a room.
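Matevz's actual fetchData, conversion maths and all, lives in the linked source; this is just a sketch of the same poll-and-display shape. `formatReading`, `/data.json`, and the `display` element are hypothetical names, not his real ones:

```javascript
// Sketch of a fetchData-style poller. The real code does unit
// conversions with magic numbers; here the reading is assumed to
// already be in human units.
function formatReading(r) {
  return r.cube + ': ' + r.temperature.toFixed(1) + '\u00b0C, ' +
         r.humidity + '% relative humidity';
}

// In the browser, poll every ten seconds and update the dashboard:
//   setInterval(function fetchData() {
//     fetch('/data.json')
//       .then(function (res) { return res.json(); })
//       .then(function (r) {
//         document.getElementById('display').textContent = formatReading(r);
//       });
//   }, 10 * 1000);
```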
You can get the full source code here.
Hooking up the tablets
Time to turn your Raspberry Pi into a Wi-Fi hotspot for the tablets.
You'll need a Wi-Fi dongle, some more deb packages, and a bit of configuration. Matevz used the TP-Link TL-WN722N dongle, but I'm sure you can use whatever's lying around the house - especially if it's based on an Atheros chipset.
Then add deb http://mirrordirector.raspbian.org/raspbian/ wheezy non-free to /etc/apt/sources.list and run apt-get install firmware-atheros isc-dhcp-server hostapd to install all the packages.
After that, some configuration:
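The original config files aren't shown here, so this is a minimal sketch of a typical hostapd + isc-dhcp-server setup, using the 192.168.20.1 address that the admin panel URL implies. The SSID, passphrase, channel, and address range are placeholders - adjust to taste:

```
# /etc/network/interfaces -- give wlan0 a static address
iface wlan0 inet static
    address 192.168.20.1
    netmask 255.255.255.0

# /etc/hostapd/hostapd.conf -- SSID and passphrase are placeholders
interface=wlan0
driver=nl80211
ssid=cubesensors
hw_mode=g
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=changeme123

# /etc/default/isc-dhcp-server -- serve DHCP on the wireless interface
INTERFACES="wlan0"

# /etc/dhcp/dhcpd.conf -- hand out addresses on the same subnet
subnet 192.168.20.0 netmask 255.255.255.0 {
    range 192.168.20.10 192.168.20.50;
    option routers 192.168.20.1;
}
```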
Then tell the Raspberry Pi to turn into an access point every time it boots with two commands:
update-rc.d isc-dhcp-server defaults
update-rc.d hostapd defaults
And voilà. A self-contained system where a bunch of sensors inside a plastic box hooked up to a Raspberry Pi talk to a palm-sized e-paper tablet.
Well, you also have to tell the tablet to connect to this access point. But that's trivial: go into settings and input the right IP addresses. Then don't forget to visit the Visionect admin panel at http://192.168.20.1:8150 and tell it which URL to serve. For the Cubesensors experiment this was http://192.168.20.1:8888.
Do it yourself
Let's recap. To turn your Raspberry Pi into a Visionect server you have to:
- add Visionect's deb packages to sources.list (packages.visionect.si)
- get a Wi-Fi dongle
- convince your Pi to act as an access point (previous section)
- tell the tablet where to connect
And you're done. Matevz made a room sensor display thingy on a V Tablet with an E Ink e-paper display - what are you going to build?
Swizec Teller is a geek with a hat. He's helping us create an ongoing showcase of interesting projects from our community. You should follow him on twitter, here.