This project is a miniaturization of the HTC Vive Tracker. It allows sub-millimeter 3D positioning and embeds a 9DoF IMU with sensor fusion. It is cheaper than other 3D positioning systems and allows tracking many more objects. The entire project is open source; all the materials can be found below, or here: http://HiveTracker.github.io
Challenge
Indoor 3D positioning is a problem that has been fought for a while, but there is still no perfect solution.
The closest available option is probably camera-based motion capture, but it is very expensive and does not scale (above a dozen tracked objects, the system lags).
Solution
After exploring ultrasound and magnetic approaches, it turns out lasers are not too bad!
We made our own board that uses photosensors to piggyback on the HTC Vive lighthouses.
This project gave birth to a first academic publication at the HCI conference Augmented Human; the published details are very useful.
Anyway, here is a demo:
With backgrounds in HCI (human-computer interaction), neuroscience, robotics, and various other engineering fields, we all found the miniature 3D positioning challenge worth solving.
Some of the main applications we are interested in are listed here:
HTC Vive uses two lighthouses that emit laser signals accurate enough to perform sub-millimeter 3D positioning. The system looks like this:
The following GIF summarizes fairly well the principle:
The idea is to measure the times at which the photodiodes are hit by this laser sweep.
Since we know the rotation speed and receive a broadcast sync pulse, we can derive two angles from each lighthouse and estimate our position from the intersection of the resulting rays (see details in the logs).
Our approach is to observe these light signals and estimate our position from them.
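As a rough sketch of this timing-to-position math: a hit time after the sync pulse maps to a sweep angle, and two rays (one per lighthouse) are intersected to get a 3D position. The rotor period, angle convention, and all names below are illustrative assumptions for this sketch, not the project's actual firmware code.

```python
import numpy as np

# Illustrative sweep period in microseconds; the real value depends on the
# lighthouse hardware and which axis is sweeping.
ROTOR_PERIOD_US = 8333.0

def sweep_angle(t_sync_us, t_hit_us):
    """Convert sync-pulse and laser-hit timestamps into a sweep angle (radians).

    The rotor turns at a known constant speed, so the elapsed time since the
    sync pulse is directly proportional to the angle swept.
    """
    return 2.0 * np.pi * (t_hit_us - t_sync_us) / ROTOR_PERIOD_US

def triangulate(p1, d1, p2, d2):
    """Estimate position from two rays p + t*d (d1, d2 are unit vectors).

    Because of noise the rays rarely intersect exactly, so we return the
    midpoint of the shortest segment between them (closest-point method).
    Mapping the two sweep angles of one lighthouse to a ray direction
    depends on the base-station geometry model and is omitted here.
    """
    w0 = p1 - p2
    b = d1 @ d2           # cosine of the angle between the rays
    d = d1 @ w0
    e = d2 @ w0
    denom = 1.0 - b * b   # zero only if the rays are parallel
    t1 = (b * e - d) / denom
    t2 = (e - b * d) / denom
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2.0
```

With two exact rays aimed at the same point, `triangulate` recovers that point; with noisy angles it degrades gracefully to the nearest compromise between the two rays, which is why the closest-point form is a common choice here.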
Trick Alert!
Those signals are far too fast for normal microcontrollers to observe, but we found one (with BLE) that has a very particular feature called PPI, which allows a form of parallel processing, a bit like in an FPGA.
Check out the "miniaturization idea validation log page" to learn more about it.
Who Are We?
This is a research project made with L❤VE by Vigilante Intergalactic Roustabout Scholars from UCL, UPV and Sorbonne. It started in the hackerspaces of San Francisco, Singapore, Shenzhen and Paris, then found a new home in academia, where knowledge can stay public.
In Progress
We're still fighting a few problems, but since we're open sourcing everything, maybe the community can help?
Please contact us (see below) if you are interested in helping; some of the current issues that need some love are listed below. We have solutions to all of them, but more expert hands can't hurt:
- Kalman filtering: we fuse the integrated accelerometer data with the optically obtained 3D position to improve our results. Our proof of concept is getting close to usable; now we need to port it.
- Calibration: when first using the trackers, finding the base stations' positions and orientations is not trivial. We have a procedure, but it could be greatly improved...
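To make the Kalman filtering item concrete, here is a minimal per-axis sketch of that kind of accelerometer/optical fusion: the accelerometer drives the prediction step, and each lighthouse-based position fix corrects it. The class name, noise values, and update rates are illustrative assumptions for this sketch, not the project's actual filter.

```python
import numpy as np

class AxisKalman:
    """1-D constant-velocity Kalman filter for one position axis.

    predict() runs at IMU rate using the (bias-corrected) accelerometer
    reading as a control input; update() runs whenever an optical fix
    arrives. Noise parameters below are illustrative, not tuned values.
    """

    def __init__(self, dt, accel_var=0.5, optic_var=1e-6):
        self.x = np.zeros(2)                           # state: [position, velocity]
        self.P = np.eye(2)                             # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity motion model
        self.B = np.array([0.5 * dt**2, dt])           # how acceleration enters the state
        self.Q = accel_var * np.outer(self.B, self.B)  # process noise from accel noise
        self.H = np.array([[1.0, 0.0]])                # we only measure position
        self.R = np.array([[optic_var]])               # optical measurement noise

    def predict(self, accel):
        """Propagate the state using one accelerometer sample."""
        self.x = self.F @ self.x + self.B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, optical_pos):
        """Correct the state with one lighthouse-based position fix."""
        y = optical_pos - self.H @ self.x              # innovation
        S = self.H @ self.P @ self.H.T + self.R        # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)       # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

One such filter per axis (or a single 6-state 3D version) lets the fast but drifting accelerometer fill the gaps between the slower, drift-free optical fixes; porting this kind of proof of concept to the microcontroller is the remaining step mentioned above.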
- Follow us: https://twitter.com/hive_tracker
- Mailing list: https://groups.google.com/forum/#!forum/hivetracker
- Team email: hivetracker+owners@googlegroups.com