Demo instructions:
https://github.com/StevenSchrembeck/puppeteer
I believe VR and AR will begin to take center stage in our digital experience. It will be common to meet people in a virtual space, whether for entertainment, work, or activities we usually associate with the physical world, like shopping and sports.
Even in its current, early state, there is a truckload of haptic gadgets intended to immerse us more deeply in our digital worlds.
One sense that's missing from these early experiences is a sense of where your body is and what it's doing. This sense is called "proprioception". It's how you know where your hands are even when you're not looking. Our bodies use a number of cues to generate this intuitive sense, but many of those inputs are missing in VR.
Using the external brain interface, Neosensory Buzz, we can pipe this missing data to your brain (albeit indirectly, through your skin nerves) and create a more complete sense of proprioception in a virtual world.
But what does that mean, exactly?
When you put on a VR headset, it means we can give you an intuitive sense of where your hands are (though we could do this for any body part, Puppeteer focuses on hands initially) -- even when they aren't in view. But, as we'll see, that's only scratching the surface of a concept I'm calling:
synthetic proprioception
How does Puppeteer work?
Puppeteer is two pieces of hardware and a software library:
1. The Ultraleap Leap Motion hand tracker
2. The Neosensory Buzz wristband
The software library takes live hand tracking data (shown below), transforms it with a special encoding algorithm, then pipes the result to the Buzz as vibrations at about 15 frames per second.
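For the curious, here's a minimal sketch of what that loop might look like in Python. The `tracker`, `encoder`, and `buzz` objects and their methods are hypothetical stand-ins -- the real library's names and interfaces differ -- but the shape of the pipeline is the same: read a frame, encode it, send it, repeat at roughly 15 fps.

```python
import time

FRAME_RATE = 15  # approximate frames per second sent to the Buzz

def run_pipeline(tracker, buzz, encoder):
    """Poll the hand tracker, encode each frame, and stream it to the Buzz.

    `tracker`, `buzz`, and `encoder` are placeholders for whatever objects
    wrap the Leap Motion SDK, the Buzz connection, and the encoding step
    in the actual library.
    """
    frame_interval = 1.0 / FRAME_RATE
    while True:
        start = time.time()

        joints = tracker.read_hand_joints()        # hypothetical: raw 3D joint positions
        if joints is not None:
            motor_levels = encoder.encode(joints)  # hypothetical: reduce to 4 motor intensities
            buzz.vibrate(motor_levels)             # hypothetical: send one vibration frame

        # Sleep off the remainder of the frame budget (~66 ms per frame at 15 fps)
        elapsed = time.time() - start
        time.sleep(max(0.0, frame_interval - elapsed))
```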
If you're running the demo, you can try it without a hand tracker by switching to "gesture mode", which replays pre-recorded gestures (like the thumbs up seen below).
You'll be able to feel the vibration pattern unique to that hand gesture. I won't go into the details here, but the PCA encoding has very little reconstruction error, meaning very little hand position detail is lost in the compressed signal it sends.
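As a rough illustration of the idea (not the exact code in the repo), here's how a PCA encoder could squeeze a full hand frame down to one value per motor using scikit-learn, plus a reconstruction-error check. The frame layout, the scaling constants, and the function names are all assumptions made for this sketch.

```python
import numpy as np
from sklearn.decomposition import PCA

# Assume each hand frame is flattened into a vector of joint coordinates,
# e.g. ~21 joints x 3 axes = 63 values. The exact layout in Puppeteer may differ.
N_MOTORS = 4          # the Buzz has four vibration motors
MAX_INTENSITY = 255   # each motor accepts an intensity from 0 to 255

def fit_encoder(recorded_frames: np.ndarray) -> PCA:
    """Fit a PCA that compresses hand frames down to one value per motor."""
    return PCA(n_components=N_MOTORS).fit(recorded_frames)

def encode_frame(pca: PCA, frame: np.ndarray) -> list[int]:
    """Project one hand frame onto the principal components and rescale
    each component into the 0-255 intensity range the Buzz expects."""
    components = pca.transform(frame.reshape(1, -1))[0]
    # Rescale assuming components roughly span [-3, 3]; the real library
    # presumably uses a calibrated scaling instead of this guess.
    scaled = np.clip((components + 3.0) / 6.0, 0.0, 1.0)
    return [int(v * MAX_INTENSITY) for v in scaled]

def reconstruction_error(pca: PCA, frames: np.ndarray) -> float:
    """Mean squared error after projecting frames down to 4 components and
    back again, i.e. how much hand-position detail the encoding loses."""
    reconstructed = pca.inverse_transform(pca.transform(frames))
    return float(np.mean((frames - reconstructed) ** 2))
```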
If you want to train a truly intuitive sense of synthetic proprioception, you'll want to strap the hand tracker to your head somehow. There are fancy VR head mounts for this (Ultraleap tracking is also being built directly into the next generation of Qualcomm VR chips), or you can use my sophisticated DIY approach:
Enhancing immersion of human hands in VR is neat, but not earthshaking. There are potential applications for people with naturally poor proprioception, which might manifest as being exceptionally clumsy or having inaccurate motor control when not staring at one's hands. It's obviously applicable to those with prosthetic appendages as well.
Admittedly, this incarnation of the project is not well-suited for these problems. The setup is clunky and wired. Anyone except the most afflicted people would find this augmented proprioception not worth the effort.
While Puppeteer sends regular human hand data to the Buzz, there's no reason why it has to be limited in that way. It was simply a convenient way for me to experiment. In my opinion, the real prize is non-human synthetic proprioception.
What if you don’t have human hands in a digital world? You might have crab claws, or a dozen spidery fingers, or even four arms and a tail. What would it be like to intuitively know where your tail is, in the same way that you know where your hands are now?
When your digital body doesn't match your physical body, synthetic proprioception will be required to maximize immersion and intuitive control. It could also be used to enhance vehicle and tool control in the real world. There's some indication that's how we use tools today: your brain models a long stick as an extension of your arm, to an extent.
There's no reason why our digital avatars need to mirror our real life bodies. Synthetic proprioception will be a critical sense when that happens.
Lastly, I can imagine a world in which instructional videos are enhanced with a new channel (beyond sound and images) for limb and finger position. How much easier would it be to repair a car, learn to embroider, or practice a dance move if you could simply be sent an intuitive sense of where your limbs and fingers should go?
Certainly a lot easier than just watching the video.
Where Puppeteer goes from here
There are a number of hurdles to getting Puppeteer into the real world as an actual, usable product (beyond early-adopting hackers like you).
1. Training time
This is a problem shared by every new sense trained through vibration. Puppeteer can be trained passively just by going about your day (as long as you're near a computer, since this is wired right now) and running the software with your headband and Buzz on.
2. Hand tracking accuracy
The Ultraleap Leap Motion controller is good. Quite good, compared to most other solutions, but it still has a lot of problems. When fingers are occluded, or move in and out of the visible field rapidly, there are bizarre tracking issues. Still, it's definitely good enough for everything except fine motor tracking.
3. Hardware cost
VR hardware + BCI haptics is a pretty large expense. The initial customers will need to be VR enthusiasts who already invest heavily in other immersion enhancing gadgets.
4. Resolution
The Buzz has only 4 motors and 255 vibration intensities. This is enough to map the most overt hand and finger position data, but not enough for fine detail or for including other body parts.
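Concretely, every frame the Buzz receives boils down to four small integer intensities. A hypothetical quantization step might look like the sketch below (the real SDK call and wrapping will differ, and the 0-255-per-motor assumption is mine):

```python
# A single vibration "frame" for the Buzz: one intensity per motor, assumed 0-255.
# Four coarse channels are plenty for overt hand pose, but fine detail
# (individual fingertip positions, or extra body parts) would need either
# more motors or clever time-multiplexing across frames.
def quantize_frame(values: list[float]) -> list[int]:
    """Clamp four continuous encoder outputs in [0, 1] to motor intensities."""
    assert len(values) == 4, "the Buzz exposes exactly four motors"
    return [min(255, max(0, round(v * 255))) for v in values]

print(quantize_frame([0.0, 0.25, 0.9, 1.2]))  # -> [0, 64, 230, 255]
```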
All that said, synthetic proprioception has a lot going for it. It's actually easier to send virtual point cloud data (whether it's your hand, tail, or crab claw) than to track a real-world hand and map it back into the digital world. Given that most games already have this positional data available, integrating this hardware is not a big ask of game developers or VR software developers.
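To make that concrete, here's a hedged sketch of how a game could hand its existing skeleton data to the same kind of encoder used for real hands. The joint names and the `avatar_to_frame` helper are made up for illustration; any engine's own skeleton representation would work.

```python
import numpy as np

def avatar_to_frame(joint_positions: dict[str, tuple[float, float, float]]) -> np.ndarray:
    """Flatten a virtual limb's joint positions into the same kind of vector
    the hand-tracking path produces, so an identical PCA-style encoder
    (see the earlier sketch) can drive the Buzz.

    `joint_positions` is whatever skeleton the game already maintains --
    a tail, a claw, or an extra arm -- keyed by joint name."""
    ordered = sorted(joint_positions.items())  # stable joint ordering
    return np.array([c for _, xyz in ordered for c in xyz], dtype=np.float32)

# Hypothetical example: a three-joint tail in the game's coordinate space.
tail = {
    "tail_base": (0.0, 1.0, -0.2),
    "tail_mid":  (0.0, 1.1, -0.5),
    "tail_tip":  (0.1, 1.3, -0.8),
}
frame = avatar_to_frame(tail)  # feed into the encoder exactly like a hand frame
```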
As more and better tracking software comes prepackaged with VR hardware, the task of synthetic proprioception becomes easier and easier. There's every reason to believe this problem will get significantly easier with each passing month.
I see this as an integration play, rather than standalone software. I also sense that the time isn't right, just yet, for the reasons above. Now's the time to experiment and understand what's possible. In just a few years there may be a niche market for this kind of novel sense.