
Body Tracking on a Budget

Using a smartphone, smartwatch, and earbuds, IMUPoser can track a person's full-body position, eliminating the bulk of traditional systems.

Nick Bild
2 years ago · Machine Learning & AI
Estimating full-body position with minimal instrumentation (📷: V. Mollyn et al.)

Full-body motion tracking technologies have many useful applications, from virtual reality and video gaming to sports training and Hollywood movies. But the adoption rate of body tracking technologies has been underwhelming, to say the least. This is due, in large part, to the fact that present implementations are generally cumbersome and impractical for most use cases.

One type of full-body motion tracking technology uses cameras and depth sensors to monitor movements in three dimensions. These systems are highly accurate and can detect even subtle movements, making them suitable for a range of applications. However, their range is limited, and they can only track movements within a certain distance of the sensors. And of course, the hardware needs to be installed wherever it will be used.

Another full-body motion tracking approach uses infrared cameras and markers placed on the body to track movements in real time. It is very accurate and ideal for applications such as sports training and medical rehabilitation. However, these systems are complex and expensive, and they require the user to wear special clothing, making them less accessible to the general public.

But sometimes, high-precision measurements are not needed. And if close is good enough for an application, then recent work by a team at Carnegie Mellon University shows how full-body motion tracking can be achieved with a minimal set of instrumentation — in fact, you may be carrying all of the necessary instrumentation around with you right now.

Using just a smartphone, smartwatch, and wireless earbuds, the team has demonstrated that it is possible to predict a person's body position with reasonable accuracy. With virtually all devices of this sort now coming standard with motion-detecting sensors like accelerometers and gyroscopes, they can provide a wealth of information about the 3D position of their owner.
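To picture what these devices contribute, each one can report an orientation estimate and an acceleration reading at every time step. The sketch below is purely illustrative (the device list, channel order, and dimensions are assumptions, not the team's actual data format); it shows one way a single frame of readings could be packed into a fixed-length vector, with absent devices contributing zeros:

```python
import numpy as np

def frame_features(devices):
    """Pack one time step of IMU readings into a fixed-length vector.

    `devices` maps a device name to (orientation, acceleration), where
    orientation is a 3x3 rotation matrix and acceleration is a 3-vector.
    Devices that are not present at this time step contribute zeros, so
    the vector length stays constant no matter what the user is carrying.
    """
    slots = ["phone", "watch", "earbuds"]   # fixed channel order
    parts = []
    for name in slots:
        if name in devices:
            rot, acc = devices[name]
            parts.append(np.concatenate([np.asarray(rot, float).ravel(),
                                         np.asarray(acc, float)]))
        else:
            parts.append(np.zeros(12))      # 9 (rotation) + 3 (acceleration)
    return np.concatenate(parts)            # shape (36,)

# Example: only the phone is reporting, lying flat and measuring gravity.
feat = frame_features({"phone": (np.eye(3), [0.0, 0.0, 9.8])})
```

Keeping the channel order fixed means a downstream model always sees the same input layout, whether the user is carrying one device or all three.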

But the convenience of this approach does come with some drawbacks. These types of motion sensors are generally of lower quality and produce noisy data. And of course, having just a few points of measurement around the body makes it challenging to predict a full-body position with certainty; camera- and marker-based systems have a clear advantage here.

To translate the sensor measurements into a full-body position prediction, and to help overcome these challenges, the researchers turned to a machine learning-based approach that they call IMUPoser. They developed a two-layer bidirectional LSTM neural network that accepts orientation and acceleration data from the sensors and translates it into a best guess of the user's body position. It was designed to work with varying amounts of information (it can, for example, make predictions using only data from a smartphone), but naturally, the more information is available, the better the predictions will be.
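The sketch below is a minimal, hypothetical PyTorch rendering of that idea, not the team's released code; the hidden size, input layout, and output parameterization are assumptions for illustration. It consumes a window of the per-frame feature vectors described above, with the channels of any missing devices simply zero-filled:

```python
import torch
import torch.nn as nn

class IMUPoseEstimator(nn.Module):
    """Two-layer bidirectional LSTM mapping sparse IMU windows to per-frame pose.

    Illustrative dimensions: 3 devices x (9-D rotation + 3-D acceleration)
    = 36 inputs per frame; 24 joints x 6-D rotation = 144 outputs per frame.
    """
    def __init__(self, in_dim=36, hidden=256, out_dim=144):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, num_layers=2,
                            bidirectional=True, batch_first=True)
        self.head = nn.Linear(2 * hidden, out_dim)

    def forward(self, x):
        # x: (batch, time, in_dim); absent devices are zero-filled channels
        h, _ = self.lstm(x)
        return self.head(h)          # (batch, time, out_dim) pose per frame

# Usage: only the phone is present, so the other channels stay at zero.
model = IMUPoseEstimator()
window = torch.zeros(1, 125, 36)               # a few seconds of frames
window[:, :, 0:12] = torch.randn(1, 125, 12)   # phone channels only
pose = model(window)                           # (1, 125, 144)
```

Because the bidirectional LSTM sees the whole window at once, each frame's pose estimate can draw on both past and future motion, which helps smooth over the noise in consumer-grade sensors.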

The system is designed with sparse data in mind, but even with three measurement points, there is still a lot of missing information to fill in. IMUPoser makes some assumptions to bridge this gap in observations. It assumes, for example, that if the user is walking, the untracked leg will follow a normal walking gait pattern based on the information available from the tracked leg. Such assumptions do not lend themselves to a high degree of certainty in the predictions, but in general, they hold true.

At present, all of the devices need to be part of the same technological ecosystem (e.g., Apple's) for IMUPoser to work correctly, but the team hopes that will not be the case in the future as industry-wide standards become more commonplace. The researchers also note that only the most capable device in use (i.e., the smartphone) can act as the processing unit, which makes that one element indispensable. They see this as a problem that will disappear as other wearables gain more computational resources. In spite of these limitations, there is still a lot of potential in the system, and it will be interesting to see what new full-body tracking applications will be enabled by IMUPoser and methods like it.

The dataset, architecture, trained models, and visualization tools have all been made open source by the team, so feel free to download them if you would like to experiment with full-body tracking on your own.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.