3D-Printed MechanoBeat Tags Monitor Interactions with No Batteries, Chips, or Electronic Parts
Designed to reflect oscillating patterns back to an ultra-wideband radar hooked up to a Raspberry Pi, each tag requires no electricity to operate.
Researchers at the Universities of Massachusetts and Maryland have presented a method of 3D printing smart sensors for tracking user interaction with everyday objects; the sensors can be created on a commodity printer and work without batteries or electronic components.
"We present MechanoBeat, a 3D printed mechanical tag that oscillates at a unique frequency upon user interaction. With the help of an ultra-wideband (UWB) radar array, MechanoBeat can unobtrusively monitor interactions with both stationary and mobile objects."
"MechanoBeat consists of small, scalable, and easy-to-install tags that do not require any batteries, silicon chips, or electronic components. Tags can be produced using commodity desktop 3D printers with cheap materials. We develop an efficient signal processing and deep learning method to locate and identify tags using only the signals reflected from the tag vibrations."
The MechanoBeat tags themselves are harmonic oscillators and come in two variants: a stationary tag designed to be fitted to a drawer, door, or cabinet to monitor interaction, and a mobile tag that can be attached to portable objects such as pill containers, water bottles, and sugar jars to track interactions with them.
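Because each tag is designed around its own resonant frequency, the basic idea can be illustrated with simple spectral peak detection. The Python sketch below is purely illustrative and is not the team's published pipeline: the sampling rate, tag names, tag frequencies, and synthetic signal are all assumptions.

```python
# Illustrative sketch only: guessing which tag was touched from the dominant
# oscillation frequency in a reflected radar signal. All frequencies and tag
# names here are hypothetical, not values from the MechanoBeat paper.
import numpy as np

SAMPLE_RATE_HZ = 200            # assumed slow-time sampling rate of the radar
TAG_FREQUENCIES_HZ = {          # hypothetical unique resonant frequency per tag
    "drawer": 12.0,
    "pill_bottle": 17.0,
    "sugar_jar": 23.0,
}

def identify_tag(slow_time_signal: np.ndarray) -> str:
    """Return the tag whose design frequency is closest to the strongest
    spectral peak in the reflected signal."""
    windowed = slow_time_signal * np.hanning(len(slow_time_signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(slow_time_signal), d=1.0 / SAMPLE_RATE_HZ)
    # Suppress DC and very-low-frequency clutter from static reflections.
    spectrum[freqs < 5.0] = 0.0
    peak_freq = freqs[np.argmax(spectrum)]
    return min(TAG_FREQUENCIES_HZ,
               key=lambda tag: abs(TAG_FREQUENCIES_HZ[tag] - peak_freq))

# Example: a synthetic two-second return dominated by a 17 Hz vibration.
t = np.arange(0, 2.0, 1.0 / SAMPLE_RATE_HZ)
fake_return = 0.8 * np.sin(2 * np.pi * 17.0 * t) + 0.1 * np.random.randn(len(t))
print(identify_tag(fake_return))  # -> "pill_bottle"
```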
The tags communicate with an ultra-wideband (UWB) radar sensor, the PulseON 440, connected to a host built from a Raspberry Pi single-board computer with an external hard drive for data storage. The radar detects the motion of the pendulum in each tag and, using TensorFlow machine learning, categorizes the interaction type.
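As a rough illustration of that classification stage, the sketch below shows a small TensorFlow model of the kind such a host could run over windows of the radar signal; the architecture, window length, and class labels are assumptions rather than the team's published network.

```python
# Hedged sketch of a TensorFlow classifier for radar signal windows.
# Layer sizes, window length, and the number of classes are assumptions.
import tensorflow as tf

WINDOW_SAMPLES = 400   # assumed 2 s window at a 200 Hz slow-time sampling rate
NUM_CLASSES = 4        # e.g. drawer, pill bottle, sugar jar, no interaction

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW_SAMPLES, 1)),
    tf.keras.layers.Conv1D(16, kernel_size=9, activation="relu"),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.Conv1D(32, kernel_size=9, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(windows, labels, ...) would be trained on labelled radar
# recordings before running live classification on the Raspberry Pi host.
```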
"MechanoBeat is capable of detecting simultaneous interactions with high accuracy, even in noisy environments," the researchers find. "We leverage UWB radar signals high penetration property to sense interactions behind walls in a non-line-of-sight (NLOS) scenario."
The team's work has been presented at the ACM Symposium on User Interface Software and Technology 2020 (UIST'20), and is available to download under open access terms from the ACM Digital Library.