You Will Be Floored By This

The modular floor-based interface called Flexel uses ML to enable advanced interactions like foot gesture recognition and touch sensing.

Nick Bild
3 years ago • Home Automation
Person localization with Flexel smart flooring (📷: T. Yoshida et al.)

Sometimes it seems like the most obvious thing is the easiest to overlook. Consider the design of a sensing system for a smart home that tracks the locations of people and objects: what types of sensors immediately come to mind? Cameras are probably high on your list, or perhaps radar for those into the more esoteric options. But did you think about the floor right under your feet? Adding load sensors under the flooring would provide the raw data needed to determine these locations, and it would do so in a completely transparent way. Such a solution would also avoid the many privacy-related issues that come with a camera-based approach.

While this is not a very common way of interacting with smart devices, implementations have been created in the past. Floor-based sensing remains uncommon largely because of cost and installation effort: accurate measurements typically require a dense array of sensors beneath the flooring, which means a lot of hardware to purchase and a lot of time spent installing it. That may not need to be the case moving forward, however, thanks to recent work done by engineers at The University of Tokyo. They have developed a modular floor interface, called Flexel, for room-scale tactile sensing that requires only about 1% of the sensors used by other floor-based sensing systems.

Beyond localizing humans and objects in a room, the team also wanted to enable more advanced functionality, like foot gesture recognition, footprint tracking, and sensing when a user touches objects. With this goal in mind, they selected twenty off-the-shelf flooring modules of the sort used in building a raised floor, enough to cover a sixteen-square-foot area. Each module was instrumented with 36 load sensors and a custom PCB containing an Arm Cortex-M7 processor clocked at 600 MHz. The processor aggregates data and transfers it to a host PC via a USB connection for further analysis.
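To give a feel for what that data path might look like, here is a minimal host-side sketch that reads one aggregated frame of load readings over USB serial. The port name, comma-separated framing, and per-module sensor layout are assumptions for illustration, not details taken from the Flexel paper.

```python
# Minimal host-side sketch (assumption: the board streams one line of
# comma-separated load readings per frame over USB serial).
import numpy as np
import serial

PORT = "/dev/ttyACM0"      # hypothetical device path for the Cortex-M7 board
SENSORS_PER_MODULE = 36    # per the article: 36 load sensors per module

def read_frame(ser: serial.Serial) -> np.ndarray:
    """Read one frame of raw load values and reshape it into a 6x6 module grid."""
    line = ser.readline().decode("ascii", errors="ignore").strip()
    values = [float(v) for v in line.split(",") if v]
    if len(values) != SENSORS_PER_MODULE:
        raise ValueError(f"expected {SENSORS_PER_MODULE} readings, got {len(values)}")
    return np.array(values).reshape(6, 6)

if __name__ == "__main__":
    with serial.Serial(PORT, baudrate=115200, timeout=1.0) as ser:
        frame = read_frame(ser)
        print("module load map:\n", frame)
```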

By incorporating machine learning into their analysis pipeline, the team was able to achieve high-accuracy measurements and advanced sensing, like gesture recognition, with a small fraction of the sensors typically required. Data collected from the load sensors was used to train a support vector machine, which proved to be highly accurate. The model trained to recognize which object locations users touched, for example, was accurate in 97% of cases on average.
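As a rough illustration of that pipeline, the sketch below trains a support vector machine on flattened per-frame load readings using scikit-learn. The synthetic data, labels, and hyperparameters are placeholders; the team's actual features and training setup may differ.

```python
# Illustrative only: train an SVM classifier on placeholder "load frame" data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_frames, n_sensors = 500, 720               # e.g. 20 modules x 36 load sensors
X = rng.normal(size=(n_frames, n_sensors))   # placeholder load readings
y = rng.integers(0, 4, size=n_frames)        # placeholder touched-object labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Scale the raw load values, then fit an RBF-kernel support vector machine.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```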

With their prototype system, the engineers demonstrated several scenarios in which they believe Flexel could be useful. Using the footprint tracking feature, they showed how people passing through a particular location can be counted; such a capability could be useful in assisted living facilities to keep tabs on residents. Gesture recognition was also showcased, which has applications ranging from gaming to fall detection systems. Another interesting demonstration, focused on object localization, showed just how sensitive the system can be: Flexel can even detect when a cup of coffee is placed on top of a table.
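The people-counting demo boils down to watching a region of the floor for footprint events. The toy sketch below counts separate passes through a rectangular zone; the event format, coordinates, and time-gap heuristic are all assumptions, not Flexel's actual tracking logic.

```python
# Toy sketch of the people-counting idea: count separate passes through a
# watched rectangular region of the floor. The Footprint event format and the
# time-gap heuristic are hypothetical, not taken from the Flexel system.
from dataclasses import dataclass

@dataclass
class Footprint:
    x: float  # floor coordinates (assumed meters)
    y: float
    t: float  # timestamp in seconds

def count_passes(footprints, region, min_gap_s=1.0):
    """Footprints inside the region separated by more than min_gap_s seconds
    are treated as distinct passes (e.g. two people crossing a doorway)."""
    x_min, x_max, y_min, y_max = region
    passes, last_t = 0, None
    for fp in sorted(footprints, key=lambda f: f.t):
        if x_min <= fp.x <= x_max and y_min <= fp.y <= y_max:
            if last_t is None or fp.t - last_t > min_gap_s:
                passes += 1
            last_t = fp.t
    return passes

events = [Footprint(0.4, 1.1, 0.0), Footprint(0.5, 1.2, 0.4), Footprint(0.6, 1.1, 3.0)]
print(count_passes(events, (0.0, 1.0, 1.0, 1.5)))  # -> 2 separate passes
```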

Flexel represents an important step towards always-available, unobtrusive room-scale interfaces that are economical and practical. There is more work yet to be done, however. For one thing, the team wants to explore how they could also measure horizontal user movements, as Flexel is presently only capable of detecting vertical motion. With any luck, these methods will continue to be refined until floor-based interfaces are a common fixture in our homes and public spaces.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.