
Braille-tip Turns a Pen Into a Tactile Reading Machine, Designed to Help Teach Braille

Run the Braille-tip across a line of Braille and it will transform the text into spoken words, making it easier for people to learn Braille independently.

Researchers at the University of Bristol have developed a pen that works in reverse: rather than writing, it offers a sense of touch fine enough to read Braille, using a sensor system built in a 3D-printed mold.

"This device, Braille-tip, was designed to aid people's ability to learn independently, and will hopefully form part of the solution to increasing Braille literacy and allow people to reap the benefits of reading and writing," says lead author George Jenkinson of his team's work. "I used the handheld device to read multiple passages of Braille, and analyzed how accurately it could process the tactile cues (Braille bumps) into English text."

Braille, a tactile writing system developed by Louis Braille in the 1820s, uses patterns of raised bumps to encode text for the blind and partially sighted. In addition to printed books and magazines, Braille is found in signage and on product labels, and can even serve as a computer interface through electromechanical Braille displays. The problem: it can be tricky to learn, requiring the user to differentiate between small bump patterns using only the tip of their finger.

The Braille-tip is designed to help those learning Braille do so independently. Using a camera-based microfluidic sensor system, built from silicone and tubing using a 3D-printed mold and attached to a pen, the Braille-tip is sensitive enough to pick up the bumps of Braille writing and transfer the signals to a computer for translation into spoken letters and words. In its prototype form, it proved capable of distinguishing 52 Braille dots with an 84.5 percent success rate — which Jenkinson is confident can be boosted in the future.
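The final translation step is, at its core, a lookup problem: in Grade 1 (uncontracted) Braille, each letter is a pattern of up to six dots, numbered 1-3 down the left column and 4-6 down the right. The sketch below is a simplified Python illustration of that mapping, assuming the sensor stage has already reported which dots of a cell are raised; it is not the team's actual algorithm, which also has to cope with how the pen is swept across the page.

    # Illustrative sketch only: decoding a single Grade 1 Braille cell.
    # The ten base patterns (dots 1, 2, 4, 5 only) give the letters a-j;
    # adding dot 3 gives k-t, adding dots 3 and 6 gives u-z, with w
    # (dots 2-4-5-6) as the historical exception.
    BASE_INDEX = {
        frozenset({1}): 0,          frozenset({1, 2}): 1,
        frozenset({1, 4}): 2,       frozenset({1, 4, 5}): 3,
        frozenset({1, 5}): 4,       frozenset({1, 2, 4}): 5,
        frozenset({1, 2, 4, 5}): 6, frozenset({1, 2, 5}): 7,
        frozenset({2, 4}): 8,       frozenset({2, 4, 5}): 9,
    }
    DECADE1 = "abcdefghij"
    DECADE2 = "klmnopqrst"
    DECADE3 = "uvxyz"  # only the first five base patterns occur here

    def decode_cell(dots: set[int]) -> str:
        """Map one detected cell (a set of raised dot numbers) to a letter."""
        cell = frozenset(dots)
        if cell == frozenset({2, 4, 5, 6}):
            return "w"
        if 3 in cell and 6 in cell:
            base = cell - {3, 6}
            if base in BASE_INDEX and BASE_INDEX[base] < 5:
                return DECADE3[BASE_INDEX[base]]
        elif 3 in cell:
            base = cell - {3}
            if base in BASE_INDEX:
                return DECADE2[BASE_INDEX[base]]
        elif cell in BASE_INDEX:
            return DECADE1[BASE_INDEX[cell]]
        return "?"  # unrecognized pattern, e.g. a misread cell

    # Example: {1, 3} -> "k", {1, 3, 6} -> "u", {2, 4, 5, 6} -> "w"
    print(decode_cell({1, 3}), decode_cell({1, 3, 6}), decode_cell({2, 4, 5, 6}))

In the real device, the hard part is everything before this lookup: reliably sensing which bumps are present while the pen moves at whatever speed and angle the user happens to hold it.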

"The pattern of the errors suggest that they came from the way the device was held and operated, suggesting that the algorithm and sensor are likely to be able to reach much higher accuracy close to 100 percent if the design is improved," Jenkinson claims. "As soon as possible, the device should be tested with participants, and prototypes should be made available to the intended end-users so that their desires and the potential use for such a device can be assessed in earnest. “A co-design approach that involves users is much more likely to have a positive real-world impact than an approach siloed in the laboratory."

The team's work has been presented at the IEEE EMB 10th International Conference on Biomedical Robotics and Biomechatronics; a copy can be requested from the University of Bristol, while the project's source code and STL files have been made available on GitHub under an unspecified license.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.