
Blind Man Develops AI-Based 3D Modeling and Printing Workflow

Redditor Mrblindguardian, a blind man, developed an AI-based workflow that lets him design custom 3D models for 3D printing.

Cameron Coward
11 months ago · 3D Printing

I’ve always thought of 3D modeling — and by extension, 3D printing — as a visual medium. While 3D-printed objects are certainly physical, the entire software chain that leads to them exists solely in the digital world. So my assumption was that, unfortunately, this hobby is not viable for people who live with visual impairments. But Redditor Mrblindguardian proved me wrong by developing an AI-based workflow that lets him model and 3D print his own custom designs, such as a one-winged dragon.

Beyond the obvious challenges, this task involves difficulties that our sighted readers may not be aware of. We have language to describe what we see, but those words don’t hold the same meaning for people who have never been able to see.

For example, consider a question posed by William Molyneux in 1688: “Could a blind person, upon suddenly gaining the ability to see, recognize an object by sight that he'd previously known by feel?”

In 2011, researchers at MIT answered that question by testing the premise in the real world with subjects who had received sight-restoration procedures. The results showed that tactile understanding did not carry over to the visual world. This should give you some insight into the challenges Mrblindguardian faced.

His solution is ingenious and takes advantage of AI tools that only recently became available. Mrblindguardian starts by typing out a description of what he thinks a dragon looks like, drawing on descriptions he finds through Google searches. He then uses Luma AI’s Genie service to generate a 3D model from that text.

To verify that the model “looks” right without the ability to see it, Mrblindguardian takes screenshots of the generated 3D model and feeds those to ChatGPT to describe. If the AI-generated description matches his expectations, then he knows that the model looks right—at least to ChatGPT. If it doesn’t, he can refine his Luma AI Genie prompt and repeat that process until the results are satisfactory.
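To give a sense of how that verification step works in practice, here is a minimal sketch using the OpenAI Python SDK’s vision-capable chat endpoint. Mrblindguardian works through the ChatGPT interface directly, so this script, its prompt wording, and the dragon.png filename are illustrative assumptions rather than his actual setup:

```python
# Sketch of the "describe my screenshot" verification step using the
# OpenAI Python SDK. Assumes OPENAI_API_KEY is set in the environment
# and that a screenshot of the generated model exists at dragon.png.
import base64

from openai import OpenAI

client = OpenAI()


def describe_screenshot(path: str) -> str:
    """Ask a vision model to describe a screenshot of a 3D model."""
    with open(path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Describe this 3D model in detail: overall shape, "
                         "number of wings, pose, and any obvious defects."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content


print(describe_screenshot("dragon.png"))
```

Because the output is plain text, a screen reader can speak the description aloud, closing the feedback loop without any sight required.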

With a suitable STL file, Mrblindguardian can then use slicing software that is compatible with screen readers. To get a better sense of what’s on screen, he can also have ChatGPT generate descriptions from screenshots. Once he’s happy with the results, Mrblindguardian can ask a sighted friend to verify that the file is ready to print. If so, he can print the model and then explore the finished object by feel.
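Command-line slicing is one screen-reader-friendly route, since everything stays in plain text a screen reader can speak. The article doesn’t say which slicer Mrblindguardian uses, so this is just a sketch assuming PrusaSlicer is installed and on the PATH, with hypothetical file names:

```python
# Hypothetical sketch: slice an STL via PrusaSlicer's command-line mode,
# which works well with screen readers because both input and output
# are plain text. File names here are illustrative.
import subprocess

result = subprocess.run(
    ["prusa-slicer", "--export-gcode",
     "--output", "dragon.gcode", "dragon.stl"],
    capture_output=True, text=True,
)

# A screen reader can speak the slicer's summary directly.
print(result.stdout or result.stderr)
```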

This is a laborious process, but it works. Mrblindguardian used it to 3D print his custom one-winged dragon, bringing a creature from his imagination into the real world, where he can feel it himself.

I can’t help but feel tremendously inspired and impressed by Mrblindguardian’s achievement, and I hope that others are able to take advantage of this workflow to produce their own designs.

Cameron Coward
Writer for Hackster News. Proud husband and dog dad. Maker and serial hobbyist. Check out my YouTube channel: Serial Hobbyism