This Giant Nose Art Piece Drips Descriptive Snot
This giant nose, built by Adnan Aga, smells things and drips printed descriptions of the scents out of its nostril.
For every human sense, there is an artificial analog. Image sensors and light sensors approximate sight. Microphones can hear. Force sensors and capacitive sensors mimic touch. But the analogs for taste and smell are much more difficult, because those senses are extremely subjective. An android nose could tell you when it detects a specific chemical compound, but would have a hard time accurately describing the sum of several chemicals all mixed together — as most real-world scents are. To get around that, this giant nose art piece avoids sniffing altogether when it prints out smell descriptions.
This art piece, built by Adnan Aga, resembles a schnoz ripped from the face of a humongous silver being. It mounts to a wall above a platform. When a patron of the arts places something on that platform, the nose goes to work. It gives the object a whiff, and then a printed description of the object's scent begins pouring out of the left nostril. It works with just about any recognizable object, which is pretty amazing if you're familiar with the limitations of electronic noses.
The secret is that the nose isn't actually smelling anything. It doesn't have any hardware capable of identifying chemical compounds in the air. Instead, Aga got clever and pulled a fast one. Inside the right nostril, there is a camera that points down at the platform. It takes a picture of the object on the platform and runs that through image recognition to identify the object. It then passes information about the object to an AI model that generates a poetic description of how the object should smell. It would, of course, be possible to fool the system by disguising a scent with an object that looks different (like a rose coated in bacon grease).
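That recognize-then-describe pipeline might look something like the sketch below. To be clear, this is not Aga's actual code: `identify_object` is a placeholder for whatever image-recognition service the real build calls, and the prompt wording is invented for illustration. The GPT call uses the standard OpenAI chat-completions interface, injected as a client so nothing here depends on network access.

```python
def identify_object(image_path: str) -> str:
    """Placeholder: return a label for the object in the photo.
    The real system would call an image-recognition service here."""
    raise NotImplementedError("wire up an image-recognition service")


def build_scent_prompt(label: str) -> str:
    """Turn a recognized object label into a request for a poetic
    description of how that object should smell. The exact wording
    here is a guess, not Aga's prompt."""
    return (
        f"Write a short, poetic description of how a {label} smells, "
        "as if recalling the memory of its scent."
    )


def describe_scent(label: str, client) -> str:
    """Ask a GPT-4-style chat model for the scent description.
    `client` is an OpenAI-compatible client object, passed in so the
    pipeline logic can be tested without hitting the API."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": build_scent_prompt(label)}],
    )
    return response.choices[0].message.content
```

Keeping the model client injectable is a small design choice that makes the fun part (the prompt) easy to iterate on without re-photographing objects.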
For all of this to work, Aga used a Raspberry Pi 4 Model B single-board computer with a Raspberry Pi Camera, a distance sensor, a thermal printer, and a speaker. The distance sensor lets the system know when an object is on the platform and the speaker plays sound effects. A Python script performs the image recognition using Google Images, then produces the description using GPT-4. Aga 3D-printed the physical nose in sections, then assembled them and painted the result.
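The paragraph above describes a simple event loop: wait for the distance sensor to report something on the platform, photograph it, generate the description, and print it while the speaker plays a sound effect. Here is a minimal sketch of that loop's logic. The 15 cm baseline distance is an invented value, and all hardware interactions (camera, printer, speaker, sensor) are injected as plain functions so the control flow can run anywhere.

```python
PLATFORM_EMPTY_CM = 15.0  # assumed reading when the platform is bare


def object_present(distance_cm: float, empty_cm: float = PLATFORM_EMPTY_CM) -> bool:
    """An object on the platform reads closer than the bare platform."""
    return distance_cm < empty_cm


def sniff_cycle(read_distance, take_photo, describe, print_slip, play_sound):
    """One pass of the main loop. Collaborators are injected callables:
    read_distance() -> float, take_photo() -> image, describe(image) -> str,
    print_slip(str), play_sound(str). Returns the printed description,
    or None if the platform was empty."""
    if not object_present(read_distance()):
        return None
    play_sound("sniff")            # speaker: sniffing sound effect
    description = describe(take_photo())
    print_slip(description)        # thermal printer feeds out the text
    return description
```

On the real hardware, `read_distance` would wrap the distance sensor and `take_photo` the Raspberry Pi Camera; structuring it this way just keeps the trigger logic testable off the Pi.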
Aga built this piece for a graduate thesis at NYU, then put it on display at Olfactory Art Keller in New York's Chinatown.