Introduction
Every day, millions of visually impaired people face invisible barriers — from finding the right classroom to telling apart two identical cans. The problem isn’t a lack of technology; it’s that most assistive tools are outdated, expensive, or inaccessible. And braille alone isn’t the answer: only around 10% of visually impaired individuals can read it.
We’re building a near-field communication (NFC) labelling system that can make any venue visually-impaired friendly in no time. With Raspberry Pis at its core, plus a camera module and ultra-cheap NFC stickers, our system can transform existing spaces into instantly accessible environments.
What it does
- Staff or volunteers simply point a camera at existing text, images, or braille.
- The Raspberry Pi captures the image and sends it to Claude Haiku 4.5, which uses tailored prompts (depending on whether the device is in braille, image, or text mode) to extract and interpret the content into a clear, structured description.
- Staff can review AI-generated descriptions using physical accept/reject buttons — no keyboard, screen, or technical expertise required.
- Once approved, the description is written onto an NFC tag. These tags can be placed beside artwork, fresh produce, medication boxes, safety signage, or room doors — instantly turning everyday objects into accessible touchpoints.
- Blind users carry a simple wearable NFC “wand” that reads the tag aloud or sends it to a Bluetooth device. The wand calls ElevenLabs’ text-to-speech model to generate natural speech — no screens, apps, or configuration needed.
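The capture-and-describe step above can be sketched roughly as follows. The prompts, the model alias, and the function names here are illustrative assumptions, not our exact production code:

```python
# Sketch: photo -> Claude Haiku 4.5 -> spoken-friendly description.
# MODE_PROMPTS and the model alias are placeholders for illustration.
import base64

MODE_PROMPTS = {
    "text": "Transcribe the text in this photo, then summarise it in one "
            "clear sentence suitable for reading aloud.",
    "image": "Describe this image for a blind listener: subject, layout, "
             "and any visible text, in two short sentences.",
    "braille": "This photo shows braille. Transcribe it into plain English.",
}

def build_prompt(mode: str) -> str:
    """Pick the tailored prompt for the device's current mode."""
    return MODE_PROMPTS.get(mode, MODE_PROMPTS["image"])

def describe(jpeg_bytes: bytes, mode: str) -> str:
    """Send the captured photo to Claude and return a description."""
    import anthropic  # needs ANTHROPIC_API_KEY in the environment
    client = anthropic.Anthropic()
    msg = client.messages.create(
        model="claude-haiku-4-5",  # alias; pin a dated snapshot in production
        max_tokens=200,
        messages=[{
            "role": "user",
            "content": [
                {"type": "image",
                 "source": {"type": "base64", "media_type": "image/jpeg",
                            "data": base64.b64encode(jpeg_bytes).decode()}},
                {"type": "text", "text": build_prompt(mode)},
            ],
        }],
    )
    return msg.content[0].text
```

Keeping the prompt per-mode is what lets the same button-driven device handle braille signs and artwork equally well.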
How we built it
To make the system practical in real-world settings, we built it into a custom laser-cut chassis that houses the Raspberry Pi and an RC522 NFC reader/writer module.
Some key engineering steps included:
- Connecting multiple Raspberry Pis together to handle camera input, NFC writing, and AI communication.
- Making audio output work seamlessly through a speaker by integrating ElevenLabs’ text-to-speech model.
- Ensuring all parts of the system worked together reliably — from image capture to AI processing to NFC tag reading.
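The inter-Pi coordination above can be sketched as one JSON message per TCP connection on the local network. The port number and message fields are assumptions for illustration, not our actual protocol:

```python
# Sketch: the camera Pi hands a job to the NFC-writer Pi as a single
# JSON message over TCP. Port and field names are illustrative.
import json
import socket
import threading
import time

PORT = 5055

def serve_once(host: str = "127.0.0.1", port: int = PORT) -> dict:
    """Accept one connection and return the decoded JSON message."""
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            data = conn.makefile("rb").read()  # read until sender closes
    return json.loads(data)

def send(message: dict, host: str = "127.0.0.1", port: int = PORT,
         retries: int = 50) -> None:
    """Connect, send one JSON message, close; retry while peer boots."""
    payload = json.dumps(message).encode()
    for _ in range(retries):
        try:
            with socket.socket() as sock:
                sock.connect((host, port))
                sock.sendall(payload)
            return
        except ConnectionRefusedError:
            time.sleep(0.05)
    raise ConnectionError("writer Pi not reachable")

if __name__ == "__main__":
    result = {}
    t = threading.Thread(target=lambda: result.update(serve_once()))
    t.start()
    send({"kind": "write_tag", "description": "Fire exit, push bar"})
    t.join()
    print(result["description"])
```

One-message-per-connection keeps the protocol trivial to debug: no framing, no partial reads, and each Pi can restart independently.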
In our demo, we walked judges through a hackspace where chairs, food items, and safety notices were all labelled with our tags — proving you can make almost any space accessible in an afternoon using simple stickers and a handful of electronics :)
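The ElevenLabs integration on the wand side boils down to one HTTP call to the text-to-speech endpoint. The voice ID and model name below are placeholders, not our real configuration:

```python
# Sketch: tag text -> ElevenLabs TTS -> mp3 for the wand's speaker.
# voice_id and model_id are placeholders for illustration.
import json
import urllib.request

API_URL = "https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"

def build_tts_request(text: str, voice_id: str, api_key: str):
    """Assemble the URL, headers, and JSON body for one TTS call."""
    url = API_URL.format(voice_id=voice_id)
    headers = {"xi-api-key": api_key, "Content-Type": "application/json"}
    body = {"text": text, "model_id": "eleven_turbo_v2"}
    return url, headers, body

def speak(text: str, voice_id: str, api_key: str,
          out_path: str = "tag.mp3") -> None:
    """POST the tag text and save the returned audio to disk."""
    url, headers, body = build_tts_request(text, voice_id, api_key)
    req = urllib.request.Request(url, data=json.dumps(body).encode(),
                                 headers=headers, method="POST")
    with urllib.request.urlopen(req, timeout=30) as resp, \
            open(out_path, "wb") as f:
        f.write(resp.read())  # mp3 bytes, ready for playback
```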
Challenges we ran into
- Coordinating communication between multiple Raspberry Pis in real time.
- Optimizing AI prompts for different content types (text, images, braille) to ensure accurate descriptions.
- Integrating TTS and NFC hardware to work without user intervention.
- Designing an intuitive physical interface for staff to approve/reject descriptions.
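One concrete piece of the NFC integration is respecting the tag's tiny writable area. A minimal sketch, assuming the common `mfrc522` Python library for the RC522 (the capacity constant and helper names are ours, for illustration):

```python
# Sketch: trim an approved description so it fits an NFC tag's
# writable area before the RC522 writes it.
TAG_CAPACITY = 48  # bytes SimpleMFRC522 writes by default

def fit_to_tag(description: str, capacity: int = TAG_CAPACITY) -> str:
    """Trim a description to the tag's writable size, UTF-8 safe."""
    data = description.encode("utf-8")[:capacity]
    # drop any multi-byte character cut in half by the slice
    return data.decode("utf-8", errors="ignore")

def write_tag(description: str) -> None:
    """Block until a tag is tapped, then write the trimmed text."""
    from mfrc522 import SimpleMFRC522  # hardware-only import (Raspberry Pi)
    writer = SimpleMFRC522()
    writer.write(fit_to_tag(description))
```

Trimming before the write is what keeps long AI descriptions from silently failing on cheap stickers; anything longer would need a chunked multi-block write.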
What's next for Conductor
- Warehouses, to make storage areas accessible.
- Healthcare settings, so medications and equipment can be easily identified.
- Public spaces, including museums, markets, and classrooms, to make them instantly inclusive.