Inspiration
BrailBox was inspired by the idea that storytelling should be accessible to everyone, including blind and visually impaired readers. Many digital stories are easy to share on screens, but they are not always easy to experience through touch. We wanted to build a creative and educational tool that connects written stories, voice interaction, AI, and physical Braille output in one accessible system.
What it does
BrailBox allows users to submit short written stories through a web app. These stories are checked and organized using AI, then stored in a cloud database. A blind or visually impaired user can interact with the physical BrailBox device by speaking and asking for a story. The system finds a matching story, converts the text into Braille, and sends it to the device, which outputs tactile Braille characters that can be read by touch.
How we built it
We built the project using a web app, a cloud database, AI processing, a Raspberry Pi, and physical hardware. The web app collects story submissions and sends them to the backend. AI is used to moderate and classify the stories by theme, age group, and reading level. Approved stories are stored in Supabase.
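As a rough illustration of the moderation-and-storage step, the record for an approved story might look like the sketch below before it is inserted into Supabase. The field names, allowed themes, and reading levels here are our own assumptions for illustration, not the project's actual schema:

```python
# Hypothetical story record shape for the Supabase "stories" table.
# Field names and allowed values are assumptions, not the real schema.
ALLOWED_THEMES = {"adventure", "friendship", "nature", "school"}
ALLOWED_LEVELS = {"beginner", "intermediate", "advanced"}

def validate_story(record):
    """Sanity-check an AI-classified story record before storing it."""
    required = {"title", "body", "theme", "age_group", "reading_level"}
    if not required <= record.keys():
        return False
    return (record["theme"] in ALLOWED_THEMES
            and record["reading_level"] in ALLOWED_LEVELS
            and len(record["body"]) > 0)

story = {
    "title": "The Lost Kite",
    "body": "Mia followed the string over the hill.",
    "theme": "adventure",
    "age_group": "6-8",
    "reading_level": "beginner",
}
print(validate_story(story))  # True
```

A check like this gives the AI classifier a fixed vocabulary to map into, so the voice-request matcher on the device can filter stories by the same theme and level labels.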
On the hardware side, the Raspberry Pi listens for a spoken request, converts speech to text, finds the best matching story, converts the selected story into Braille, and controls servo motors to physically display Braille dots. The prototype uses three servo motors to represent part of a Braille cell and demonstrate the core idea of tactile story output.
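The text-to-Braille and servo steps can be sketched in a few lines. In standard six-dot Braille the dots are numbered 1-3 down the left column and 4-6 down the right; each letter is a set of raised dots. The sketch below uses a partial Grade-1 letter map and mirrors the prototype's three-servo setup by driving only dots 1-3; the function names, pin-free angle values, and the choice of which three dots the servos represent are our assumptions for illustration:

```python
# Partial Grade-1 Braille map: the raised dot numbers for each letter.
# (Dots 1-3 are the left column of the cell, 4-6 the right column.)
BRAILLE_DOTS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4},
    "j": {2, 4, 5}, "k": {1, 3}, "l": {1, 2, 3},
}

def text_to_cells(text):
    """Convert text to a list of Braille cells (sets of raised dots)."""
    return [BRAILLE_DOTS.get(ch, set()) for ch in text.lower() if ch != " "]

def servo_angles(cell, dots=(1, 2, 3), up=90, down=0):
    """Map one cell onto the three-servo prototype (dots 1-3 only):
    a raised dot moves its servo to `up` degrees, otherwise `down`."""
    return [up if d in cell else down for d in dots]

cells = text_to_cells("bad")          # b, a, d
first = servo_angles(cells[0])        # "b" raises dots 1 and 2 -> [90, 90, 0]
```

On the actual device, each angle list would be sent to the servos one cell at a time (e.g. via a PWM library such as gpiozero), pausing between letters so the reader can feel each character.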
What we learned
We learned how to connect software and hardware into one complete accessibility-focused system. We also learned more about Braille structure, text-to-Braille conversion, voice input, AI-based classification, cloud databases, Raspberry Pi GPIO control, and servo motor movement. Most importantly, we learned how technology can make storytelling more inclusive when accessibility is considered from the beginning.
Challenges we faced
One major challenge was combining many different parts into one working pipeline: web submission, AI moderation, database storage, voice input, story matching, Braille conversion, and motor control.
Accomplishments
We are proud that BrailBox demonstrates a complete accessible storytelling experience, from writing a story online to physically reading it through Braille output. Even as a prototype, it shows how AI, web technology, and hardware can work together to create a more inclusive way to share stories.
What’s next
In the future, BrailBox could be expanded into a full six-dot Braille cell or a multi-cell Braille display. We would also improve the web interface, add more story categories, support more languages, and make the hardware more reliable and comfortable for real users.

