Inspiration

We wanted to create a game that could be enjoyed by everyone, especially people who are often excluded from gaming due to vision impairments. Instead of adding accessibility as an afterthought, we made it the foundation.

What it does

EchoQuest is a voice-controlled, story-driven adventure game designed for blind and visually impaired players. It uses natural narration, branching storylines, and real-time speech recognition so players can guide the plot without ever seeing a screen. A minimal, high-contrast interface supports sighted assistants.

How we built it

We developed EchoQuest using HTML, CSS, and JavaScript, integrating the Web Speech API for voice recognition and speech synthesis. The story was written to be immersive in audio format, with clear decision points and intuitive voice commands.

Challenges we ran into

- Making the narration feel natural and engaging.
- Ensuring voice recognition works accurately in different environments.
- Creating a UI that supports sighted helpers without distracting from the audio experience.

Accomplishments that we're proud of

- A fully functional voice-controlled game that can be played without visuals.
- An original, branching story designed for accessibility.
- A simple yet inclusive UI for mixed-ability gameplay.

What we learned

- Designing for accessibility requires rethinking the whole user experience.
- Voice interaction can be powerful when combined with compelling storytelling.
- Inclusive design benefits all users, not just the target audience.

What's next for EchoQuest

- Adding more chapters and storylines.
- Expanding language support for global accessibility.
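The voice loop described above can be sketched with the standard Web Speech API. This is a minimal illustration, not EchoQuest's actual source: the command phrases, the `parseCommand` helper, and the narrator prompts are all assumptions made for the example.

```javascript
// Illustrative sketch of a Web Speech API voice loop for a narrated game.
// Command phrases and prompts are examples, not the game's real data.

// Map a free-form transcript to a game command (pure logic, runs anywhere).
function parseCommand(transcript) {
  const text = transcript.toLowerCase().trim();
  if (/\b(go|walk|head)\b.*\bnorth\b/.test(text)) return "north";
  if (/\b(go|walk|head)\b.*\bsouth\b/.test(text)) return "south";
  if (/\brepeat\b/.test(text)) return "repeat";
  return null; // unrecognized: the narrator can re-prompt
}

// Browser-only wiring: speech recognition in, speech synthesis out.
if (typeof window !== "undefined") {
  const Recognition =
    window.SpeechRecognition || window.webkitSpeechRecognition;
  const recognizer = new Recognition();
  recognizer.continuous = true; // keep listening between decision points

  function speak(text) {
    // Narrate a line of story or feedback aloud.
    window.speechSynthesis.speak(new SpeechSynthesisUtterance(text));
  }

  recognizer.onresult = (event) => {
    // Take the latest recognized phrase and turn it into a command.
    const result = event.results[event.results.length - 1];
    const command = parseCommand(result[0].transcript);
    if (command) {
      speak(`You chose: ${command}.`);
    } else {
      speak("Sorry, I didn't catch that. Say north, south, or repeat.");
    }
  };

  recognizer.start();
}
```

Keeping the command parsing separate from the browser wiring lets the game logic be tested without a microphone or a speech engine.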

Built With

HTML, CSS, JavaScript (Web Speech API)