Inspiration

Deaf and hard-of-hearing drivers often miss critical environmental sounds like sirens or fire alarms, while many seniors and immigrants face language barriers that make driving interactions stressful or unsafe. These challenges are often treated separately, but in reality they overlap — many people experience both hearing loss and language barriers at once. We wanted to create a hands-free, accessible solution that helps people navigate the world with the same confidence and safety most drivers take for granted.

What it does

EchoLight is an accessibility-focused app that:

  • Listens to environmental sounds while driving
  • Detects critical sounds such as sirens and fire alarms
  • Converts those sounds into clear visual alerts
  • Provides speech-to-text and translation support for spoken interactions
  • Operates hands-free to ensure safe use while driving

The goal is to increase awareness, confidence, and independence for drivers facing hearing or language barriers.

How we built it

  • Audio detection: Python using NumPy and SciPy for frequency-based sound pattern recognition (demo-focused, AI-ready architecture)
  • Backend: Java backend with a modular, scalable REST API (real-time via SSE)
  • Frontend: HTML, CSS, JavaScript
  • Version control: Git, hosted on GitHub
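
The frequency-based pattern recognition above can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not our exact pipeline: the 500–1500 Hz "siren band", the 16 kHz sample rate, and the 0.6 energy-ratio threshold are all illustrative values.

```python
import numpy as np
from scipy.signal import welch

SAMPLE_RATE = 16_000            # assumed microphone sample rate (Hz)
SIREN_BAND = (500.0, 1500.0)    # assumed band where siren energy concentrates (Hz)
ENERGY_RATIO_THRESHOLD = 0.6    # assumed: fraction of total power inside the band

def looks_like_siren(samples: np.ndarray) -> bool:
    """Return True if most of the signal's power falls inside the siren band.

    Estimates the power spectral density with Welch's method, then compares
    the in-band power to the total power.
    """
    freqs, power = welch(samples, fs=SAMPLE_RATE, nperseg=1024)
    total = power.sum()
    if total == 0:
        return False  # silence: nothing to classify
    in_band = (freqs >= SIREN_BAND[0]) & (freqs <= SIREN_BAND[1])
    return power[in_band].sum() / total >= ENERGY_RATIO_THRESHOLD
```

A steady tone inside the band (e.g. a 900 Hz sine) passes this check, while broadband noise, whose power is spread across the whole spectrum, does not; a real detector would also need to track the characteristic sweep of a siren over time, which is where the AI-ready architecture comes in.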

Challenges we ran into

Our biggest challenges were connecting the different parts of the project within a short timeframe, making the system reliable in real time, and keeping the interface intuitive and accessible for all users. Handling audio input was particularly tricky: laptop microphones often failed to pick out alarms and sirens over background noise, so for the demo we routed audio through the stereo-mix input to simulate how a phone microphone would capture these sounds more reliably.
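
Keeping alerts real time meant pushing detections from the backend to the frontend over Server-Sent Events. Our backend is Java, but the SSE wire format is language-independent; the sketch below shows how one alert message is framed. The `alert` event name and the JSON payload shape are illustrative assumptions, not our exact schema.

```python
import json

def format_sse(event: str, data: dict) -> str:
    """Serialize one Server-Sent Events message.

    Per the SSE wire format, each field is written as 'name: value' on its
    own line, and a blank line terminates the message.
    """
    payload = json.dumps(data)
    return f"event: {event}\ndata: {payload}\n\n"
```

The frontend can then subscribe with a plain `EventSource` and listen for `alert` events, with no polling needed, which is what keeps the visual alerts feeling instantaneous.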

Accomplishments that we're proud of

  • Teamwork & Integration: Pulled together audio detection, backend API, and frontend logic quickly, connecting all parts into a working end-to-end system under tight time constraints.
  • Accessibility & Hands-Free Design: Focused on creating a system that’s easy to use for drivers of all ages and abilities, with clear visual alerts and minimal distraction.
  • Real-World Impact: Built a solution that addresses language barriers and accessibility needs, helping drivers feel safer and more confident on the road.

What we learned

  • Learned how a backend works in general — how clients send events to it, and how it exposes endpoints for pushing events and collecting statuses.
  • Figured out how to build and connect an API in a short time, and why accessibility and ease of use matter so much.
  • Saw firsthand how teamwork can bring all the pieces together quickly and make a complex project actually work.

What's next for EchoLight

  • Improve sound detection accuracy and transition from pattern recognition to AI-based sound type detection for more flexibility.
  • Add support for multiple languages and translation features, including translated road signs displayed on smart glasses.
  • Build a companion app for smartwatches to trigger vibrations for alerts.
  • Enable background operation and customizable alert settings for a more seamless, personalized experience.
  • Allow customization of visual cues triggered by different sounds to match user preferences.
