Inspiration
We were inspired by colorblind and aging drivers who face the most dangerous moments of every drive at intersections built around colors and signs they cannot always clearly see.
What it does
ClearPath is a mobile driving assistant that uses your phone camera and computer vision to watch the road in real time, announcing traffic light colors, reading road signs aloud, and alerting drivers when pedestrians or cyclists enter their path with redundant visual and audio cues.
How we built it
We fine‑tuned YOLOv8 on our own dashcam footage so the model understands our local intersections, paired it with OpenCV HSV analysis to classify traffic light colors, and layered in multilingual audio recorded from our community to reflect the cultures and languages we see around us.
Challenges we ran into
Our hardest challenge was restricting detection strictly to the object classes we intended (like traffic lights and stop signs) and then raising the confidence threshold high enough that we could safely trigger an audible response without overwhelming or confusing the driver.
Accomplishments that we're proud of
We are proud that ClearPath can now reliably recognize stop signs and traffic lights and announce the correct next action in real time, turning ambiguous intersection moments into clear, confident decisions for our users.
What we learned
We learned how to manage hard deadlines, delegate tasks based on teammates’ strengths and weaknesses, and keep collaborating and iterating even when the models were failing early on.
What's next for ClearPath
Next, we want to add more languages, expand sign coverage to include yield signs and speed limits, and explore partnerships with car manufacturers so ClearPath can be built into vehicles as an accessible co‑pilot for every driver.