Inspiration

EyeQ was inspired by a personal experience. We noticed how both of our parents often struggled to find the right glasses prescription — frequently trying different strengths without being fully sure what worked best. This made us realize how difficult it is for many people, especially busy adults and students, to understand their vision without immediate access to professional care. We wanted to create a simple and accessible tool that helps users get an early understanding of their vision directly from their device.

What it does

EyeQ helps users estimate their vision needs by combining real-time distance tracking with a guided vision test. Using a credit card as a size reference, the system ensures the user is positioned at approximately 40 cm from the screen. Once the correct distance is reached, the user answers a series of questions, and the system provides an approximate prescription range in diopters, helping users better understand their vision.
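The question-and-answer narrowing described above can be sketched as a binary search over a diopter interval. This is a minimal illustration, not the project's actual code: the `can_read` callback, the interval bounds, and the 0.25 D stopping step are all assumptions made for the example.

```python
def estimate_prescription(can_read, low=-10.0, high=0.0, tol=0.25):
    """Narrow a prescription interval [low, high] in diopters.

    `can_read(strength)` is a hypothetical callback: it presents a test
    question sized for `strength` and returns True if the user answers
    correctly. The interval is halved until it is narrower than `tol`
    (0.25 D, a common prescription step).
    """
    while high - low > tol:
        mid = (low + high) / 2.0
        if can_read(mid):
            high = mid  # user sees well here: a weaker correction may suffice
        else:
            low = mid   # user struggles: a stronger correction is needed
    return low, high

# Simulate a user whose true prescription is -2.5 D:
lo, hi = estimate_prescription(lambda d: d >= -2.5)
```

Because each answer halves the remaining interval, a 10 D range is narrowed to a 0.25 D window in about six questions.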

How we built it

We built EyeQ using Python, with Streamlit for the interface, OpenCV for computer vision, and NumPy for mathematical calculations. The system detects a credit card through the webcam and uses the HSV color space for more reliable detection: Hue identifies the card’s color, Saturation filters out dull or grey areas, and Value helps distinguish brightness, making detection stable under different lighting conditions. Once the card is detected, the system calculates distance using geometric relationships and then runs a vision test based on a binary search algorithm to estimate prescription values.
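The geometric distance calculation can be sketched with the standard pinhole-camera similar-triangles relationship: a card of known physical width appears smaller in pixels the farther away it is. The calibration routine and the specific numbers below are illustrative assumptions, not the project's values; only the 85.6 mm card width (the ISO/IEC 7810 ID-1 standard) is a known constant.

```python
CARD_WIDTH_MM = 85.6  # ISO/IEC 7810 ID-1 credit card width

def focal_length_px(pixel_width, known_distance_mm, real_width_mm=CARD_WIDTH_MM):
    """One-time calibration: hold the card at a known distance and measure
    its apparent width in pixels. By similar triangles,
    f = pixel_width * distance / real_width."""
    return pixel_width * known_distance_mm / real_width_mm

def distance_mm(pixel_width, focal_px, real_width_mm=CARD_WIDTH_MM):
    """Estimate camera-to-card distance from the card's apparent width:
    distance = real_width * f / pixel_width."""
    return real_width_mm * focal_px / pixel_width

# Calibrate with the card appearing 214 px wide at 400 mm, then re-estimate:
f = focal_length_px(214, 400.0)
print(distance_mm(214, f))  # 400.0 (mm) at the calibration point
```

In practice the pixel width would come from the detected card contour in each webcam frame, so the distance readout updates in real time as the user moves.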

Challenges we ran into

One of the biggest challenges was making the card detection accurate in real-world environments. Initially, the model struggled to differentiate the card from other objects with similar colors, especially when lighting conditions changed. Using only Hue and Saturation was not enough. After introducing Value, which represents brightness, the system was able to distinguish between objects with similar colors but different intensity levels. This significantly improved detection accuracy and allowed the system to work much more smoothly and reliably.
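The effect of adding the Value bound can be shown with a small standalone example. This is a toy re-creation of the idea using the standard library's `colorsys` rather than the project's OpenCV pipeline, and the pixel values and thresholds are assumptions chosen for illustration: two pixels share hue and saturation, and only the brightness floor tells them apart.

```python
import colorsys

def hsv_mask(pixels, h_range, s_min, v_min):
    """Keep pixels whose hue lies in h_range, with saturation >= s_min
    AND value (brightness) >= v_min. Without the v_min bound, a dark
    object of the same hue slips through the filter."""
    kept = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if h_range[0] <= h <= h_range[1] and s >= s_min and v >= v_min:
            kept.append((r, g, b))
    return kept

card_px = (30, 30, 220)  # bright blue card
dark_px = (8, 8, 60)     # dark object with the same hue and saturation

# Hue + Saturation alone accept both pixels...
both = hsv_mask([card_px, dark_px], (0.55, 0.75), 0.5, 0.0)
# ...adding a Value floor keeps only the bright card pixel.
card_only = hsv_mask([card_px, dark_px], (0.55, 0.75), 0.5, 0.4)
```

In the real pipeline the same three-channel bound would be expressed as a single `cv2.inRange` call on the HSV-converted frame.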

Accomplishments that we're proud of

We are proud that we were able to build a fully working system that combines computer vision and mathematical modeling into a practical tool. The ability to estimate vision needs using just a webcam and a credit card makes the solution accessible and easy to use. We also successfully implemented a structured algorithm that narrows down prescription values efficiently, which shows the strength of combining simple ideas with strong technical execution.

What we learned

Through this project, we learned a lot about computer vision, especially how lighting and noise affect detection in real-world conditions. We gained a deeper understanding of why HSV color space is more reliable than RGB and how combining Hue, Saturation, and Value improves accuracy. We also learned how mathematical models can be applied to real-world problems, using NumPy to process data and represent prescription values in diopters. Beyond the technical side, we also learned more about eye health, vision problems, and how prescriptions are determined.

What's next for EyeQ

Going forward, we want to improve the accuracy of prescription estimation and expand the system with more guided vision tests. We also plan to develop EyeQ into a full application for both Windows and macOS. Our goal is to make eye care more accessible by providing users with a simple first step toward understanding their vision, especially when professional care isn’t immediately available.

Built With

Python, Streamlit, OpenCV, NumPy