EyeQ AI

Inspiration

When did you last have your eyes checked? Not a glance in the mirror. A real exam. For most people, it's been years. For most people on this planet, it's never.

Vision problems are one of the most common and preventable sources of permanent disability in the world. And yet the system we have built to catch them requires a specialist, an appointment, insurance, transportation, and time. Take away any one of those, and the problem goes undetected. Take away all of them, and you have most of the world.

Here are the numbers. In the United States alone, roughly one-third of allergic conjunctivitis patients go undiagnosed and untreated. Up to two-thirds of adults with severe, vision-impairing cataracts never receive treatment. Nearly 80% of preschool children have never had a single eye examination. And for two-thirds of the children who are eventually found to have a vision problem, the screening that caught it was the very first eye exam they ever had.

Those are American numbers, from PubMed Central. In underserved communities, in rural areas, in developing nations where a specialist is not a 20-minute drive but a 3-day journey, these numbers are significantly worse.

And here is what makes this unforgivable. The most common childhood conditions we are talking about, amblyopia and strabismus, are completely, fully, 100% reversible if caught before age 7. After that window closes, the damage is permanent. We are not talking about a condition that is hard to treat. We are talking about a condition that is easy to treat and almost impossible to catch in time if you live in the wrong place or were born into the wrong circumstances.

Early detection shouldn't be a privilege. So we built EyeQ.

What it does

EyeQ is an AI-powered eye health screening tool that uses your webcam to detect signs of common eye conditions in real time. Users simply look into their camera, and EyeQ analyzes the image to flag potential indicators of the following conditions:

  • Leukocoria (white pupil reflex)
  • Conjunctivitis
  • Uveitis
  • Cataract
  • Strabismus-style misalignment patterns
  • Amblyopia risk cues
  • Anisocoria (unequal pupil size)
  • Ptosis (drooping eyelid)
  • Macular / central field concerns
  • Fixation instability
  • Smooth pursuit abnormalities
  • Binocular asymmetry
  • Astigmatism
  • Distance and near acuity reduction
  • Color vision deficiency
  • Contrast sensitivity reduction

Within seconds, the app returns a confidence-scored assessment and indicates whether the user should seek medical attention. No clinic visit is required for an initial screening.
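As a sketch of how such an assessment might be derived, the snippet below turns raw model scores into a top-condition label, a softmax confidence, and a referral recommendation. The label set and the `refer_threshold` cutoff are illustrative assumptions, not the app's actual values:

```python
import math

def assessment(logits, labels, refer_threshold=0.6):
    """Convert raw per-class scores into a confidence-scored recommendation.

    `refer_threshold` is an illustrative cutoff, not EyeQ's real value.
    """
    # Softmax over the raw scores to get per-class probabilities.
    exp = [math.exp(v) for v in logits]
    total = sum(exp)
    probs = [v / total for v in exp]

    conf = max(probs)
    label = labels[probs.index(conf)]

    if label != "normal" and conf >= refer_threshold:
        advice = "consult an eye-care professional"
    else:
        advice = "no immediate concern detected"
    return {"condition": label, "confidence": round(conf, 3), "advice": advice}
```

In practice the threshold would be tuned per condition, since a false negative for leukocoria is far costlier than one for mild dryness.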

How we built it

We built EyeQ as a full-stack AI application across three layers. On the frontend, we used Vite + JavaScript for a fast, responsive UI that handles webcam access and streams frames to the backend. We integrated MediaPipe for real-time eye detection and landmark tracking, automatically isolating the eye region before analysis. The backend is powered by FastAPI, which receives cropped eye images and runs inference through our custom Convolutional Neural Network (CNN), trained to classify multiple eye conditions from clinical and real-world image data. The CNN was built and trained in PyTorch/TensorFlow, then served efficiently through the FastAPI endpoint.
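The eye-isolation step can be illustrated with a small helper that converts MediaPipe-style normalized landmark coordinates into a padded pixel crop box. The function name and the padding fraction are our own illustrative choices, not the exact values used in EyeQ:

```python
def eye_crop_box(landmarks, img_w, img_h, pad=0.25):
    """Compute a padded pixel bounding box around eye landmarks.

    landmarks: iterable of (x, y) pairs in MediaPipe-style normalized
    [0, 1] coordinates. `pad` expands the box by a fraction of its size
    so the crop includes eyelid and sclera context, not just the iris.
    """
    xs = [int(x * img_w) for x, y in landmarks]
    ys = [int(y * img_h) for x, y in landmarks]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    dx = int((x1 - x0) * pad)
    dy = int((y1 - y0) * pad)
    # Clamp to image bounds so the crop is always valid.
    return (max(0, x0 - dx), max(0, y0 - dy),
            min(img_w, x1 + dx), min(img_h, y1 + dy))
```

The resulting box would then be used to slice the frame before it is sent to the FastAPI endpoint for inference.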

Challenges we ran into

  • Data scarcity: Labeled medical image datasets for eye conditions are limited and imbalanced. We had to heavily augment our training data and carefully tune class weights to avoid the model defaulting to the most common condition.
  • Webcam-to-clinical domain gap: Our model was trained on clinical photographs, but webcam images are noisier, lower resolution, and variably lit. Bridging this gap required preprocessing pipelines and fine-tuning on webcam-captured samples.
  • Precise eye isolation: Getting MediaPipe landmarks to reliably crop just the eye region — especially across different face angles, lighting, and eye sizes — took significant iteration.
  • Latency: Keeping inference fast enough to feel real-time while running a CNN on the backend required model optimization and async FastAPI handling.
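The class-weighting mentioned above can be sketched as inverse-frequency weights, a common scheme for imbalanced datasets (the exact weighting EyeQ uses may differ). The resulting values could feed, for example, the `weight` argument of PyTorch's `CrossEntropyLoss`:

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Per-class weights inversely proportional to class frequency.

    Normalized so that a perfectly balanced dataset yields weight 1.0
    for every class; rare classes get proportionally larger weights,
    which discourages the model from defaulting to the majority class.
    """
    counts = Counter(labels)
    n_classes = len(counts)
    total = len(labels)
    return {cls: total / (n_classes * count) for cls, count in counts.items()}
```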

Accomplishments that we're proud of

  • Built a working end-to-end medical AI pipeline — from live webcam feed to condition classification — in a single hackathon.
  • Achieved meaningful classification accuracy across multiple eye conditions using a custom-trained CNN, not just an off-the-shelf model.
  • Created a clean, intuitive UI that makes a complex AI system feel approachable and non-intimidating for everyday users.
  • Successfully bridged the gap between clinical-grade AI and consumer hardware.

What we learned

  • Medical AI requires a much higher bar for data quality and preprocessing than general computer vision tasks.
  • MediaPipe is incredibly powerful for real-time facial/eye landmark detection, but fine-grained ROI extraction needs careful calibration.
  • FastAPI is an excellent choice for ML inference APIs — its async support and automatic docs made iteration fast.
  • Responsible AI framing matters: how you present uncertainty and recommendations in a health context has real implications for user behavior.
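The async-handling lesson can be sketched with plain asyncio: a blocking CNN forward pass is offloaded to a thread pool so the event loop stays free to serve other requests, which is essentially what an async FastAPI handler does when it awaits blocking work. Here `run_model` is a stand-in for the real inference call, and its return value is a placeholder:

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

_executor = ThreadPoolExecutor(max_workers=2)

def run_model(frame):
    # Placeholder for the CPU-bound CNN forward pass.
    time.sleep(0.01)
    return {"label": "normal", "confidence": 0.97}

async def classify(frame):
    # Offload the blocking inference so the event loop is not stalled;
    # an async FastAPI route can await this in the same way.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(_executor, run_model, frame)
```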

What's next for EyeQ

  • Expanded condition library: Adding detection for diabetic retinopathy, glaucoma indicators, and dry eye syndrome.
  • Longitudinal tracking: Letting users monitor changes in their eye health over time with trend visualizations.
  • Mobile app: Bringing EyeQ to iOS and Android using the device's front camera for wider accessibility.
  • Clinician dashboard: A companion tool where users can share their EyeQ report directly with their eye doctor before an appointment.
  • Regulatory pathway: Exploring FDA SaMD (Software as a Medical Device) guidelines to bring EyeQ toward clinical validation.

Built With

  • fastapi
  • javascript
  • mediapipe
  • pytorch
  • vite
