Inspiration

AidLens Universal was inspired by seeing how many people, especially in rural or low-income areas, lack access to early healthcare. Simple issues like skin infections, eye irritation, or dehydration often go unnoticed until they become serious. With smartphones now widely available and equipped with high-quality cameras, it seemed possible to create a tool that empowers anyone to detect health issues early, even without visiting a clinic.

What it does

AidLens Universal turns a smartphone into a personal health companion. Users can take photos of their skin, eyes, nails, or wounds, and the app:

Detects possible early-stage conditions

Provides first-aid guidance

Tracks progress over time

Works offline for communities with limited internet access

Suggests nearby affordable healthcare facilities if needed
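The features above can be summarized as a single per-scan record that the app stores in the user's health log. A minimal sketch, with a hypothetical schema (field names and example values are illustrative, not the app's actual data model):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ScanResult:
    """One analyzed photo in the user's health log (hypothetical schema)."""
    body_area: str    # e.g. "skin", "eye", "nail", "wound"
    condition: str    # top predicted condition label
    confidence: float # model confidence in [0, 1]
    guidance: str     # first-aid instructions shown to the user
    taken_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# "Tracks progress over time" then amounts to keeping a timestamp-ordered
# list of these records that the app can display or compare.
log = [ScanResult("skin", "mild rash", 0.82, "Keep the area clean and dry.")]
```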

How we built it

Dataset research: Collected publicly available images for skin, eye, nail, and wound conditions.

AI model development: Used lightweight CNNs (MobileNet) optimized for mobile and offline use.
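However the backbone is built, the classifier head ends the same way: raw per-condition scores are mapped to probabilities with a softmax, and the top label is reported. A minimal NumPy sketch of that final step (the labels and scores here are hypothetical, not from the trained model):

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Convert raw per-condition scores into probabilities (numerically stable)."""
    shifted = logits - logits.max()  # subtract max to avoid overflow in exp
    exps = np.exp(shifted)
    return exps / exps.sum()

# Hypothetical condition labels and raw scores from the classifier head.
labels = ["healthy", "fungal_infection", "eczema"]
probs = softmax(np.array([0.2, 2.1, 0.5]))
top_label = labels[int(np.argmax(probs))]
```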

App workflow design: A simple flow of capture → analyze → results → guidance → track progress.
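That flow can be sketched end to end as one function. Every body here is a placeholder stand-in (including the `analyze` output and the `GUIDANCE` lookup), shown only to make the pipeline's shape concrete:

```python
def analyze(image_bytes: bytes) -> dict:
    """Stand-in for on-device model inference on a captured photo."""
    return {"condition": "minor skin irritation", "confidence": 0.77}

# Hypothetical condition -> first-aid text lookup.
GUIDANCE = {
    "minor skin irritation": "Wash gently with clean water; avoid scratching.",
}

def run_flow(image_bytes: bytes, health_log: list) -> dict:
    result = analyze(image_bytes)                        # analyze
    result["guidance"] = GUIDANCE.get(                   # guidance
        result["condition"], "Consult a healthcare provider.")
    health_log.append(result)                            # track progress
    return result                                        # results shown to user

log: list = []
shown = run_flow(b"fake-image-bytes", log)
```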

Offline implementation: On-device AI processing ensures usability without internet.
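The offline-first idea is that inference never touches the network; connectivity is only an optional extra for syncing the log later. A small sketch of that decision logic, with `is_online`, `classify_locally`, and the sync queue all as hypothetical stand-ins:

```python
def is_online() -> bool:
    """Stand-in connectivity check; pretend we are in a no-coverage area."""
    return False

def classify_locally(image_bytes: bytes) -> str:
    """Stand-in for running the on-device model; needs no network at all."""
    return "possible eye irritation"

def scan(image_bytes: bytes, pending_sync: list) -> str:
    result = classify_locally(image_bytes)  # always works offline
    pending_sync.append(result)             # queue for later sync
    if is_online():
        pending_sync.clear()                # stand-in for upload-then-empty
    return result

queue: list = []
result = scan(b"img", queue)
```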

Guidance content: Collaborated with medical advisors to create clear, actionable instructions.

Challenges we ran into

Limited datasets for certain conditions, requiring data augmentation and careful preprocessing.
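Augmentation of this kind typically means generating label-preserving variants of each image, such as flips and brightness jitter. A minimal NumPy sketch (the specific transforms and ranges are illustrative, not the project's exact preprocessing pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

def augment(image: np.ndarray) -> np.ndarray:
    """Return a randomly flipped / brightness-jittered copy of an HxWx3 image."""
    out = image.astype(np.float32)
    if rng.random() < 0.5:
        out = out[:, ::-1, :]            # horizontal flip
    out *= rng.uniform(0.8, 1.2)         # brightness jitter
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.full((4, 4, 3), 128, dtype=np.uint8)
aug = augment(img)
```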

Balancing model accuracy with device performance, as large models were too slow for offline use.
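One common way to shrink a model for on-device use is post-training quantization: storing weights as int8 plus a float scale, cutting size roughly 4x versus float32 at a small accuracy cost. A minimal NumPy sketch of the symmetric scheme (illustrative only, not the project's exact quantization setup):

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric int8 quantization: int8 values plus one float32 scale."""
    scale = float(np.abs(weights).max()) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights for inference-time arithmetic."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.27, 0.0, 1.27], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)  # close to w, stored in a quarter of the bytes
```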

Designing user-friendly instructions for people with different literacy levels and languages.

Ensuring user privacy by keeping all processing on-device.

Accomplishments that we're proud of

Built a functional prototype capable of detecting multiple conditions.

Integrated offline AI processing for real-world use.

Developed culturally accessible first-aid guidance.

Created a visual health log to track progress over time.

What we learned

Mobile AI can be powerful and practical for real-world healthcare problems.

Designing for underserved communities requires simplicity, clarity, and offline capability.

Balancing technical feasibility, accuracy, and performance is essential for mobile apps.

Collaboration between AI, mobile development, and medical expertise is critical for impact.

What's next for AidLens Universal

Expand datasets to improve detection accuracy for more conditions.

Integrate multilingual support for global accessibility.

Add preventive care tips and reminders for daily health checks.

Explore partnerships with clinics and NGOs to reach more underserved communities.

Conduct pilot testing in rural areas to refine usability and effectiveness.
