Dr. Mole — AI-Powered Skin Health Companion

Inspiration

Skin cancer is one of the most common cancers in the United States, yet it is also one of the most preventable and treatable when caught early. The statistics are sobering: survival rates are approximately 99% when the disease is caught early, dropping to around 27% when it is detected late.

That 72-percentage-point gap is not a medical mystery; it is an access and awareness problem. Most people do not know what a dangerous mole looks like, do not have easy access to a dermatologist, and do not have a consistent way to track changes in their skin over time.

Dr. Mole was inspired by a simple question: what if your phone could be the first line of defense? Not as a replacement for a doctor, but as a tool that lowers the barrier to noticing something wrong and taking action before it becomes a crisis.


What it does

Dr. Mole is a mobile-first skin health companion built with React Native and Expo. It provides users with:

  • Dual AI-powered lesion analysis: Photograph a mole and receive an instant risk assessment through two independent AI pipelines — a large vision-language model evaluating the clinical ABCDE criteria (Asymmetry, Border, Color, Diameter, Evolution), and a dedicated melanoma classification model that produces a per-class probability distribution.
  • Personalized risk profile: A five-question onboarding quiz covering skin type, family history, sun exposure, sunscreen use, and sunburn history produces a scored risk profile (Low / Moderate / High) with tailored recommendations.
  • Interactive body map: Pin and track mole locations on an anatomical body silhouette. Coordinates are stored in a resolution-independent normalized form so they remain accurate across device sizes and orientations.
  • Scan history with comparison: Browse all scans in a grid or grouped by mole. Select any two scans to run a side-by-side comparison that highlights changes over time. Batch-delete individual scans or entire moles from history.
  • Live UV index tracking: The home screen pulls real-time UV index data based on the device's location and surfaces exposure risk to help users make informed decisions about sun protection before they go outside.
  • Doctor summary reports: Generate a formatted clinical summary for any mole — including ABCDE observations, urgency rating, and scan timeline — ready to share with a dermatologist.
  • Dermatologist hub: Access a clinic directory with filtering by insurance and telehealth availability, a symptom triage wizard, a skin type quiz, appointment request forms, and a doctor notes log.
  • Cost transparency tools: A cost estimator with insurance-aware breakdowns helps users understand out-of-pocket expectations before booking.
  • Professional resources: Curated links to the AAD locator, ZocDoc, and the Skin Cancer Foundation.
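As a rough sketch of how the five-question onboarding quiz could map answers to a Low / Moderate / High tier: the question keys, per-answer weights, and thresholds below are illustrative assumptions on our part, not the app's actual scoring.

```python
def score_risk_profile(answers: dict) -> str:
    """Map the five quiz answers (each assumed scored 0-2) to a risk tier.

    Keys and thresholds are hypothetical, chosen only to illustrate
    the Low / Moderate / High bucketing described above.
    """
    total = sum(answers.get(key, 0) for key in (
        "skin_type", "family_history", "sun_exposure",
        "sunscreen_use", "sunburn_history",
    ))
    if total <= 3:
        return "Low"
    if total <= 6:
        return "Moderate"
    return "High"
```

The tailored recommendations shown to the user would then key off the returned tier.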

How we built it

The technical stack was chosen for development speed and cross-platform reach:

| Layer | Technology |
| --- | --- |
| Framework | React Native + Expo SDK 54 |
| Navigation | React Navigation (Stack + Material Top Tabs) |
| Persistence | AsyncStorage |
| UI | Custom design system (dark theme, teal accent) |
| ABCDE Vision Analysis | Groq API — meta-llama/llama-4-scout-17b-16e-instruct |
| ML Skin Classification | FastAPI + PyTorch + HuggingFace (SeyedAli/Melanoma-Classification) |
| UV Exposure Tracking | Open-Meteo API (location-aware, no API key required) |

Dual AI pipeline. Each scan triggers two independent requests in parallel. The Groq vision model receives the image as a base64-encoded JPEG alongside a structured prompt that enforces strict JSON output covering all five ABCDE fields, an urgency rating, and a plain-language summary. Simultaneously, the image is forwarded to a local FastAPI server that runs a fine-tuned HuggingFace image classification model. The backend applies softmax to the model logits, sanitizes non-finite values, and returns a full per-class probability distribution alongside the top prediction. Both results are stored together in the scan record and surfaced independently on the results screen.
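The backend's probability step can be sketched in plain Python. This mirrors the softmax-plus-sanitization behavior described above, using the standard library instead of PyTorch for brevity; the function name is ours, not the actual server code.

```python
import math

def softmax_sanitized(logits: list[float]) -> list[float]:
    """Turn raw model logits into a per-class probability distribution.

    Non-finite values (NaN/inf) are replaced with 0.0 before the
    softmax so the JSON response always contains valid numbers.
    Subtracting the max logit keeps exp() numerically stable.
    """
    clean = [x if math.isfinite(x) else 0.0 for x in logits]
    peak = max(clean)
    exps = [math.exp(x - peak) for x in clean]
    total = sum(exps)
    return [e / total for e in exps]
```

The top prediction is then simply the index of the largest probability, returned alongside the full distribution.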

Body map coordinate normalization. Pin placement uses a coordinate normalization strategy to remain resolution-independent. Pin coordinates are stored as fractional values relative to the image container, then projected back to pixel coordinates at render time. Image dimensions are computed once at module load from the device screen size and the image aspect ratio, ensuring pins remain accurate across all device sizes and orientations.
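The normalization strategy reduces to two small projections, shown here as a minimal sketch (function names are illustrative, not the app's actual code):

```python
def to_normalized(px: float, py: float, width: float, height: float) -> tuple[float, float]:
    """Store a pin as fractions of the image container (resolution-independent)."""
    return px / width, py / height

def to_pixels(nx: float, ny: float, width: float, height: float) -> tuple[float, float]:
    """Project a stored pin back to pixel coordinates for the current container."""
    return nx * width, ny * height
```

Because only the fractional coordinates are persisted, a pin placed on a small phone screen lands on the same anatomical spot when rendered on a larger tablet.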


Challenges we ran into

  1. Training time: improving accuracy meant larger datasets and more epochs, which pushed training runs to 4–5 hours. Fewer photos and epochs trained faster but were less accurate; more photos and epochs were slightly more accurate but prohibitively time-consuming.
  2. We originally tried training our own model, but the slow feedback-and-iteration cycle led us to adopt a pretrained model hosted online instead.
  3. Implementing the backend with FastAPI.

What's next for Dr. Mole

  • PDF export: Generate a doctor-ready PDF report of a mole's full scan history, ABCDE observations, and urgency timeline — formatted for clinical handoff rather than just screen sharing.
  • Monitoring reminders: Scheduled push notifications that remind users to re-photograph a mole after four to six weeks, closing the loop between scans and keeping the history meaningful.
  • Secure cloud backend: Move from on-device AsyncStorage to a HIPAA-aligned cloud backend so scan history persists across devices and can be shared directly with a care provider's patient portal.
  • On-device ML inference: Replace the FastAPI server with a CoreML / TFLite converted version of the classification model so the full pipeline runs entirely on the device — no local server required.

Disclaimer: Dr. Mole is intended for educational tracking and screening assistance only. It is not a substitute for professional medical advice, diagnosis, or treatment. Always consult a licensed dermatologist for evaluation of any skin concern.
