Inspiration

I am a software engineering student from Lesotho, a small landlocked country in Africa where rural clinics are understaffed, specialists are hours away, and community health workers make life-and-death decisions with paper rulers and basic photographs.

Two crises moved me deeply enough to build something real.

First — chronic wounds. Diabetic ulcers, pressure sores, and burns affecting millions of rural Africans are being mismanaged because health workers have no structured way to assess severity, track healing over time, or know when to refer urgently. Thousands of preventable amputations happen every year not because the wounds were untreatable but because they were never properly assessed.

Second — cervical cancer kills over 70,000 African women every year. Not because it cannot be treated. Because it is caught too late. Rural women are never screened. Community health workers visit them regularly but have no AI-powered decision support to flag high risk cases for urgent referral.

I did not read about these problems in a research paper. I come from here. That is the foundation this project is built on.

What it does

Vantage3D × CervixAI is a fully working clinical AI platform that gives authorized frontline health workers structured triage, longitudinal patient records, and accountable documentation, built to work where connectivity and specialists are scarce.

Wound Analysis with Multi-Angle Intelligence A health worker uploads 1 to 6 wound photos from different angles. Azure Computer Vision Image Analysis 4.0 processes all images simultaneously and returns a multi-image aggregated result including infection risk percentage, severity score out of 10, tissue breakdown across four categories — granulation, necrotic, slough, and epithelial — a clear recommendation badge (Monitor / Refer / Refer Urgently), and a "Why this score?" explainability section showing the top Azure visual feature tags and their confidence strengths. Image quality is assessed in real time — low resolution warnings, blur detection, and variation across angles are flagged before saving.

CervixAI Screening A health worker uploads a VIA image. Azure AI returns a risk level — Low, Moderate, or High — with confidence percentage, clinical finding, recommended action, and an auto-generated referral document that downloads immediately for high risk cases.

Patient Treatment Book Every patient has a complete longitudinal record. Each session is saved chronologically with the uploaded images, AI findings, tissue breakdown, recommendation, and a follow-up comparison against the previous visit — "Infection risk: 65% → 35% (Improving)" or "55% → 55% (Stable)." A nurse can scroll through a patient's entire healing journey from first visit to latest session in one place.
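The follow-up comparison line could be produced by a helper along these lines (a minimal sketch; the function name `compareSessions` and its percent inputs are assumptions, not the app's actual API):

```javascript
// Hypothetical sketch of the treatment book's follow-up comparison.
// Takes the previous and current infection-risk percentages and
// formats the trend line shown to the nurse.
function compareSessions(prevRisk, currRisk) {
  const trend =
    currRisk < prevRisk ? "Improving" :
    currRisk > prevRisk ? "Worsening" :
    "Stable";
  return `Infection risk: ${prevRisk}% → ${currRisk}% (${trend})`;
}
```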

Healing Heatmap Visual timeline showing wound healing progress across sessions — bars going from red to green as infection risk decreases over time.
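The red-to-green bars can be driven by a simple hue interpolation; this is one plausible sketch, not the app's actual rendering code:

```javascript
// Map infection risk (0–100%) onto a red-to-green HSL hue:
// 0% risk → hue 120 (green); 100% risk → hue 0 (red).
function riskColour(riskPct) {
  const hue = Math.round((100 - riskPct) * 1.2);
  return `hsl(${hue}, 70%, 45%)`;
}
```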

Clinical Governance Built In Staff sign-in with name and ID stored per device. Pre-save checklists confirming patient identity, consent, and image quality acknowledgment. Full audit trail of who saved what and when. Two-tier workflow allowing community health workers to submit sessions for supervisor review and approval. A dedicated Judge Brief page explaining the app's safety posture, AI limitations, and non-diagnostic framing.

Offline First Architecture The entire app runs in a browser with no server required. All patient data saves to localStorage on the device — no cloud, no internet needed for records, heatmap, or patient history. Azure AI analysis queues automatically when offline and retries when connectivity returns. A queue counter in the footer shows pending analyses at all times.
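The queue-and-retry behaviour can be sketched like this. The real app persists to localStorage; a plain object stands in here so the logic runs anywhere, and the key name "pendingAnalyses" is an assumption:

```javascript
// In-memory stand-in for localStorage (same getItem/setItem shape).
const store = {
  data: {},
  getItem(k) { return this.data[k] ?? null; },
  setItem(k, v) { this.data[k] = v; },
};

// Queue a failed analysis; the returned length drives the footer counter.
function enqueueAnalysis(job) {
  const q = JSON.parse(store.getItem("pendingAnalyses") || "[]");
  q.push(job);
  store.setItem("pendingAnalyses", JSON.stringify(q));
  return q.length;
}

// Retry every queued analysis; jobs that still fail stay queued.
async function flushQueue(analyse) {
  const q = JSON.parse(store.getItem("pendingAnalyses") || "[]");
  const remaining = [];
  for (const job of q) {
    try { await analyse(job); }
    catch { remaining.push(job); } // still offline: keep it queued
  }
  store.setItem("pendingAnalyses", JSON.stringify(remaining));
  return remaining.length;
}
// Browser wiring: window.addEventListener("online", () => flushQueue(callAzure));
```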

Bilingual Support Key safety guidance and clinical language display in both English and Sesotho — the national language of Lesotho — making the tool accessible to health workers more comfortable in their mother tongue.

Alerts and Review System Refer and urgent cases automatically appear in the Alerts panel with priority colour coding. A separate Review queue allows supervisors to approve submitted sessions before they are marked final.

How I built it

Vantage3D is a static single-page web application — no server, no installation, no database setup. A health worker opens one HTML file in Chrome and the tool works immediately on any laptop.

Tech stack:

  • Frontend: HTML, CSS, JavaScript — single page application with tab navigation
  • AI: Microsoft Azure Computer Vision Image Analysis 4.0 — images sent as binary ArrayBuffer streams, region-aware feature selection
  • Persistence: Browser localStorage with schema versioning, device ID, and automatic data migration across versions
  • Offline: Progressive Web App with service worker caching the app shell; analysis queue in localStorage with manual and automatic retry
  • No backend by design: a clinic in rural Lesotho cannot rely on a server. Patient data never leaves the device.
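The schema-versioned persistence mentioned above might look like the following sketch, where records migrate forward one version at a time on load (version numbers and field names are illustrative assumptions):

```javascript
// Current schema version the app writes.
const SCHEMA_VERSION = 3;

// Each entry upgrades a stored record by exactly one version.
const migrations = {
  1: (db) => ({ ...db, deviceId: db.deviceId ?? "unset", version: 2 }),
  2: (db) => ({ ...db, patients: db.patients ?? [], version: 3 }),
};

// Parse the raw localStorage string and apply any pending migrations.
function loadDatabase(raw) {
  let db = JSON.parse(raw ?? '{"version":1}');
  while (db.version < SCHEMA_VERSION) {
    db = migrations[db.version](db); // one step at a time
  }
  return db;
}
```

Stepwise migrations mean a device that skipped several app versions still upgrades cleanly, since each migration only needs to know about the version directly before it.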

Key architectural decision: I chose localStorage over any cloud database deliberately. Privacy, offline reliability, and zero infrastructure cost for rural clinics were more important than convenience. The sync scaffold for future optional cloud backup exists, but the core tool works without it permanently.

Challenges I ran into

Azure region limitations. South Africa North does not support the Caption feature of Image Analysis 4.0. I built automatic retry logic that detects the 400 error on the first request, removes caption from the feature list, and retries with tags and objects — returning a 200 response with useful clinical data. This happens transparently with no disruption to the user.
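The fallback can be sketched as below. The endpoint path and `api-version` follow the Image Analysis 4.0 REST API; `doFetch` is injected here so the logic is testable (the app would pass `window.fetch`):

```javascript
// Analyse an image, dropping "caption" and retrying if the region rejects it.
async function analyzeWithFallback(doFetch, endpoint, key, imageBytes) {
  const call = (features) =>
    doFetch(
      `${endpoint}/computervision/imageanalysis:analyze` +
        `?api-version=2023-10-01&features=${features}`,
      {
        method: "POST",
        headers: {
          "Ocp-Apim-Subscription-Key": key,
          "Content-Type": "application/octet-stream",
        },
        body: imageBytes,
      }
    );

  let res = await call("caption,tags,objects");
  if (res.status === 400) {
    // Region (e.g. South Africa North) rejects "caption": drop it and retry.
    res = await call("tags,objects");
  }
  return res.json();
}
```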

Binary image upload. Azure Image Analysis 4.0 stream API requires raw binary data as octet-stream. Getting a browser File object correctly converted to ArrayBuffer without base64 encoding errors required careful debugging of the fetch pipeline.
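The working upload path reduces to a few lines once the conversion is right. In this sketch, `file` is a File or Blob from an `<input type="file">`, and `fetchFn` is injected for testability where the app would use `window.fetch`:

```javascript
// Send the raw image bytes as octet-stream — no base64 step anywhere.
async function uploadWoundImage(file, url, key, fetchFn) {
  const bytes = await file.arrayBuffer(); // File/Blob → ArrayBuffer
  return fetchFn(url, {
    method: "POST",
    headers: {
      "Ocp-Apim-Subscription-Key": key,
      "Content-Type": "application/octet-stream",
    },
    body: bytes, // the ArrayBuffer is sent as-is
  });
}
```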

Multi-image aggregation. Combining results from up to 6 different wound angle photos into one meaningful clinical summary required building a scoring model that aggregates Azure tag confidence scores across images and produces a single weighted infection risk and tissue breakdown.
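The aggregation idea can be illustrated like this: weight each Azure tag's confidence by a clinical relevance factor, score each angle, then average across angles. The tag names and weights below are invented for illustration and are not the app's actual scoring model:

```javascript
// Hypothetical relevance weights per Azure tag name.
const RISK_WEIGHTS = { wound: 0.6, blood: 0.9, skin: 0.2 };

// perImageTags: one array of { name, confidence } tags per wound angle.
function aggregateRisk(perImageTags) {
  const scores = perImageTags.map((tags) =>
    tags.reduce(
      (sum, t) => sum + (RISK_WEIGHTS[t.name] ?? 0) * t.confidence,
      0
    )
  );
  const mean = scores.reduce((a, b) => a + b, 0) / scores.length;
  return Math.round(Math.min(mean, 1) * 100); // single infection risk %
}
```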

Clinical credibility versus technical excitement. The temptation in AI health projects is to overclaim. I deliberately framed every result as AI-assisted decision support, not diagnosis. Every result panel says "not a diagnosis" and escalation language is built into every recommendation. Building governance features that earn trust from medical professionals mattered more than impressing judges with accuracy numbers.

Building for the actual user. A nurse in a rural clinic is not a developer. Every technical term — Azure, API, localStorage, confidence band — had to be hidden or translated into plain clinical language. The interface had to work for someone whose only goal is to upload a photo and know what to do next.

Accomplishments that I'm proud of

The treatment book. A complete longitudinal patient record with images, findings, follow-up comparisons, and improvement percentages across sessions. A nurse can show a patient their own healing journey visually. To my knowledge, that feature does not exist in any rural African wound care tool available today.

Real multi-angle wound analysis. Uploading 4 photos and getting a multi-image aggregated clinical result with per-image quality warnings, confidence bands, and explainability — built on top of a general purpose vision API with no custom medical model training.

Offline queue. Failed analyses when internet drops do not disappear. They queue, show a counter in the footer, and retry automatically or manually when connectivity returns. In a rural clinic this is the difference between data loss and data safety.

Bilingual clinical copy. Seeing clinical safety guidance appear in Sesotho in a software tool built by someone from Lesotho felt like the most personally meaningful technical achievement in the entire project.

Shipping a working product. Not a mockup. Not a demo that only works under perfect conditions. A real app that analyses real wound images from multiple angles, saves real patient records with images, generates real referral documents, tracks healing over time, and works offline. Built during a university break by one student.

What I have learned

The hardest engineering problem is not the code. It is understanding the user so deeply that you make the right decisions about what to build and what to leave out.

Building with AI assistance is not cheating. It is the future of software engineering. The skill is not memorising syntax — it is knowing what problem to solve, how to direct tools intelligently, how to read and debug output, and how to make decisions about architecture that serve real users.

Error codes are your friends. 410 Gone told me the API endpoint was deprecated. 400 Bad Request told me the feature was unsupported in our region. Reading those responses and understanding what they meant turned frustrating failures into fast fixes.

Clinical governance is not optional in health technology. Trust from medical professionals comes from audit trails, checklists, non-diagnostic framing, and honest confidence communication — not from impressive accuracy numbers alone.

Being from a small overlooked country is not a disadvantage. It is the deepest form of user research. I did not need to interview anyone about rural clinic conditions. I come from here.

What's next for Vantage3D

Phase 2 — JavaFX Desktop Application Rebuild Vantage3D as an installed desktop application using JavaFX. Integrate live Logitech C920 webcam capture directly in the app for standardized clinical image capture — consistent framing, consistent distance, consistent lighting across every session for every patient.

Phase 3 — Real 3D Wound Reconstruction Implement Structure from Motion using BoofCV. A nurse orbits the C920 around a wound for 10 seconds. The engine reconstructs the wound in three dimensions and calculates exact volume in cubic centimetres, maximum depth in millimetres, and surface area — measurements currently requiring a $20,000 medical laser scanner. This is the breakthrough that makes Vantage3D genuinely irreplaceable in rural healthcare.

Phase 4 — Full Sesotho, Zulu, and Xhosa Voice Instructions Complete voice guidance in Southern African languages so community health workers can use the tool entirely in their mother tongue.

Phase 5 — Clinical Validation Partner with physiotherapists and clinics in Maseru for real-world validation. Collect clinical outcome data. Pursue medical device certification pathway.

The mission stays the same. Give every rural clinic the eyes of a specialist, for the cost of a laptop and an internet connection.

Built in Lesotho. Built for the World.

Built With
