Inspiration
VitalLens was built around a simple idea: a wellness check-in should feel guided, accessible, and easy to understand without requiring extra hardware.
Most people already carry a phone with a camera, flash, motion sensors, and audio support, so I wanted to explore how those built-in sensors could create a lightweight wellness-only experience.
The goal was not to build a medical device, but to create a calm guided check-in that helps users capture a quick wellness snapshot using technology they already have.
What it does
VitalLens is a mobile-first guided wellness check-in app with two phone-based checks.
1. Finger-camera pulse estimate
The user covers the rear camera and flash with their finger, and the app estimates pulse from subtle brightness changes in the camera signal caused by blood flow (the same principle as photoplethysmography).
2. Breath motion check
The user places the phone on their chest or upper abdomen, and the app uses motion data to detect breathing-related movement.
After both checks, VitalLens creates a report with:
- pulse estimate
- breath motion result
- signal quality
- sample duration
- AI wellness summary
The report uses IBM watsonx to interpret structured pulse and breath data and generate a short wellness-focused summary centered on:
- session quality
- signal reliability
- breath consistency
- practical next steps
VitalLens is not a medical device. It does not diagnose, treat, or make medical decisions.
How I built it
I built VitalLens with Next.js, React, and TypeScript, and deployed it on Vercel.
For the pulse check, I used:
- rear camera stream access
- torch / flash support
- finger-camera signal analysis
- local signal quality tracking
The app also measures sample duration and signal stability so the result is not based on a single instantaneous reading; a simplified capture sketch follows.
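A minimal sketch of what that capture loop can look like, assuming the stream is drawn to a small offscreen canvas; the `onSample` callback is illustrative, and the torch constraint is non-standard, so it is guarded:

```ts
// Sketch: open the rear camera, try to enable the torch, and sample
// average red-channel brightness each frame (the raw PPG-like signal).
// `onSample` is a hypothetical callback; names are illustrative.
async function startPulseCapture(
  video: HTMLVideoElement,
  onSample: (brightness: number, t: number) => void
) {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "environment" },
  });
  video.srcObject = stream;
  await video.play();

  // Torch is non-standard; guard it and fail softly on unsupported devices.
  const track = stream.getVideoTracks()[0];
  try {
    await track.applyConstraints({ advanced: [{ torch: true } as any] });
  } catch {
    /* torch unavailable: the finger-covered frame is still usable */
  }

  const canvas = document.createElement("canvas");
  canvas.width = 64;
  canvas.height = 64;
  const ctx = canvas.getContext("2d", { willReadFrequently: true })!;

  const sample = () => {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);
    let red = 0;
    for (let i = 0; i < data.length; i += 4) red += data[i]; // RGBA: red channel
    onSample(red / (data.length / 4), performance.now());
    requestAnimationFrame(sample);
  };
  requestAnimationFrame(sample);
}
```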
For the breath check, I used the DeviceMotion API to capture small chest-related movements during a guided 30-second session.
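A sketch of that motion capture, assuming the iOS-style permission gate that WebKit requires; the sample shape and names are illustrative:

```ts
// Sketch: request motion access (required on iOS Safari), then buffer
// acceleration samples for a fixed guided session.
interface MotionSample {
  t: number; // ms since session start
  z: number; // acceleration along the axis facing the chest
}

async function recordBreathMotion(durationMs = 30_000): Promise<MotionSample[]> {
  // iOS 13+ gates DeviceMotion behind an explicit permission prompt.
  const DM = DeviceMotionEvent as any;
  if (typeof DM.requestPermission === "function") {
    const state = await DM.requestPermission();
    if (state !== "granted") throw new Error("Motion permission denied");
  }

  const samples: MotionSample[] = [];
  const start = performance.now();
  const onMotion = (e: DeviceMotionEvent) => {
    const z = e.accelerationIncludingGravity?.z ?? 0;
    samples.push({ t: performance.now() - start, z });
  };
  window.addEventListener("devicemotion", onMotion);

  await new Promise((r) => setTimeout(r, durationMs));
  window.removeEventListener("devicemotion", onMotion);
  return samples;
}
```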
Because users may not be looking at the screen during this process, I added voice guidance to improve usability.
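The voice prompts map naturally onto the Web Speech API's speech synthesis; a small helper along these lines is enough (the helper name and wording are illustrative):

```ts
// Sketch: speak a guidance prompt, cancelling anything already queued
// so instructions never overlap.
function speak(text: string) {
  if (!("speechSynthesis" in window)) return; // fail silently if unsupported
  window.speechSynthesis.cancel();
  const utterance = new SpeechSynthesisUtterance(text);
  utterance.rate = 0.95; // slightly slower for calm, clear guidance
  window.speechSynthesis.speak(utterance);
}

speak("Place the phone flat on your chest and breathe normally.");
```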
The Report screen combines local Pulse and Breath results, prepares structured telemetry, and sends it to a secure server-side API route connected to IBM watsonx.
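The actual payload schema isn't shown here, but a plausible sketch of the telemetry type and the client-side call might look like this (all field names are assumptions):

```ts
// Sketch: the structured, wellness-only telemetry sent to the server
// route. Field names are illustrative, not the app's actual schema.
interface WellnessTelemetry {
  pulseBpm: number | null;
  pulseSignalQuality: "good" | "fair" | "poor";
  sampleDurationSec: number;
  breathMotionDetected: boolean;
  breathConsistency: number; // 0..1, higher = steadier motion
}

async function requestSummary(telemetry: WellnessTelemetry): Promise<string> {
  const res = await fetch("/api/summary", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(telemetry),
  });
  if (!res.ok) throw new Error(`Summary request failed: ${res.status}`);
  const { summary } = await res.json();
  return summary;
}
```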
IBM watsonx integration
VitalLens uses IBM watsonx through a server-side API route.
The app sends structured wellness-only data including:
- pulse and breath results
- signal traces
- sample duration
- basic signal statistics
- breath motion consistency
Instead of simply repeating the visible numbers, the watsonx summary interprets the quality of the session and generates a short, understandable explanation of the results.
The AI summary is intentionally restricted to safe wellness language (see the prompt sketch after this list) and avoids:
- diagnosis
- treatment recommendations
- clinical claims
- “normal” or “abnormal” medical wording
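One place to enforce those restrictions is the server route's prompt. Below is a minimal sketch of a Next.js route handler, assuming the watsonx.ai text-generation REST endpoint; the model ID, environment variable names, and endpoint version are assumptions, not the app's actual configuration:

```ts
// app/api/summary/route.ts (sketch). Endpoint shape, model ID, and env
// var names are assumptions, not the app's actual configuration.
import { NextResponse } from "next/server";

const GUARDRAILS =
  "You are a wellness assistant. Summarize session quality, signal " +
  "reliability, breath consistency, and practical next steps. Do NOT " +
  "diagnose, recommend treatment, make clinical claims, or label " +
  "results as normal or abnormal.";

export async function POST(req: Request) {
  const telemetry = await req.json();

  const res = await fetch(
    `${process.env.WATSONX_URL}/ml/v1/text/generation?version=2023-05-29`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.WATSONX_IAM_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model_id: "ibm/granite-13b-instruct-v2",
        project_id: process.env.WATSONX_PROJECT_ID,
        input: `${GUARDRAILS}\n\nSession data:\n${JSON.stringify(telemetry)}`,
        parameters: { max_new_tokens: 250 },
      }),
    }
  );

  if (!res.ok) {
    return NextResponse.json({ error: "AI summary unavailable" }, { status: 502 });
  }
  const data = await res.json();
  return NextResponse.json({ summary: data.results?.[0]?.generated_text ?? "" });
}
```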
Challenges I ran into
The hardest challenge was making the experience reliable on mobile Safari.
Camera permissions, torch behavior, motion permissions, live video streams, and audio guidance all behave differently across mobile browsers and devices.
Another challenge was making the experience feel trustworthy: I wanted the breath waveform to reflect real motion data rather than a decorative, faked animation.
I also had to carefully handle IBM watsonx integration issues (see the fallback sketch after this list), including:
- environment variables
- model availability
- malformed JSON responses
- fallback behavior when AI services fail
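The last two points come down to never trusting the model's output shape. A sketch of the defensive pattern (function names illustrative):

```ts
// Sketch: tolerate malformed AI responses and fall back to a locally
// generated summary so the report never depends on the AI being up.
function localFallbackSummary(telemetry: { pulseSignalQuality: string }): string {
  return `Session recorded. Signal quality: ${telemetry.pulseSignalQuality}. ` +
    "The AI summary is temporarily unavailable.";
}

async function safeSummary(telemetry: any): Promise<string> {
  try {
    const res = await fetch("/api/summary", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(telemetry),
    });
    if (!res.ok) return localFallbackSummary(telemetry);
    const text = await res.text();
    const parsed = JSON.parse(text); // may throw on malformed JSON
    return typeof parsed.summary === "string" && parsed.summary.length > 0
      ? parsed.summary
      : localFallbackSummary(telemetry);
  } catch {
    return localFallbackSummary(telemetry);
  }
}
```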
Accomplishments that I'm proud of
I’m proud that VitalLens combines multiple built-in phone capabilities into one guided experience:
- camera
- flash
- motion sensors
- voice guidance
- local signal processing
- IBM watsonx summaries
I also designed the app so local wellness results still work even if the AI summary becomes unavailable, making the experience more reliable and resilient.
What I learned
This project taught me a lot about mobile browser limitations, especially:
- iPhone Safari permissions
- camera stream handling
- torch support
- DeviceMotion access
- audio playback behavior
I also learned how important careful AI wording becomes when working with wellness-related data.
The app needs to remain useful and informative without crossing into medical claims.
What's next for VitalLens
Next, I plan to:
- improve downloadable report exports
- generate shareable summary images
- test across more devices
- improve signal quality handling
- refine IBM watsonx summaries using richer telemetry
The long-term goal is to continue exploring how everyday phone sensors can create more accessible wellness experiences.
Source code
GitHub repository:
https://github.com/arliking13/vitallens
The project repository is used as the primary source code reference because the full project archive exceeded direct upload limits.
Built With
- camera-api
- css
- devicemotion-api
- ibm-cloud
- ibm-watsonx
- mediadevices-api
- next.js
- react
- typescript
- vercel
- web-speech-api


