Inspiration
The inspiration for VitalFlow-Radar came from the most personal place possible: my family. At Thanksgiving in 2023, we lost my grandmother. She had been living independently, and though we checked on her regularly, we had no way to monitor her health continuously. Looking back, I wonder whether we could have caught the warning signs earlier if we had better tools. That loss taught me a painful lesson about the fragility of independent living and how much families need better ways to care for their loved ones from a distance.

Then, this year, I became an uncle for the first time. Watching my brother and sister-in-law struggle with the anxiety of monitoring their newborn, constantly checking whether he was breathing, obsessively counting breaths per minute, and losing sleep over every little irregularity, I saw the same problem from a different angle. Whether you're 80 years old or 8 days old, continuous vital signs monitoring shouldn't require uncomfortable wearables or constant vigilance.

These two experiences, grief and joy, loss and new life, crystallized into a single question: what if we could monitor someone's heart rate and breathing without touching them at all? I learned that 70% of seniors want to age at home, but their families live with constant anxiety about their health. Traditional vital signs monitoring requires patients to:
- Wear uncomfortable devices 24/7
- Remember to charge batteries daily
- Deal with skin irritation from adhesive sensors
- Actively participate in their own monitoring
This creates low compliance rates and missed health events—exactly when continuous monitoring matters most. For newborns, parents are told to use wearable socks or chest bands, but these create their own anxieties and false alarms.
At the same time, drawing on my experience in radar signal processing and machine learning, I wrote algorithms that leverage mmWave radar technology, the same 77GHz radar used in autonomous vehicles, to detect the tiny chest movements from heartbeats and breathing from across the room. A heartbeat causes ~0.1mm of chest displacement, which is readily detectable by radar waves. When I realized this was possible, I knew I had to build it: for my grandmother's memory, for my nephew's future, and for every family navigating the same fears.
What it does
VitalFlow-Radar is a cloud-native contactless vital signs monitoring system that extracts heart rate and breathing rate from radar signals without any wearables or physical contact.
Core Capabilities
- Heart Rate Detection: Measures cardiac rhythm (48-150 BPM) by detecting micro-chest movements of ~0.1mm amplitude
- Breathing Rate Detection: Tracks respiratory patterns (6-30 breaths/min) from chest wall displacement
- AI-Powered Anomaly Detection: Identifies critical conditions such as:
  - Bradycardia (HR < 60 BPM)
  - Tachycardia (HR > 100 BPM)
  - Apnea (breathing pauses)
  - Tachypnea (rapid breathing > 25 BPM)
- Real-Time Dashboard: Live visualization with WebSocket streaming and health trend analysis
- Gemini AI Health Summaries: Natural language health insights generated by Google's Gemini 2.5 Flash
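The anomaly thresholds listed above can be sketched as a small rule-based classifier. This is illustrative only: the function and label names are mine, not the project's actual code, and the thresholds are the ones quoted in the list.

```python
def classify_vitals(hr_bpm, br_bpm, breathing_paused=False):
    """Map heart rate / breathing rate to anomaly labels
    (thresholds from the capability list above)."""
    anomalies = []
    if hr_bpm < 60:
        anomalies.append("bradycardia")
    elif hr_bpm > 100:
        anomalies.append("tachycardia")
    if breathing_paused:
        anomalies.append("apnea")
    elif br_bpm > 25:
        anomalies.append("tachypnea")
    return anomalies
```

For example, `classify_vitals(55, 30)` flags both bradycardia and tachypnea. In the real system the thresholds would be age-aware, as discussed later in this write-up.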
The Math Behind It
The radar transmits FMCW (Frequency Modulated Continuous Wave) signals. When these waves reflect off a person's chest, the phase shift \(\phi[k]\) encodes the displacement: $$\phi[k] = \frac{4\pi d[k]}{\lambda}$$ Where \(d[k]\) is the chest displacement and \(\lambda\) is the wavelength (~4mm at 77GHz). A heartbeat causes ~0.1mm displacement, which produces a measurable phase change of ~0.3 radians.
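Plugging the numbers from the paragraph above into the formula confirms the ~0.3 radian figure:

```python
import math

C = 3e8            # speed of light, m/s
F_CARRIER = 77e9   # radar carrier frequency, Hz

wavelength = C / F_CARRIER       # ~3.9 mm at 77GHz
displacement = 0.1e-3            # ~0.1 mm chest motion from one heartbeat

# phi = 4*pi*d / lambda
phase_rad = 4 * math.pi * displacement / wavelength   # ~0.32 rad
```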
How we built it
Architecture Overview
We transformed a monolithic MATLAB research pipeline into a distributed, production-ready Python system:
Edge Device   →   Confluent Kafka   →   FastAPI Backend   →   React Dashboard
     ↓                   ↓                      ↓
AWR1642 Radar      Cloud Streaming      Vertex AI (Gemini 2.5)
Edge Processing
Hardware: TI AWR1642 77GHz FMCW Radar + Raspberry Pi

Signal Processing Pipeline:
- Range FFT: Convert time-domain ADC samples to range bins
- MTI Filter: Moving Target Indication with \(\alpha = 0.01\) to remove static clutter
- Variance-Based Bin Selection: Automatically find the "chest bin" at 0.5-1.5m
- Phase Extraction: Unwrap and preprocess the phase signal
Why Edge Processing?: By performing DSP at the edge, we reduce bandwidth requirements from 2 Mbps to ~10 Kbps, which is critical for scaling to hundreds of devices.
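The first two pipeline stages can be sketched as follows. This is a minimal illustration, assuming 256-sample frames and the α = 0.01 update rate quoted above; the function names are mine, not the project's.

```python
import numpy as np

ALPHA = 0.01  # MTI clutter-map update rate (value from above)

def range_fft(adc_frame):
    """Convert one frame of time-domain ADC samples into complex range bins."""
    windowed = adc_frame * np.hanning(adc_frame.size)
    return np.fft.fft(windowed)

def mti_filter(range_profile, clutter_map):
    """Moving Target Indication: subtract a slowly updated static clutter map."""
    clutter_map = (1 - ALPHA) * clutter_map + ALPHA * range_profile
    return range_profile - clutter_map, clutter_map

# Demo: a perfectly static scene is suppressed once the clutter map converges
clutter = np.zeros(256, dtype=complex)
for _ in range(1000):
    profile = range_fft(np.cos(2 * np.pi * 0.1 * np.arange(256)))
    moving, clutter = mti_filter(profile, clutter)
```

After many identical frames the residual `moving` signal decays toward zero, which is exactly the behavior that removes walls and furniture while preserving chest motion.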
Cloud Streaming (Confluent Cloud)
Topics:
- vitalflow-radar-phase (10Hz raw data)
- vitalflow-vital-signs (3s aggregates)

Schema Registry: Enforces data contracts across producers/consumers
Why Kafka?: Handles multiple radar devices streaming simultaneously, provides replay capability for debugging, and enables decoupled consumers
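A producer-side sketch of what a vitalflow-vital-signs record might contain. The field names here are illustrative (the project enforces the real contract via Schema Registry), and the Kafka publish call is shown only as a comment:

```python
import json
import time

def build_vitals_message(device_id, hr_bpm, br_bpm, hr_conf, br_conf):
    """Serialize a 3-second vital-signs aggregate for the
    vitalflow-vital-signs topic (field names are illustrative)."""
    payload = {
        "device_id": device_id,
        "timestamp": time.time(),
        "heart_rate_bpm": hr_bpm,
        "breathing_rate_bpm": br_bpm,
        "hr_confidence": hr_conf,
        "br_confidence": br_conf,
    }
    return json.dumps(payload).encode("utf-8")

# With the confluent-kafka client, the bytes would then be published, e.g.:
#   producer.produce("vitalflow-vital-signs", value=build_vitals_message(...))
```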
Backend API (FastAPI)
# Async Kafka consumer + WebSocket broadcast
import json

async def consume_kafka():
    # msg.value is the JSON-encoded vital-signs payload from Kafka
    async for msg in consumer:
        vital_signs = json.loads(msg.value)
        await broadcast_to_websockets(vital_signs)
- Kafka Consumer: Subscribes to vital signs stream with auto-commit
- WebSocket Broadcast: Real-time updates to connected dashboards
- Vertex AI Integration: Triggers AI analysis on anomalies
- REST API: Health checks, alerts history, patient management
DSP Algorithms
STFT Ridge Tracking for robust frequency estimation:
- Heart Rate: 8-second Hanning windows, 90% overlap, 4096-point FFT
- Breathing Rate: 15-second windows with continuity constraints
- Ridge Tracking: Connect spectral peaks across time using ±8 BPM continuity guard
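A scaled-down, runnable version of the ridge tracker on a synthetic signal. The 10 Hz phase rate is assumed from the topic description earlier (so the 8-second window becomes nperseg=80), and the simple "hold previous estimate" guard stands in for the full continuity logic:

```python
import numpy as np
from scipy.signal import stft

FS = 10.0                                   # phase sample rate, Hz (assumed)
t = np.arange(0, 60, 1 / FS)
phase = 0.3 * np.sin(2 * np.pi * 1.2 * t)   # synthetic 72 BPM "heartbeat"

# 8 s Hanning windows, 90% overlap, 4096-point FFT (settings from above)
f, times, Zxx = stft(phase, fs=FS, window='hann',
                     nperseg=80, noverlap=72, nfft=4096)
band = (f >= 0.8) & (f <= 2.5)              # heart-rate search band

ridge_bpm, prev = [], None
for col in np.abs(Zxx[band, :]).T:
    cand = f[band][int(np.argmax(col))] * 60   # strongest peak, in BPM
    # ±8 BPM continuity guard: hold the previous estimate on a sudden jump
    if prev is not None and abs(cand - prev) > 8:
        cand = prev
    ridge_bpm.append(cand)
    prev = cand

hr_bpm = float(np.median(ridge_bpm))        # ~72 BPM
```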
MODWT Wavelet Transform for signal separation:
- Maximal Overlap Discrete Wavelet Transform using Daubechies wavelets
- Heart Band: 0.9-2.3 Hz (54-138 BPM)
- Breath Band: 0.2-0.7 Hz (12-42 breaths/min)
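To illustrate the band split, here is a stand-in using Butterworth bandpass filters rather than the project's MODWT decomposition; the band edges are the ones listed above, and the 10 Hz sample rate is assumed:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 10.0                                   # phase sample rate, Hz (assumed)
t = np.arange(0, 30, 1 / FS)
# Breathing (0.25 Hz = 15 breaths/min) plus a weaker heartbeat (1.2 Hz = 72 BPM)
sig = 1.0 * np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)

def bandpass(x, lo_hz, hi_hz, fs, order=4):
    b, a = butter(order, [lo_hz, hi_hz], btype='band', fs=fs)
    return filtfilt(b, a, x)    # zero-phase filtering

def dominant_hz(x, fs):
    spec = np.abs(np.fft.rfft(x))
    return float(np.fft.rfftfreq(x.size, 1 / fs)[int(np.argmax(spec))])

heart = bandpass(sig, 0.9, 2.3, FS)    # 54-138 BPM band
breath = bandpass(sig, 0.2, 0.7, FS)   # 12-42 breaths/min band

hr_hz = dominant_hz(heart, FS)         # ~1.2 Hz
br_hz = dominant_hz(breath, FS)        # ~0.25 Hz
```

Even though the heartbeat component is 10× weaker than breathing in the mixed signal, the band split recovers each rhythm cleanly, which is the point of the decomposition.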
Harmonic Rejection: Critical innovation to prevent breathing harmonics from corrupting heart rate:
# Reject peaks near 2nd, 3rd, 4th breathing harmonics
harmonic_freqs = [br_hz * n for n in range(2, 5)]
if any(abs(hr_candidate - hf) < 0.08 for hf in harmonic_freqs):
    reject_candidate()
AI Analysis (Vertex AI + Gemini)
# Anomaly detection triggers Gemini analysis
if anomaly.severity == "critical":
    prompt = f"""
    As a pediatric cardiologist, interpret these vital signs:
    - Heart Rate: {hr} BPM (confidence: {hr_conf})
    - Breathing Rate: {br} BPM (confidence: {br_conf})
    - Detected Anomaly: {anomaly.type}
    Provide: 1) Clinical assessment, 2) Possible causes, 3) Recommendations
    """
    summary = gemini_model.generate_content(prompt)
Fallback Logic: When Vertex AI is unavailable, rule-based summaries ensure the system never fails silently.
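That fallback might look like the following sketch; the wording and function name are illustrative, not the project's actual code:

```python
def rule_based_summary(hr, br, anomaly_type=None):
    """Deterministic health summary used when Vertex AI is unreachable."""
    if anomaly_type is None:
        return (f"Vitals within expected range: heart rate {hr:.0f} BPM, "
                f"breathing rate {br:.0f} breaths/min.")
    return (f"Alert: {anomaly_type} detected (HR {hr:.0f} BPM, "
            f"BR {br:.0f} breaths/min). AI summary unavailable; "
            f"please review the readings directly.")
```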
Frontend (React + TypeScript + Vite)
Real-time Charts: Recharts with 60-second rolling window
TailwindCSS: Responsive design for mobile/desktop/tablet
WebSocket Connection: Automatic reconnection with exponential backoff
Context API: Global state for auth and vital signs data
Dark Mode Ready: Designed for 24/7 monitoring dashboards
Tech Stack: React 18 + TypeScript + Vite + TailwindCSS + Recharts
Challenges we ran into
Challenge 1: Breathing Harmonics Corrupting Heart Rate
Problem: The 2nd harmonic of breathing (~0.5 Hz × 2 = 1.0 Hz) falls directly in the heart rate band (0.8-2.5 Hz), causing false cardiac readings. During testing, we were getting "heart rates" of exactly 60 BPM, which was just 2× the breathing rate of 30 breaths/min.
Solution: We implemented Harmonic Shielding—a guard band that rejects peaks within ±5 BPM of known breathing harmonics:
harmonic_freqs = [br_hz * n for n in range(2, 5)]  # 2nd, 3rd, 4th harmonics
for hf in harmonic_freqs:
    if abs(candidate_peak - hf) < guard_band:
        reject_peak()
This reduced false heart rate detections by 87% in our testing.
Challenge 2: Motion Artifacts
Problem: Any body movement (adjusting position, scratching, coughing) creates massive phase disturbances—up to 1000× larger than the vital signs signal. These artifacts would completely swamp our measurements.
Solution: We compute movement_energy from high-pass filtered phase variations and gate the output:
$$E_{motion} = \sum_{k=1}^{N} |\phi_{HP}[k]|^2$$
Where \(\phi_{HP}\) is the phase signal after 1Hz highpass filtering. When \(E_{motion} > 8.5 \times 10^5\), we flag is_motion_artifact = true and suppress that window's estimates. This threshold was empirically determined through testing.
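The gate can be sketched as follows, with the filter order and the 10 Hz phase sample rate assumed:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 10.0                 # phase sample rate, Hz (assumed)
MOTION_THRESHOLD = 8.5e5  # empirical threshold from above

def is_motion_artifact(phase):
    """High-pass the phase at 1 Hz, then gate on the residual energy."""
    b, a = butter(4, 1.0, btype='highpass', fs=FS)
    phase_hp = filtfilt(b, a, phase)
    e_motion = float(np.sum(np.abs(phase_hp) ** 2))
    return e_motion > MOTION_THRESHOLD

# A quiet heartbeat-scale phase signal stays under the gate;
# a large movement burst (orders of magnitude bigger) trips it.
quiet = 0.3 * np.sin(2 * np.pi * 1.2 * np.arange(300) / FS)
burst = 1000.0 * np.sin(2 * np.pi * 2.0 * np.arange(300) / FS)
```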
Challenge 3: Automatic Range Bin Selection
Problem: The subject could be anywhere from 0.3m to 1.5m from the radar. With 256 range bins, we needed to automatically find which bin contains the person—without requiring manual calibration.
Solution: Variance-maximization across valid range bins:
$$bin_{selected} = \arg\max_{i \in [d_{min}, d_{max}]} \text{Var}(|S_i[k]|)$$ The bin with maximum amplitude variance over time corresponds to the breathing/heartbeat modulation. We recompute this every 50 frames (~5 seconds) to adapt to subject movement.
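A sketch of the selection over a 50-frame buffer (~5 s at 10 Hz, matching the recompute interval above). The range resolution per bin is an assumed value for illustration, not taken from the project:

```python
import numpy as np

RANGE_RES_M = 0.044   # metres per range bin (assumed for illustration)

def select_chest_bin(frames, d_min=0.5, d_max=1.5):
    """frames: (n_frames, n_bins) complex range profiles.
    Pick the bin with maximum amplitude variance inside [d_min, d_max]."""
    lo = int(d_min / RANGE_RES_M)
    hi = int(d_max / RANGE_RES_M)
    variances = np.var(np.abs(frames[:, lo:hi]), axis=0)
    return lo + int(np.argmax(variances))

# Demo: a 50-frame buffer where only bin 20 carries breathing modulation
frames = np.ones((50, 256), dtype=complex)
frames[:, 20] += 0.5 * np.sin(2 * np.pi * 0.25 * np.arange(50) / 10.0)
chest_bin = select_chest_bin(frames)   # → 20
```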
Challenge 4: MATLAB to Python Migration
Problem: The original research algorithm (vital_sign_ahmed.m) was 1000+ lines of MATLAB with complex wavelet processing using MATLAB-specific functions. Direct translation wasn't possible.
Solution: Careful porting with validation against known datasets:
- Used pywt for MODWT (Maximal Overlap DWT)
- Used scipy.signal for Butterworth filters and STFT
- Created test cases with synthetic signals (known HR/BR)
- Validated each stage's output against the MATLAB reference

The final Python implementation is actually 20% faster than MATLAB due to NumPy's optimized array operations.
Challenge 5: WebSocket Connection Stability
Problem: During development, frontend WebSocket connections would drop randomly, especially when the backend restarted or during Kafka rebalancing.

Solution: Implemented exponential backoff reconnection logic in the frontend:
const reconnect = () => {
  setTimeout(() => {
    setRetryCount(prev => prev + 1);
    connectWebSocket();
  }, Math.min(1000 * Math.pow(2, retryCount), 30000));
};
Combined with connection state indicators in the UI, users now have full visibility into connection status.
Accomplishments that we're proud of
Real Working System: We have a complete end-to-end system from radar hardware to web dashboard that runs 24/7
Research-Grade Algorithms: Implemented peer-reviewed signal processing techniques (MODWT, STFT ridge tracking, harmonic rejection) from IEEE papers
Production Architecture: Scalable Kafka streaming, async FastAPI, real-time WebSockets—ready for multiple simultaneous devices across hospitals or homes
AI Integration: Gemini 2.5 Flash provides intelligent health summaries that translate clinical data into human-understandable insights
Open Source: The entire codebase is available on GitHub for the community to build upon—we believe healthcare technology should be accessible
Pediatric Optimization: Specifically tuned for higher heart rates (up to 150 BPM) and breathing patterns common in children and infants
Accuracy: Achieved ±3 BPM heart rate accuracy and ±2 BPM breathing rate accuracy in controlled testing against reference pulse oximeters
Edge Performance: Runs on Raspberry Pi 4 with <50% CPU usage, proving this can scale to low-cost deployments
Comprehensive Documentation: Over 2,000 lines of documentation covering architecture, algorithms, deployment, and business model
What we learned
Technical Lessons
Radar Signal Processing is Hard: The raw phase signal is incredibly noisy—typical SNR is only 5-10 dB. We learned that careful filtering and windowing choices make or break the accuracy. A poorly chosen window length in STFT can destroy frequency resolution.
Harmonics are Everywhere: Human physiology creates complex harmonic relationships. The 2nd harmonic of a ~30 breaths/min breathing rate (30 × 2 = 60 BPM) lands exactly where resting heart rate lives! We spent days debugging what we thought was a sensor issue before realizing it was a fundamental signal processing challenge.
Kafka for Streaming: Confluent Cloud made it incredibly easy to set up production-grade streaming. The schema registry and exactly-once semantics are game-changers. What would have taken weeks to build from scratch took hours with Confluent.
Vertex AI Simplicity: Integrating Gemini was surprisingly straightforward. A few lines of code and we had AI-powered health insights. The hardest part was crafting effective prompts—we iterated through dozens before finding the right "pediatric cardiologist" persona.
Async Python is Powerful: FastAPI's async capabilities allowed us to handle WebSocket connections and Kafka consumption concurrently without blocking. This was crucial for real-time performance.
Domain Lessons
Healthcare is Complex: Normal ranges vary by age, activity level, and individual. A "normal" heart rate for an infant (120+ BPM) would be dangerous for an adult. We had to implement age-aware thresholds.
Contactless ≠ Effortless: Even without wearables, there are constraints—the subject needs to be relatively still and within range. We learned to communicate these limitations clearly rather than overpromising.
Privacy Matters in Healthcare: Using radar instead of cameras is a huge privacy advantage. People are uncomfortable with bedroom cameras, but radar only sees motion—no images, no facial recognition.
Team Lessons
Documentation Matters: Building comprehensive docs from day one helped us move faster as the project grew. Future contributors (and our future selves) will thank us.
Start with the Hard Part: We tackled the DSP algorithms first because that was the highest-risk component. If the signal processing didn't work, nothing else mattered.
Personal Motivation is Powerful: Building something with personal meaning (my grandmother, my nephew) kept me going through the toughest debugging sessions at 2 AM.
What's next for VitalFlow-Radar
Short Term (3-6 months)
Multi-Person Tracking: Extend algorithms to detect and separate multiple people in the radar's field of view using beamforming with the 4 RX antennas. This would enable monitoring shared bedrooms or hospital wards.
Sleep Stage Detection: Add sleep quality metrics by analyzing breathing pattern variability and movement. HRV (Heart Rate Variability) analysis can distinguish REM from deep sleep.
Mobile App: React Native companion app for family members to receive push notifications and view real-time data remotely.
Historical Analytics: Add trend analysis—"Your average resting heart rate has increased 8% over the past week."
Medium Term (6-12 months)
Clinical Validation: Partner with hospitals to conduct formal accuracy studies against reference ECG/respiratory monitors with 100+ subjects.
Aging-in-Place Platform: A complete home health monitoring ecosystem combining VitalFlow-Radar with activity recognition, medication adherence tracking, and emergency response.
Hospital Integration: HL7/FHIR-compliant EMR integration for seamless room monitoring. Reduce alarm fatigue by filtering false positives with AI before alerting nurses.
OEM Licensing: Provide the core technology to smart home manufacturers (Google Nest, Amazon Echo) and automotive companies (driver drowsiness detection).
Global Health: Partner with NGOs to bring contactless vital signs monitoring to resource-limited settings where wearable monitors are too expensive or impractical.
Research Platform: Offer anonymized datasets to academic researchers studying cardiovascular and respiratory patterns across populations.
Built With
- confluent
- fastapi
- google-cloud
- kafka
- llm
- python
- radar
- react
- typescript
