Inspiration

The driving force behind AEYRON Sentinel is deeply personal. One of our teammates watched their grandfather battle Parkinson's disease. Witnessing the subtle, early signs go unnoticed until the disease had progressed made us realize that the current diagnostic pipeline is fundamentally flawed. By the time clinical symptoms are obvious, up to 80% of dopamine-producing neurons are already lost. We knew there had to be a way to catch the biomarkers hiding in plain sight: in a voice, a cough, a heartbeat, and a facial expression, without forcing patients to wear intrusive hardware.

How we built it

We engineered a real-time, contactless telemetry platform that fuses four independent biomarker streams into a unified PD risk score, updated every 2 seconds.

The core architecture relies on a FastAPI backend processing data from multiple endpoints:

  • Voice: A dual-path ML pipeline. We use an RBF-SVM trained on clinical features extracted via Praat and librosa (achieving 88.3% accuracy), alongside a personalized Vultr-hosted autoencoder that flags longitudinal drift by computing an anomaly score from the mean squared error \(MSE\) between observed and reconstructed acoustic features.
  • Cough: A rule-based hysteresis detection system extracting 8 acoustic features, calibrated against clinical literature to output a weighted PD likelihood score.
  • Face: In-browser hypomimia (masked face) detection using face-api.js, tracking the rolling variance of emotion confidence to flag abnormally flat expressions.
  • Heart: Utilizing remote photoplethysmography (rPPG), we continuously estimate heart rate and heart rate variability from the camera feed to score cardiac dysautonomia and sympathetic stress.
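The autoencoder path above boils down to a simple idea: reconstruct a patient's acoustic feature vector from their personal baseline model, and treat large reconstruction error as drift. A minimal sketch of that scoring step (the function name and the toy feature vectors are illustrative, not our production code):

```python
import numpy as np

def anomaly_score(features: np.ndarray, reconstructed: np.ndarray) -> float:
    """Mean squared error between the observed acoustic feature vector
    and the autoencoder's reconstruction. Higher values mean the voice
    has drifted further from the patient's learned baseline."""
    return float(np.mean((features - reconstructed) ** 2))

# Toy example: a reconstruction close to the input yields a low score.
observed = np.array([0.20, 0.50, 0.10, 0.90])
recon = np.array([0.21, 0.48, 0.12, 0.88])
score = anomaly_score(observed, recon)  # small MSE → voice near baseline
```

In practice the score is compared against a per-patient threshold calibrated during enrollment, so the same absolute error can be normal for one voice and anomalous for another.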

Our base risk normalization is modeled as: $$ R_{vitals} = 0.35 \cdot R_{hr} + 0.35 \cdot R_{hrv} + 0.30 \cdot S $$ where \(R_{hr}\) is heart rate risk, \(R_{hrv}\) is heart rate variability risk, and \(S\) is the sympathetic overdrive stress score.
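The blend above is a straight weighted sum over the three normalized component scores. A one-function sketch (assuming each input is already normalized to \([0, 1]\); the function name is illustrative):

```python
def vitals_risk(r_hr: float, r_hrv: float, stress: float) -> float:
    """Combine normalized component risks into the base vitals risk:
    R_vitals = 0.35 * R_hr + 0.35 * R_hrv + 0.30 * S
    where r_hr is heart rate risk, r_hrv is heart rate variability risk,
    and stress is the sympathetic overdrive stress score."""
    return 0.35 * r_hr + 0.35 * r_hrv + 0.30 * stress

# Example: moderate HR risk, low HRV risk, mid stress → mid overall risk.
risk = vitals_risk(0.6, 0.4, 0.5)  # 0.21 + 0.14 + 0.15 = 0.50
```

Because the weights sum to 1, the output stays in \([0, 1]\) whenever the inputs do, which keeps the vitals score directly comparable with the other biomarker streams when fusing into the unified PD risk score.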

Challenges we ran into

We started building our vitals integration with the Presage SDK, but we quickly realized other teams were circling similar ideas. Instead of playing it safe, we decided to say "screw it" and go all out.

We also implemented Reactiv and executed a massive pivot into custom hardware. As four CS students, we had absolutely no business building hardware on the fly. Bridging our software stack with raw radar sensors, custom microphones, and 3D-printed components while navigating the chaos of a multi-pivot hackathon deeply tested our sanity.

Accomplishments that we're proud of

We stepped completely out of our comfort zone and shipped things we had never touched before. We successfully built a full SwiftUI App and App Clip, integrated complex ML models into a live clinical dashboard, and engineered custom 3D-printed hardware with embedded radar and mic sensors. Beyond the technical stack, we are incredibly proud of the sheer patience and grit it took to stay locked in as a team and pull off a hardware-software fusion under extreme time constraints.

What we learned

We learned that building physical hardware is a completely different beast than writing Python scripts. We mastered the intricacies of Swift, the necessity of robust data pipelines when dealing with real-time sensor ingestion, and how to systematically debug integrated hardware-software systems. Most importantly, we learned the value of talking to real people and gathering live validation for our ideas.

What's next for Aeyron

Hack Canada was just the beginning. The validation we've received right here on the floor has been unreal: over 600 people resonated with the problem we are solving.

We've already started having incredible conversations with professors and mentors from UW, Velocity, GWU, and Refraction. Because of this overwhelming traction, we are going absolutely all-in over the next 30 days. We are transitioning AEYRON Sentinel from a hackathon project into a real, tangible product, and we are going to scale it to the max.
