Inspiration

When my grandmother had her hip surgery, she thought she would be on a walker for just a few weeks. Those weeks stretched on: her physical therapy and exercise were scaled back, she lost motivation, and she never improved. My grandmother is still on that walker.

Almost one-third of Americans will need a walker for assistance at some point in their lives. According to the CDC, over 47,000 elderly individuals are hospitalized annually due to walker-related falls, and research indicates that people are seven times more likely to be injured in a fall while using a walker than while using a cane. Therapists, meanwhile, have little visibility into patient behavior beyond short office visits. Emotionally, many patients lose the motivation to exercise, viewing their walker as a symbol of limitation rather than a tool for recovery.

We wanted to inject modern technology into this "old" device to help the elderly and those in rehabilitation regain their mobility safely, effectively, and with encouragement along the way.

What it does

Our project is a real-time health-tech platform that turns a standard walker into a connected rehab companion. It combines gait analysis and fall detection by streaming computer-vision signals and motion metrics live to detect instability and suspected falls. Pressure sensing is integrated through FSR sensors in the walker, capturing left and right load so we can measure support dependence and asymmetry. An AI physical therapist, powered by a voice agent, provides live coaching, corrective prompts, and encouragement during sessions.

The system also includes an interactive web dashboard that shows live telemetry, trend history, and progress views for patients and caregivers. Clinicians can remotely review patient metrics and adjust exercise plans without waiting for in-person visits, enabling continuous oversight and faster intervention when needed.

How we built it

We built a multi-modal data pipeline that combines walker sensor data with computer vision-derived gait data. Vision events such as cadence, step variability, and fall suspicion are ingested continuously for live risk scoring. FSR readings from the walker grips and frame are normalized to estimate effort, load balance, and walker reliance.
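As a rough illustration of the FSR normalization step above, the sketch below scales raw readings against per-sensor calibration values and derives a left/right load-asymmetry index. The function names and calibration scheme are our own simplification for this write-up, not the exact code from the project.

```python
def normalize_fsr(raw: float, baseline: float, max_load: float) -> float:
    """Scale a raw FSR reading to [0, 1] relative to calibration values."""
    span = max(max_load - baseline, 1e-6)  # avoid division by zero
    return min(max((raw - baseline) / span, 0.0), 1.0)


def load_asymmetry(left: float, right: float) -> float:
    """Signed asymmetry in [-1, 1]; positive means more weight on the left grip."""
    total = left + right
    if total < 1e-6:  # no meaningful load on either grip
        return 0.0
    return (left - right) / total
```

Normalizing per sensor before comparing sides matters because individual FSRs drift and have different resting values; the asymmetry index then gives a single number a clinician can track over time.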

On the backend, we used FastAPI, WebSockets, and SQLAlchemy to enable low-latency ingest and live broadcasting. The system merges sensor and vision streams into a unified resident state, triggers proactive alerts, and stores history for longitudinal analysis. We also integrated a voice pipeline that connects speech-to-text, OpenAI-powered reasoning, and a HeyGen avatar output to drive the AI physical therapist in real time.
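To make the "unified resident state" idea concrete, here is a minimal, framework-free sketch of how events from the two streams could update one shared record and trigger proactive alerts. The field names, event schema, and 0.5 asymmetry threshold are illustrative assumptions, not the project's actual schema.

```python
from dataclasses import dataclass, field


@dataclass
class ResidentState:
    load_asymmetry: float = 0.0      # latest value from the FSR stream
    fall_suspected: bool = False     # latest flag from the vision stream
    alerts: list = field(default_factory=list)

    def ingest(self, event: dict) -> None:
        """Merge one event from either stream into the shared state."""
        if event["source"] == "fsr":
            self.load_asymmetry = event["asymmetry"]
        elif event["source"] == "vision":
            self.fall_suspected = event.get("fall_suspected", False)
        # Proactive alert when vision suspects a fall or load is very one-sided
        if self.fall_suspected or abs(self.load_asymmetry) > 0.5:
            self.alerts.append(event)
```

In the real system this merged state would be broadcast to dashboard clients over WebSockets and persisted via SQLAlchemy; the sketch only shows the merge-and-alert logic itself.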

On the frontend, we built a React, TypeScript, and Vite application with distinct user experiences. The patient view features a high-contrast, low-friction interface focused on live feedback, goals, and motivation. The clinician view provides denser analytics for reviewing trends, gait stability changes, and pressure-distribution progress over time.

Challenges we ran into

The hardest part was synchronizing heterogeneous real-time streams. Pressure data and vision events arrive at different frequencies and had to be merged into one reliable timeline for both live coaching and clinician analytics. Keeping latency low while preserving enough historical context for "now versus recent baseline" decisions required careful WebSocket state design and query optimization. Developing our computer-vision models to reliably detect falls and steps was also difficult.
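One way to frame the "now versus recent baseline" decision mentioned above is a bounded window of recent samples: the current value is flagged when it deviates from the window mean. The window size, warm-up length, and deviation threshold below are illustrative assumptions, not the values we tuned in the project.

```python
from collections import deque


class BaselineDetector:
    """Flag values that deviate from a rolling baseline of recent samples."""

    def __init__(self, window: int = 30, threshold: float = 0.3):
        self.history = deque(maxlen=window)  # old samples fall off automatically
        self.threshold = threshold

    def update(self, value: float) -> bool:
        """Return True when the new value deviates from the recent baseline."""
        if len(self.history) >= 5:  # require a minimal baseline before flagging
            baseline = sum(self.history) / len(self.history)
            deviates = abs(value - baseline) > self.threshold
        else:
            deviates = False
        self.history.append(value)
        return deviates
```

The bounded deque keeps memory constant per metric, which matters when the same check runs on every incoming sample across several live streams.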

What we learned

We learned that health-tech reliability is primarily about systems integration. Hardware signals, real-time transport, inference and agent logic, storage, and user experience all need to function cohesively as a single product. We gained practical experience designing resilient real-time ingest and broadcast pipelines for live care workflows. We also learned how to translate clinical rehabilitation concepts into interfaces that both patients and clinicians can understand and act on.

What's next for SkyWalker

Next, we want to improve both engagement and prevention. For individuals, we plan to expand lightweight motivation features such as gamified goals, clearer progress summaries, and adaptive coaching tone. At the network level, we aim to add predictive models that flag subtle gait decline earlier across longitudinal data. We also plan to integrate haptic cues directly into the walker so users receive immediate posture and balance guidance without relying on a screen.
