70% of people abandon their exercise routines due to mental burnout, and 30% suffer preventable injuries from poor technique. As an amateur athlete, I experienced both firsthand — and I started asking: can AI act as an intelligent guardian that combines physical coaching with neuroscience?
Brain Endurance Training (BET) — a protocol for priming brain activity toward the 48-52Hz gamma range before physical exertion — was the missing link. No fitness app had combined BET with real-time AI coaching. FitGuard AI is the answer.
🚀 What it does
FitGuard AI is the first multimodal AI fitness agent that integrates Brain Endurance Training, real-time biometric data from wearables, AI vision posture analysis, and live voice coaching — all in one platform.
⚡ Pre-Workout — Brain Activation
A biometric dashboard pulls 8 real-time metrics from your wearables via the Open Wearables API: HRV, resting heart rate, deep sleep, readiness score, steps, SpO2, stress score, and total sleep. Gemini analyzes both the objective biometric data and your self-reported state — detecting discrepancies between how you feel and what your body actually shows. The BET Protocol then elevates brain frequency from a 20-30Hz baseline toward the 48-52Hz optimal range through 4 cognitive tasks: working memory inversion, mental arithmetic, visual pattern recognition, and neural breathing.
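The discrepancy check can be sketched as a simple comparison between objective readiness and perceived energy. The thresholds and metric names below are illustrative assumptions, not the app's actual schema:

```python
def detect_discrepancy(biometrics: dict, self_reported_energy: int) -> str:
    """Compare objective readiness (0-100) against self-reported energy (1-10).

    Flags the two failure modes the dashboard cares about: feeling good
    while the body is under-recovered, and the reverse.
    """
    readiness = biometrics["readiness_score"]  # 0-100 from the wearable
    hrv_ok = biometrics["hrv_ms"] >= biometrics.get("hrv_baseline_ms", 50)

    feels_good = self_reported_energy >= 7
    body_ready = readiness >= 70 and hrv_ok

    if feels_good and not body_ready:
        return "overconfident"   # subjective energy masks poor recovery
    if not feels_good and body_ready:
        return "underconfident"  # body is ready despite low perceived energy
    return "aligned"

result = detect_discrepancy(
    {"readiness_score": 55, "hrv_ms": 42, "hrv_baseline_ms": 50},
    self_reported_energy=8,
)
# result == "overconfident"
```

In the real app this classification is folded into the Gemini prompt rather than hard-coded, but the rule captures what "discrepancy" means here.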
🏋️ During Workout — Real-Time Coaching
Gemini Vision analyzes your posture from the camera, detecting biomechanical errors and injury risks. Between sets, BET micro-tasks maintain 50Hz brain frequency. The Gemini Live Voice Coach provides real-time spoken feedback via bidirectional WebSocket audio streaming.
🌙 Post-Workout — Holistic Recovery
Recovery plans are generated from wearable biometrics + session data — with HRV-adjusted sleep targets, nutrition, stretching protocols, and next-day BET calibration based on predicted recovery score.
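An "HRV-adjusted sleep target" can be illustrated with a small heuristic. The 8h base, 10h cap, and weights below are assumptions for the sketch; in the app the actual recovery plan comes from Gemini, not a fixed formula:

```python
def sleep_target_hours(hrv_ms: float, hrv_baseline_ms: float,
                       session_load: float) -> float:
    """Scale a base 8h sleep target by HRV deficit and training load.

    HRV below baseline adds recovery sleep, capped at 10h total;
    session_load is a 0-1 normalized intensity for the workout just done.
    """
    base = 8.0
    deficit = max(0.0, (hrv_baseline_ms - hrv_ms) / hrv_baseline_ms)  # 0-1
    target = base + 2.0 * deficit + 0.5 * session_load
    return round(min(target, 10.0), 1)

# Suppressed HRV (40ms vs a 50ms baseline) after a hard session:
sleep_target_hours(40, 50, session_load=0.8)  # -> 8.8
```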
🛠️ How I built it
- Frontend: Streamlit with custom dark athletic UI
- AI Core: Google Gemini 2.5 Flash for multimodal analysis
- Voice: Gemini Live API with native bidirectional audio streaming (WebSockets)
- Vision: Gemini multimodal for real-time posture analysis from camera
- Wearables: Open Wearables API — unifying Garmin, Polar, Suunto, Whoop, Apple Health into one normalized API
- Deploy: Google Cloud Run — serverless, auto-scaling
- BET Engine: Custom cognitive scoring system estimating brain Hz frequency in real time
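The BET Engine's real-time Hz estimate can be sketched as a mapping from task performance onto the 20-52Hz scale described above. The 60/40 accuracy-vs-speed weighting and the reaction-time bounds are assumptions for illustration:

```python
def estimate_brain_hz(accuracy: float, reaction_ms: float) -> float:
    """Map cognitive-task performance onto the app's 20-52Hz scale.

    accuracy: fraction of correct answers (0-1) on the last task block.
    reaction_ms: mean response time; 400ms is fast, 1200ms slow (clamped).
    """
    speed = 1.0 - min(max((reaction_ms - 400) / 800, 0.0), 1.0)
    score = 0.6 * accuracy + 0.4 * speed  # blended 0-1 performance
    return round(20.0 + score * 32.0, 1)  # 20Hz baseline -> 52Hz peak

estimate_brain_hz(accuracy=0.9, reaction_ms=600)  # -> 46.9
```

Note this is a proxy score expressed in Hz for UX purposes, not an EEG measurement — the app has no electrode data.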
⚠️ Challenges
The hardest challenge was integrating Gemini Live API with bidirectional real-time audio streaming. WebSocket latency and microphone/playback synchronization required multiple iterations to achieve natural conversation flow.
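One piece of that synchronization work can be shown concretely: chunking microphone PCM into fixed-duration frames before sending over the WebSocket, so the playback side can schedule audio evenly instead of stalling on variable-sized messages. Frame size and sample rate here are assumptions, not Gemini Live API requirements:

```python
SAMPLE_RATE = 16_000  # Hz, mono 16-bit PCM
FRAME_MS = 20         # small frames keep round-trip latency low

def frame_pcm(pcm: bytes) -> list[bytes]:
    """Split raw PCM into fixed-duration frames, zero-padding the tail."""
    frame_bytes = SAMPLE_RATE * 2 * FRAME_MS // 1000  # 2 bytes per sample
    frames = [pcm[i:i + frame_bytes] for i in range(0, len(pcm), frame_bytes)]
    if frames and len(frames[-1]) < frame_bytes:
        frames[-1] = frames[-1] + b"\x00" * (frame_bytes - len(frames[-1]))
    return frames

frames = frame_pcm(b"\x01\x02" * 1000)  # 2000 bytes of fake audio
# 16000 * 2 * 20 / 1000 = 640 bytes/frame -> 4 frames, last one zero-padded
```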
Designing the BET protocol meant translating neuroscience concepts (gamma waves, cognitive load, brain frequency) into measurable interactive tasks inside a web app — without losing scientific validity.
Integrating Open Wearables required building a unified data layer normalizing metrics across 5+ device ecosystems into a structure Gemini could reason over alongside subjective user input.
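The normalization idea reduces to renaming vendor-specific fields onto one canonical schema before anything reaches the prompt. The vendor field names below are made up for illustration — the actual payloads differ per ecosystem:

```python
# Hypothetical vendor -> canonical key mappings (illustrative, not real payloads).
VENDOR_MAPS = {
    "garmin": {"hrvValue": "hrv_ms", "restingHr": "resting_hr", "spo2Avg": "spo2"},
    "whoop":  {"hrv_rmssd": "hrv_ms", "rhr": "resting_hr", "blood_oxygen": "spo2"},
}

def normalize(vendor: str, payload: dict) -> dict:
    """Rename vendor-specific keys to the canonical schema, dropping extras.

    A flat canonical dict is what gets serialized into the Gemini prompt,
    so every device must land on identical keys before prompting.
    """
    mapping = VENDOR_MAPS[vendor]
    return {canon: payload[raw] for raw, canon in mapping.items() if raw in payload}

normalize("whoop", {"hrv_rmssd": 62, "rhr": 48, "blood_oxygen": 97, "strain": 14.2})
# -> {"hrv_ms": 62, "resting_hr": 48, "spo2": 97}
```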
🏆 Accomplishments
- First fitness agent combining BET protocol with real-time AI coaching in one platform
- Bidirectional voice with Gemini Live API + simultaneous visual posture analysis
- Real-time brain frequency estimation (Hz) from cognitive task scoring
- 8-metric biometric dashboard integrating Open Wearables API across 5+ wearable ecosystems
- Full production deployment on Google Cloud Run
📚 What I learned
Gemini Live API is extraordinarily powerful for natural conversational experiences. The combination of real-time audio + video analysis opens unprecedented possibilities for personalized sports coaching. The Open Wearables ecosystem revealed how much value is unlocked when fragmented health data is normalized and handed to an AI agent that can reason across all dimensions simultaneously.
🔮 What's next
- Live Open Wearables connection with real wearable hardware
- Voice tone analysis to detect stress and fatigue from audio
- Cross-session memory for long-term athlete profiling
- HRV-adaptive BET difficulty — harder protocols on high-readiness days
- Native mobile app
Built With
- artifact-registry
- gemini-live-api
- google-cloud-run
- google-gemini-2.5-flash
- opencv
- python
- streamlit
- websockets