Inspiration
We have flat feet. Both of us have extended family history with diabetes. Diabetic neuropathy destroys pressure sensation in the feet, and 130,000 Americans lose a foot every year because they can't feel the sustained pressure that causes ulcers. The clinical tools that detect these problems (plantar pressure mats like Tekscan F-Scan) cost $5,000 to $15,000. That's a device sitting in a specialist's office that you visit once a year if you're lucky.
We wanted something to stand on every day. Something that could tell us our arches are collapsing before our knees start hurting. Something that could vibrate our parents' insoles when they've been standing in one spot too long and can't feel it. Something that costs $50 instead of $15,000.
What it does
Pedisense is a pair of instrumented sandals with an iOS app and AI-powered analysis.
Hardware: 10 force-sensing resistors (5 per foot) capture plantar pressure distribution across five biomechanically critical zones: 1st metatarsal, 5th metatarsal, medial midfoot (the arch), medial heel, and lateral heel. Two haptic vibration motors provide tactile alerts. An ESP32 microcontroller reads all ten sensors through a 16-channel analog multiplexer and streams the data over BLE at 50Hz (throttled to 15Hz for rendering in the app).
App: The iOS app renders a real-time pressure heatmap using inverse-distance-weighted interpolation across the foot shape. A diagnostic scan captures 10 seconds of data, computes four biomechanical indices (arch index, pronation index, heel centering, forefoot balance), flags conditions like flat foot and overpronation, and sends the data to Gemini for clinical interpretation. An exercise mode provides real-time biofeedback scoring as you perform rehabilitation exercises. A pressure alert system monitors for dangerous sustained loading and buzzes the haptic motor on the affected foot, critical for neuropathy patients who can't feel pressure but can still feel vibration. All data syncs to Supabase with Google and email authentication, enabling longitudinal trend tracking and shareable clinical reports.
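The inverse-distance-weighted interpolation can be sketched in a few lines of Python (the app does this in Swift inside a Canvas view). The sensor coordinates and the power parameter below are illustrative assumptions, not the app's actual values:

```python
# Sketch of inverse-distance-weighted (IDW) interpolation for the heatmap.
# (x, y) sensor positions are assumed, normalized to a unit foot outline.
SENSOR_POSITIONS = {
    "met1":        (0.35, 0.80),  # 1st metatarsal
    "met5":        (0.70, 0.75),  # 5th metatarsal
    "midfoot_med": (0.40, 0.50),  # medial midfoot (arch)
    "heel_med":    (0.40, 0.15),  # medial heel
    "heel_lat":    (0.60, 0.15),  # lateral heel
}

def idw(x, y, readings, power=2.0, eps=1e-6):
    """Interpolate a pressure value at (x, y) from the five zone readings."""
    num = den = 0.0
    for zone, (sx, sy) in SENSOR_POSITIONS.items():
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 < eps:                 # query point sits on a sensor
            return readings[zone]
        w = 1.0 / d2 ** (power / 2)  # weight falls off with distance^power
        num += w * readings[zone]
        den += w
    return num / den
```

Evaluating idw over a grid of points clipped to the Bezier foot path produces the continuous color field from five discrete values.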
AI Agent: A FastAPI backend deployed on Railway runs the analysis pipeline. Raw sensor readings go in, Gemini 2.5 Flash returns a structured clinical interpretation with findings, exercise recommendations, and a podiatrist-ready report.
How we built it
Hardware: We used an ESP32-WROOM-32 dev board, a CD74HC4067 16-channel analog multiplexer (because the ESP32's ADC2 is disabled when BLE is active, leaving only 6 ADC1 pins), FSR 402 sensors with 10kΩ voltage dividers, and coin vibration motors driven by 2N2222 transistors. Everything is mounted on sandals with athletic tape and soldered jumper wire connections. The breadboard and battery bank sit in a pouch on a belt.
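The divider math behind each reading, sketched in Python. Wiring is assumed here as FSR from 3.3V to the ADC pin with the 10kΩ resistor from the pin to ground, so pressing the sensor raises the voltage:

```python
# ADC-to-resistance math for an FSR in a 10 kOhm voltage divider read by
# the ESP32's 12-bit ADC. Wiring assumption: FSR on the high side, so
# Vout = VCC * R_DIV / (R_DIV + R_fsr) and pressure raises Vout.
VCC = 3.3        # ESP32 supply voltage
R_DIV = 10_000   # fixed divider resistor, ohms
ADC_MAX = 4095   # 12-bit ADC full scale

def adc_to_fsr_resistance(adc):
    """Invert the divider to recover the FSR's resistance in ohms."""
    v_out = VCC * adc / ADC_MAX
    if v_out <= 0:
        return float("inf")  # no load: the FSR is effectively open-circuit
    return R_DIV * (VCC - v_out) / v_out
```

Lower resistance means more force; the firmware can work with raw ADC counts, but the inverse relationship is why calibration against a baseline matters more than absolute values.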
Firmware: Arduino C++ on the ESP32. BLE advertising with the service UUID included in the advertisement packet (this took us a while to debug, since iOS filters by UUID during scanning). Non-blocking motor control using millis() instead of delay() so the BLE stack doesn't get blocked. A channel-mapping array to handle swapped heel sensors without rewiring.
iOS App: SwiftUI with CoreBluetooth for BLE, Swift Charts for trends, and background BLE support with state restoration for alerts that fire when the app isn't in the foreground. Supabase Swift SDK for auth and data persistence. Google Sign-In iOS SDK for OAuth. The heatmap renders in a Canvas view, interpolating 5 discrete sensor values into a continuous color field across a foot-shaped path defined with Bezier curves.
Backend: FastAPI on Railway. Gemini 2.5 Flash for clinical interpretation. The biomechanical analysis (arch index, pronation index, heel centering, forefoot balance) runs as pure Python before being sent to Gemini with clinical thresholds for context.
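A sketch of the kind of pure-Python index computation that runs before the Gemini call. The formulas below are illustrative assumptions (simple ratios of zone pressures), not the project's exact clinical definitions:

```python
# Illustrative biomechanical indices from mean normalized zone pressures.
# Zone keys and formulas are assumptions for the sketch.
def biomech_indices(p):
    """p: dict of mean normalized pressures per zone for one foot."""
    total = sum(p.values()) or 1.0
    medial = p["met1"] + p["midfoot_med"] + p["heel_med"]
    return {
        # high arch index -> the arch is bearing load (possible flat foot)
        "arch_index": p["midfoot_med"] / total,
        # > 0.5 means load biased to the medial column (overpronation)
        "pronation_index": medial / total,
        # 0.5 means heel load is centered between medial and lateral
        "heel_centering": p["heel_med"] / (p["heel_med"] + p["heel_lat"] or 1.0),
        # share of forefoot load carried by the 1st metatarsal
        "forefoot_balance": p["met1"] / (p["met1"] + p["met5"] or 1.0),
    }
```

Computing the indices deterministically in Python and handing Gemini only the numbers plus clinical thresholds keeps the LLM in an interpretation role rather than a measurement role.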
Data: Supabase PostgreSQL with four tables (calibrations, scans, reports, alerts), row-level security, and user-scoped queries tied to authenticated user IDs.
Challenges we ran into
BLE was the biggest time sink. The ESP32 was advertising, but the iOS app couldn't find it because the service UUID wasn't included in the advertisement packet. nRF Connect could see it (it scans for everything), but CoreBluetooth filters by service UUID. Adding pAdvertising->addServiceUUID() fixed it; figuring that out took hours.
FSR 402 saturation. The sensors are rated 0-10kg. Each foot zone sees roughly 7kg in static standing, which is right in the usable range. But during heel strike while walking, load spikes to 15-20kg and the readings compress against the sensor's ceiling. For the demo this is acceptable, because static diagnostics are the core feature.
Motor control blocking BLE. Our initial implementation used delay() in the BLE write callback to hold the motor on. This blocked the entire BLE stack, so only one motor could fire at a time. Switching to non-blocking, millis()-based timing in the main loop fixed both motors.
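The scheduling pattern translates to a few lines. This is a Python illustration of the idea (the real firmware does it in Arduino C++ with millis()); function and variable names are assumptions:

```python
import time

# Non-blocking motor pattern: a BLE write only records a per-motor deadline
# and returns immediately; the main loop turns each motor off when its
# deadline passes. No delay()/sleep() ever blocks the BLE stack, so both
# motors can run concurrently.
motor_off_at = {"left": 0.0, "right": 0.0}   # 0.0 means motor is off

def on_ble_write(foot, duration_s):
    """BLE write callback: start the motor, set its deadline, return."""
    motor_off_at[foot] = time.monotonic() + duration_s
    # gpio_high(foot)  # hardware call, omitted in this sketch

def loop_tick():
    """Called every main-loop iteration; checks deadlines, never sleeps."""
    now = time.monotonic()
    for foot, off_at in motor_off_at.items():
        if off_at and now >= off_at:
            motor_off_at[foot] = 0.0
            # gpio_low(foot)  # hardware call, omitted in this sketch
```

The key design point: callbacks record state, the loop acts on it. That keeps callback execution time near zero, which is exactly what a radio stack needs.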
SwiftUI rendering at 50Hz. The ESP32 streams at 50Hz but pushing every update to the UI caused rate-limit warnings and frame drops. Throttling to 15Hz in the BLE callback eliminated the issue while keeping the heatmap smooth.
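The throttle is a one-class fix. A Python sketch of the same logic (the app does this in Swift inside the CoreBluetooth notification callback; names here are illustrative):

```python
import time

# Rate limiter for UI updates: a frame passes only if at least 1/hz seconds
# have elapsed since the last accepted frame; otherwise it is dropped and
# the next BLE notification gets its chance.
class Throttle:
    def __init__(self, hz):
        self.min_interval = 1.0 / hz
        self.last = 0.0

    def ready(self):
        now = time.monotonic()
        if now - self.last >= self.min_interval:
            self.last = now
            return True
        return False

ui_throttle = Throttle(hz=15)
# in the BLE callback:
# if ui_throttle.ready(): publish_to_heatmap(frame)
```

Dropping frames at the ingestion boundary is cheaper than letting the UI layer queue and discard them, and at 15Hz the heatmap still looks continuous to the eye.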
Sensor calibration. Different FSRs return different ADC values for the same pressure due to manufacturing variation. A 5-second calibration step where the user stands evenly captures per-sensor baselines, and all subsequent readings are normalized as percentages of that baseline.
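The calibration step reduces to two small functions, sketched here in Python with assumed field names:

```python
# Calibration: average each zone over the 5-second even stance, then express
# every later reading as a percentage of that per-sensor baseline so
# manufacturing variation between FSRs cancels out.
def compute_baselines(samples):
    """samples: list of dicts, one ADC reading per zone per frame."""
    n = len(samples)
    zones = samples[0].keys()
    return {z: sum(s[z] for s in samples) / n for z in zones}

def normalize(reading, baselines):
    """Return each zone as a percentage of its calibration baseline."""
    return {z: 100.0 * reading[z] / baselines[z] if baselines[z] else 0.0
            for z in reading}
```

After this, 100% means "same load as during even standing" on every sensor, which is what makes cross-zone comparisons like the arch and pronation indices meaningful.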
Left and right feet swapped. Multiple times. In the firmware, in the heatmap view, in the sensor layout. Debugging which layer had the swap was tedious. Channel mapping arrays in firmware and careful testing with one sensor at a time resolved it.
Accomplishments that we're proud of
The moment we stood on the sandals and saw flat feet light up on the heatmap in real-time. The medial midfoot zone glowing red while a healthy foot would show nothing there. That's not a simulation. That's an actual collapsed arch, measured by sensors we wired ourselves, rendered on an app we built, interpreted by an AI that generates the exact exercises to fix it.
The haptic alert system working end-to-end: sustained pressure detected by FSR, transmitted over BLE, evaluated by the alert engine, motor command sent back over BLE, insole vibrates under the foot. The full bidirectional loop in under 100 milliseconds.
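The alert engine's core check can be sketched as threshold-plus-dwell detection. The threshold, dwell time, and state layout below are illustrative assumptions, not the app's actual tuning:

```python
# Sustained-load detection: a zone triggers an alert only after staying
# above a pressure threshold for a continuous dwell time; dropping below
# the threshold resets its timer.
THRESHOLD_PCT = 150.0   # % of calibration baseline (assumed value)
DWELL_S = 60.0          # tolerated duration of sustained load (assumed)

over_since = {}         # zone -> timestamp when it first exceeded threshold

def check_alert(zone, pressure_pct, now):
    """Return True when this frame should trigger a haptic alert."""
    if pressure_pct < THRESHOLD_PCT:
        over_since.pop(zone, None)   # load relieved; reset the timer
        return False
    start = over_since.setdefault(zone, now)
    return now - start >= DWELL_S
```

When check_alert returns True, the app writes the motor command for the affected foot back over BLE, closing the loop described above.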
Building the entire system from bare ESP32 to deployed AI agent in 36 hours. Hardware, firmware, iOS app with seven functional tabs, FastAPI backend, Supabase database, Google auth, live sensor data, and real clinical analysis.
What we learned
BLE is not classic Bluetooth. It's a completely different protocol with different debugging tools and failure modes. Serial Monitor doesn't help when the problem is in the advertisement packet.
Hackathon hardware projects need testing checkpoints. Every phase had a checkpoint: does this one sensor read? Do all 10 read independently? Does the motor vibrate? Does BLE stream? Does the app receive? Never build the next layer on an untested foundation.
The FSR placement matters more than the FSR quality. A $3 sensor in exactly the right spot under the medial midfoot tells you more about flat feet than a $50 sensor in the wrong spot.
Splitting hardware and software across two people works, but the integration points (BLE UUIDs, data format, byte ordering) need to be agreed on early and tested constantly. Most of our bugs lived at the boundary between what each of us built.
What's next for Pedisense
Production insoles with embedded flex PCBs instead of breadboard wiring. FlexiForce A401 sensors rated to 445N for full gait analysis including running. Gait recording with animated replay. Multi-patient clinic mode for podiatrists. Integration with Apple Health. A proper clinical validation study comparing Pedisense readings against gold-standard Tekscan F-Scan data.
And giving a pair to our parents.