Inspiration
We ride bikes every day at UC Davis — it's basically the only way to get around campus. And every day we hit the same potholes, the same cracked stretches near the MU, the same gravel patches that send your wheel sideways. The roads don't get fixed because there's no feedback loop. You hit a pothole, you curse, you move on. Nothing gets reported.
We wanted to build something that closes that loop automatically — a device that notices when your ride gets rough, figures out what happened, and puts it on a shared map so the next cyclist doesn't get surprised. Essentially: what if your bike could report hazards the same way Waze reports traffic?
The harder version of that question is: how do you tell the difference between a pothole and someone bumping your handlebars? That's what pushed us toward using AI instead of just thresholds.
What it does
Baze mounts on any bicycle and watches two streams of data simultaneously: an accelerometer tracking road vibration at 100Hz, and a forward-facing camera capturing what's in front of you.
When the vibration spikes above normal riding levels, Baze captures a 5-second window of IMU data plus a camera snapshot and sends both to Gemini 2.5 Flash. Gemini reads the full vibration curve — not just a single number — alongside the image, and decides: is this a real road hazard (RED or ORANGE), or just noise (GREEN)?
GREEN events are silently discarded. Only genuine hazards make it to the map.
The live dashboard shows hazard zones across campus as translucent circles — red for critical, orange for moderate — updating every 3 seconds. Click any circle and you get Gemini's plain-English description of what's there. And if you're planning a route, the safe route planner queries the hazard database and generates a detour around anything dangerous, like a campus-scale Waze built for bikes.
How we built it
The hardware side is an Arduino Uno R4 WiFi with a Grove MMA7660FC 3-axis accelerometer wired to the I2C port. The Arduino streams {x, y, z, rms, peak} summary readings over USB at 2Hz.
A Python bridge script (bridge.py) reads those serial packets in real time. It uses a low threshold — rms > 0.12g or peak > 0.20g — to detect that something happened. Crucially, it doesn't try to classify anything at this stage. It just starts recording. After 5 seconds, it bundles the time-series data with a camera frame (captured via camera.py using iPhone Continuity Camera) and POSTs the whole package to the backend.
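The trigger-then-record behavior can be sketched as a small state machine. This is a minimal sketch, not the actual bridge.py: the serial parsing, camera.py capture, and HTTP POST are assumed to happen around it, and the class and field names are illustrative.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Thresholds from the write-up: deliberately low, because Gemini does the
# real classification downstream. Values are in g.
RMS_TRIGGER = 0.12
PEAK_TRIGGER = 0.20
WINDOW_SECONDS = 5.0

@dataclass
class EventRecorder:
    """Buffers IMU samples for a fixed window once a trigger fires."""
    recording: bool = False
    started_at: float = 0.0
    samples: List[dict] = field(default_factory=list)

    def feed(self, t: float, sample: dict) -> Optional[List[dict]]:
        """Feed one {x, y, z, rms, peak} reading at time t (seconds).

        Returns the finished 5-second window (ready to be bundled with a
        camera frame and POSTed) or None while idle/recording.
        """
        if not self.recording:
            # Don't classify here -- just notice that something happened.
            if sample["rms"] > RMS_TRIGGER or sample["peak"] > PEAK_TRIGGER:
                self.recording = True
                self.started_at = t
                self.samples = [sample]
            return None
        self.samples.append(sample)
        if t - self.started_at >= WINDOW_SECONDS:
            window, self.samples = self.samples, []
            self.recording = False
            return window
        return None
```

The point of the design is that the bridge stays dumb: it only decides *when* to record, never *what* happened.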
The FastAPI backend receives the event, immediately stores a PENDING spot in MongoDB Atlas, and fires off a background task to call Gemini. The Gemini prompt sends the full formatted time-series — with a visual bar representation of vibration intensity over time — alongside the image, and asks Gemini to classify severity and generate a cyclist-facing warning.
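The receive-then-classify lifecycle might look like the following framework-free sketch. An in-memory dict stands in for the MongoDB collection and a stub stands in for the Gemini call; every name here is illustrative, not the actual backend API, and in the real app the classification runs as a FastAPI background task rather than inline.

```python
import uuid

# In-memory stand-in for the MongoDB spots collection (illustrative only).
SPOTS: dict = {}

def classify_stub(event: dict) -> dict:
    """Stand-in for the Gemini call: the real backend sends the formatted
    time series plus the camera image and parses severity + description."""
    peak = max(s["peak"] for s in event["samples"])
    severity = "RED" if peak > 0.8 else "ORANGE" if peak > 0.4 else "GREEN"
    return {"severity": severity, "description": f"peak {peak:.2f} g event"}

def receive_event(event: dict) -> str:
    """Store a PENDING spot immediately, then classify and update or discard."""
    spot_id = uuid.uuid4().hex
    SPOTS[spot_id] = {"status": "PENDING", "lat": event["lat"], "lon": event["lon"]}
    result = classify_stub(event)
    if result["severity"] == "GREEN":
        del SPOTS[spot_id]  # GREEN events never reach the map
    else:
        SPOTS[spot_id].update(status="CLASSIFIED", **result)
    return spot_id
```

Storing the PENDING spot before classification means the dashboard can show "something happened here" with near-zero latency, then upgrade or delete it once the verdict arrives.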
The routing engine uses OSRM for bike-aware path planning, queries MongoDB's 2dsphere geospatial index to find hazards near any proposed route, clusters nearby hazards to avoid redundant waypoints, and tries both left and right offset detours to pick the shorter safe path.
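The left/right offset idea can be illustrated in isolation. This sketch uses straight-line legs instead of OSRM routes, assumes fixed east/west offsets, and reads the "60%" sanity check as a 1.6x cap on detour length versus the direct leg; all names and constants are illustrative assumptions, not the actual routing code.

```python
import math

EARTH_R = 6_371_000.0  # metres

def haversine(a, b):
    """Great-circle distance in metres between two (lat, lon) points."""
    (la1, lo1), (la2, lo2) = (
        (math.radians(p[0]), math.radians(p[1])) for p in (a, b)
    )
    h = (math.sin((la2 - la1) / 2) ** 2
         + math.cos(la1) * math.cos(la2) * math.sin((lo2 - lo1) / 2) ** 2)
    return 2 * EARTH_R * math.asin(math.sqrt(h))

def offset_point(p, bearing_deg, metres):
    """Move p by `metres` along `bearing_deg` (small-distance approximation)."""
    lat, lon = p
    dlat = metres * math.cos(math.radians(bearing_deg)) / EARTH_R
    dlon = metres * math.sin(math.radians(bearing_deg)) / (EARTH_R * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

def pick_detour(start, end, hazard, offset_m=40.0, max_stretch=1.6):
    """Try a waypoint offset to each side of the hazard, keep the shorter
    detour, and reject anything over max_stretch x the direct distance.
    The real system routes through the chosen waypoint with OSRM; here
    each 'route' is just two straight legs through the waypoint."""
    direct = haversine(start, end)
    best = None
    for side in (90.0, -90.0):  # east/west offsets, assumed for illustration
        wp = offset_point(hazard, side, offset_m)
        length = haversine(start, wp) + haversine(wp, end)
        if best is None or length < best[0]:
            best = (length, wp)
    if best is not None and best[0] <= max_stretch * direct:
        return best[1]
    return None  # no acceptable detour; fall back to the direct route
```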
The dashboard is vanilla JS plus Leaflet.js, with map tiles from CartoDB.
Challenges we ran into
The camera problem. When you mount a camera on a bike, it points wherever it points — sometimes at the road, sometimes at the sky, sometimes at your own hands. We spent a lot of time figuring out how to tell Gemini to ignore images that clearly aren't showing road conditions. The solution was explicit prompt engineering: if the image shows anything that isn't an outdoor road scene (indoors, hands, equipment), ignore it entirely and classify on vibration only.
False positives are the enemy. Early versions triggered constantly — someone picking up the bike, a cable vibrating, a speed bump immediately followed by smooth road. The fix was to send Gemini the full 5-second time series rather than a single moment. A single spike followed by calm reads completely differently to Gemini than 5 seconds of sustained roughness. We also tuned the GREEN classification to be aggressive: when in doubt, discard.
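The "full time series" idea pairs with the bar rendering mentioned earlier: sustained roughness draws a wall of bars, while a lone spike draws one tall bar in a field of short ones. A minimal sketch of such a formatter, assuming 2Hz samples and an illustrative 0.6g full scale:

```python
def vibration_bars(rms_series, width=20, full_scale=0.6):
    """Render an RMS time series as text bars for the Gemini prompt.

    A sketch of the 'visual bar representation' idea; the 0.6 g full
    scale, 2 Hz timestamps, and bar width are illustrative assumptions.
    """
    lines = []
    for i, rms in enumerate(rms_series):
        n = min(width, round(width * rms / full_scale))
        lines.append(f"t+{i * 0.5:>4.1f}s |{'#' * n:<{width}}| {rms:.2f} g")
    return "\n".join(lines)
```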
No real GPS. We didn't have a GPS module, so we simulated movement along UC Davis campus waypoints in the backend. It works visually and demonstrates the system correctly, but it's the biggest gap between the demo and a real deployment.
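Waypoint-based simulation can be as simple as interpolating along a polyline. A sketch that treats every segment as equal length (a simplification; a real backend could weight segments by distance), with `u` as the fraction of the loop completed:

```python
def simulated_position(waypoints, u):
    """Position along a (lat, lon) waypoint polyline at fraction u in [0, 1].

    Each segment is treated as equal length -- good enough for a demo
    where the rider 'moves' a little further every tick.
    """
    segs = len(waypoints) - 1
    u = min(max(u, 0.0), 1.0)
    i = min(int(u * segs), segs - 1)   # which segment we're on
    f = u * segs - i                    # progress within that segment
    (la1, lo1), (la2, lo2) = waypoints[i], waypoints[i + 1]
    return la1 + f * (la2 - la1), lo1 + f * (lo2 - lo1)
```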
What we learned
Getting Gemini's multimodal fusion to actually work well in real conditions took real iteration. The GREEN filter does what it's supposed to: it discards the junk and surfaces only genuine hazards. Watching a legitimate pothole hit show up on the live map as a RED circle, complete with Gemini's generated description, was genuinely satisfying.
The routing logic surprised us too: trying both left and right offsets, clustering hazards so you don't end up with ten redundant waypoints, and adding a sanity check that rejects detours more than 60% longer than the direct route all turned out to matter.
What's next for Baze
The most important missing piece is real GPS. With a proper GPS module (or even phone GPS over Bluetooth), every hazard gets pinned to its exact location.
Right now one bike builds the map. If ten bikes are running Baze, the map becomes much more useful much faster — and hazards get correlated across multiple passes to confirm severity.
We also want to close the loop with UC Davis Facilities. The automated work order generation is already there in concept; wiring it to an actual email or API endpoint would make the repair pipeline real.
Longer term, an iOS/Android app could let any cyclist contribute sensor data from their phone's accelerometer — no hardware required — and receive hazard warnings before they reach them.
Built With
- accelerometer
- arduino
- c++
- camera
- cartodb
- fastapi
- javascript
- leaflet.js
- mongodb
- osrm
- python