Inspiration
Roughly 1 in 10 adults has a food allergy, and more than half of them have experienced a severe allergic reaction. Travelers and immigrants face life-threatening risks when they can't read a local menu or communicate their dietary restrictions, turning a restaurant meal into a hazard and making it hard to try new foods or eat out with friends.
What it does
Point your camera at any menu and take a snapshot. EpiScanner translates each item, highlights risky allergens and ingredients based on your personal allergen profile, recommends dishes that match your preferences, and can even place the order for you in the local language.

- Scan & translate: instant OCR + translation for menus in any language
- Allergen detection: highlights risky ingredients based on your personal profile
- Safe ordering: places orders in the local language on your behalf
- Personalized recommendations: suggests items based on your past orders
How we built it
UI/UX: We designed the basic layout in Figma.

Technical: A mobile-first PWA on Replit, organized as a pnpm monorepo.

- Frontend: React + Vite + TypeScript.
- Backend: Express + pino. A single OpenAPI spec generates the typed client and zod validators, so the front and back stay in sync (see the sketch after this list).
- AI: Gemini 2.0 Flash handles menu OCR, dish-level allergen analysis, and cross-contamination scoring, streamed to the UI over SSE so risk pins drop onto the photo as they're computed.
- Memory: Backboard powers a chat assistant that remembers each traveler across sessions, persisted per-device in MongoDB.
- Voice: ElevenLabs speaks the allergy disclaimer in the local language for the waiter.
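As a rough illustration of the shared-schema idea, here is a minimal sketch of what one of those validators might look like. The field names are hypothetical; in the real project the validators are generated from the OpenAPI spec rather than hand-written:

```typescript
// Hypothetical shared module (e.g. packages/shared/src/menu.ts) sketching the
// single-source-of-truth idea: both the Express routes and the React client
// import this, so a shape change breaks the build instead of failing at runtime.
import { z } from "zod";

// One analyzed dish, as the backend streams it and the client renders it.
export const DishAnalysisSchema = z.object({
  id: z.string(),
  name: z.string(),                 // original-language dish name from OCR
  translatedName: z.string(),      // traveler's language
  allergens: z.array(z.string()),  // matched against the user's allergen profile
  riskLevel: z.enum(["safe", "caution", "avoid"]),
  crossContaminationScore: z.number().min(0).max(1),
  // Pin position on the menu photo, normalized to [0, 1].
  pin: z.object({ x: z.number(), y: z.number() }),
});

export type DishAnalysis = z.infer<typeof DishAnalysisSchema>;
```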
Challenges we ran into
Our first version of the menu scan felt painfully slow: Gemini took 8–10 seconds to return the full analysis of a menu, which is forever when a hungry traveler is staring at their phone at a restaurant table.
Accomplishments that we're proud of
We got that end-to-end scan down to a near-instant feel by streaming Gemini's output dish-by-dish over Server-Sent Events instead of waiting for one big JSON blob. Risk pins now drop onto the photo progressively as each dish is analyzed, so the user sees results in under a second and the perceived latency basically disappears.
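A minimal sketch of what that streaming endpoint could look like, assuming hypothetical `ocrMenu` and `analyzeDish` helpers standing in for the real OCR and per-dish Gemini calls:

```typescript
import express from "express";
import type { DishAnalysis } from "./menu"; // shared schema sketched above

// Placeholder stubs for the two AI steps; the real project calls Gemini 2.0 Flash.
declare function ocrMenu(menuId: string): Promise<{ id: string; text: string }[]>;
declare function analyzeDish(
  dish: { id: string; text: string },
  profile: string[]
): Promise<DishAnalysis>;

const app = express();

app.get("/api/menus/:id/analysis", async (req, res) => {
  // Standard SSE headers: keep the connection open and uncached.
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");
  res.setHeader("Connection", "keep-alive");
  res.flushHeaders();

  const profile = String(req.query.allergens ?? "").split(","); // e.g. "peanut,shellfish"
  const dishes = await ocrMenu(req.params.id); // fast first pass: text + positions only

  // Analyze every dish in parallel and emit each result the moment it resolves,
  // instead of awaiting one big JSON blob for the whole menu.
  await Promise.all(
    dishes.map(async (dish) => {
      const analysis = await analyzeDish(dish, profile);
      res.write(`event: dish\ndata: ${JSON.stringify(analysis)}\n\n`);
    })
  );

  res.write("event: done\ndata: {}\n\n");
  res.end();
});
```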
What we learned
Streaming doesn't just feel faster; it actually is. By moving Gemini to SSE and analyzing dishes in parallel instead of one giant request, total scan time dropped from ~8–10 seconds to ~2, and the first risk pin lands in well under a second. Designing the AI pipeline as a stream from day one was the single biggest win of the weekend.
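On the client side, consuming that stream is a plain `EventSource` subscription. This sketch assumes a hypothetical `addRiskPin` helper for the photo overlay:

```typescript
import { DishAnalysisSchema } from "./menu"; // shared schema from the monorepo

// Hypothetical helper that overlays a pin on the menu photo.
declare function addRiskPin(
  pin: { x: number; y: number },
  riskLevel: "safe" | "caution" | "avoid"
): void;

const menuId = "demo-menu"; // illustrative
const source = new EventSource(`/api/menus/${menuId}/analysis?allergens=peanut,shellfish`);

source.addEventListener("dish", (event) => {
  // The same zod validator the server uses guards the wire format here too.
  const dish = DishAnalysisSchema.parse(JSON.parse(event.data));
  addRiskPin(dish.pin, dish.riskLevel); // pins drop progressively as dishes resolve
});

source.addEventListener("done", () => source.close());
```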
What's next for EpiScanner
- Reaction logging that feeds a personalized risk model, so the assistant learns your body, not just generic allergens.
- Doctor & family sharing, plus an SOS mode with the user's emergency card and the nearest hospital in the local language.
- Verified restaurant partners, so green pins become certified rather than just inferred.
- More languages and cuisines, and a watch / Live Activity for one-tap access during the meal.
Built With
- figma
- gemini
- replit