Inspiration

In Ontario, every restaurant is inspected by its local public health unit, but almost nobody checks the results before eating. Toronto has DineSafe stickers on restaurant doors. Waterloo Region has nothing. Restaurants here aren't required to display their inspection outcomes. The data exists on a government portal that virtually no consumer has ever visited.

We asked ourselves: what if the data came to you right when you needed it — at the restaurant door, in the 10 seconds before you decide to walk in?

App Clips are designed for exactly this kind of moment: a real-world trigger, a single focused task, instant value, no install. Yet almost every App Clip ever built is a commerce flow — order food, rent a scooter, pay for parking. Nobody has used the format for consumer protection. ClipCheck reframes what an App Clip can be.

What it does

ClipCheck lets anyone scan a QR code at a restaurant entrance and instantly see:

  • Trust Score — An animated 0–100 gauge computed from weighted public health inspection history, color-coded from green (safe) through amber (caution) to red (avoid)
  • Inspection Timeline — A horizontal, interactive timeline of every inspection with pass/conditional/closed status dots
  • Violation Details — Expandable cards for each infraction with severity classification (Minor, Significant, Crucial)
  • AI Safety Advisor — Gemini analyzes the full violation history and produces a plain-English safety assessment with risk level, identified patterns, and actionable advice
  • Personalized Meal Recommendations — AI recommends what to order (and what to avoid) based on the specific violations found, current weather, time of day, and the diner's allergens and dietary restrictions — all without an account or stored preferences
  • Voice Briefing — ElevenLabs reads the full personalized safety summary aloud — one tap, hands-free
  • Nearby Safer Options — If the score is low, surfaces higher-rated restaurants within walking distance
  • QR Scanner — Built-in camera scanner for physical codes at restaurant entrances

No download. No account. No login. Scan → know → decide. Value delivered in under 15 seconds.

How we built it

ClipCheck is built entirely in SwiftUI on the Reactiv ClipCheck simulator framework, with zero external dependencies — no SPM, no CocoaPods, no Carthage.

Data layer: We sourced real inspection records from the Region of Waterloo's open data portal and Toronto's DineSafe dataset, then normalized them into a unified JSON schema with restaurant metadata, inspection dates, compliance statuses, and infraction details with severity levels. Trust scores are computed using a recency-weighted algorithm — the most recent inspection counts for 60%, the second for 25%, the third for 15% — with penalty scaling by severity (Crucial: −15, Significant: −8, Minor: −3) and trend adjustments for improving or declining patterns.
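The recency weighting and severity penalties above can be sketched roughly like this (a minimal illustration, not ClipCheck's exact implementation — type and function names are ours, and the trend adjustment is omitted):

```swift
import Foundation

enum Severity {
    case minor, significant, crucial

    // Penalty per infraction, matching the scaling described above.
    var penalty: Double {
        switch self {
        case .minor:       return 3
        case .significant: return 8
        case .crucial:     return 15
        }
    }
}

struct Inspection {
    let date: Date
    let infractions: [Severity]

    // A single inspection starts at 100 and loses points per infraction.
    var score: Double {
        max(0, 100 - infractions.map(\.penalty).reduce(0, +))
    }
}

/// Recency-weighted trust score: newest inspection counts 60%,
/// the second 25%, the third 15%. Older inspections are ignored.
func trustScore(for inspections: [Inspection]) -> Double {
    let recent = inspections.sorted { $0.date > $1.date }.prefix(3)
    guard !recent.isEmpty else { return 0 }
    let weights: [Double] = [0.60, 0.25, 0.15]
    var total = 0.0, weightUsed = 0.0
    for (i, inspection) in recent.enumerated() {
        total += weights[i] * inspection.score
        weightUsed += weights[i]
    }
    // Renormalize so restaurants with fewer than three inspections
    // still land on the 0–100 scale.
    return (total / weightUsed).rounded()
}
```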

AI engine: We use the Gemini API (gemini-3-flash-preview) with carefully engineered prompts that feed the model the full inspection history alongside four real-time context signals:

  1. Weather via the Open-Meteo API (free, no key) — current temperature and conditions in Waterloo affect food safety risk and what a diner should order
  2. Time of day — classified into five meal periods (breakfast, lunch rush, afternoon, dinner rush, late night), each with distinct food freshness and safety implications
  3. Allergens & dietary preferences — parsed directly from URL query parameters embedded in the QR code, enabling per-scan personalization with zero stored data
  4. Violation pattern analysis — Gemini identifies systemic issues across inspections (e.g., recurring temperature control failures) rather than treating each violation in isolation
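For the weather signal, a minimal keyless Open-Meteo call looks roughly like this (field names match Open-Meteo's `current_weather=true` response shape; the struct and function names are ours):

```swift
import Foundation

// Minimal Open-Meteo fetch for Waterloo (≈43.46 N, −80.52 W).
// Open-Meteo requires no API key.
struct OpenMeteoResponse: Decodable {
    struct CurrentWeather: Decodable {
        let temperature: Double   // °C
        let weathercode: Int      // WMO weather code (0 = clear, etc.)
    }
    let current_weather: CurrentWeather
}

func fetchWaterlooWeather() async throws -> OpenMeteoResponse.CurrentWeather {
    let url = URL(string:
        "https://api.open-meteo.com/v1/forecast?latitude=43.46&longitude=-80.52&current_weather=true")!
    let (data, _) = try await URLSession.shared.data(from: url)
    return try JSONDecoder().decode(OpenMeteoResponse.self, from: data).current_weather
}
```

The decoded temperature and weather code are then folded into the Gemini prompt alongside the other three signals.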

The response is parsed into structured fields: safety summary, risk level, weather-aware tip, time-aware tip, allergen warning, recommended orders, and items to avoid.
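A sketch of how a label-based reply (e.g. `WEATHER_TIP: ...`) can be split into those fields — the label names here are illustrative, not necessarily ClipCheck's exact ones:

```swift
import Foundation

// Parse a labeled, Markdown-free model reply into a field dictionary.
func parseLabeledResponse(_ text: String) -> [String: String] {
    let labels = ["SUMMARY", "RISK_LEVEL", "WEATHER_TIP", "TIME_TIP",
                  "ALLERGEN_WARNING", "RECOMMENDED", "AVOID"]
    var fields: [String: String] = [:]
    var currentLabel: String?
    for line in text.split(separator: "\n", omittingEmptySubsequences: false) {
        let trimmed = line.trimmingCharacters(in: .whitespaces)
        if let label = labels.first(where: { trimmed.hasPrefix($0 + ":") }) {
            // New labeled field: record its value.
            currentLabel = label
            fields[label] = String(trimmed.dropFirst(label.count + 1))
                .trimmingCharacters(in: .whitespaces)
        } else if let label = currentLabel, !trimmed.isEmpty {
            // Continuation lines append to the most recent field.
            fields[label, default: ""] += " " + trimmed
        }
    }
    return fields
}
```

Unknown lines fold into the preceding field, so stray model chatter degrades gracefully instead of breaking the parse.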

Voice: ElevenLabs TTS converts the full personalized briefing into natural speech, with AVSpeechSynthesizer as an offline fallback.
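The fallback chain looks roughly like this (`fetchElevenLabsAudio` and `play` stand in for the app's real networking and playback code):

```swift
import AVFoundation

final class BriefingSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    // Stub: the real version POSTs to the ElevenLabs TTS endpoint.
    func fetchElevenLabsAudio(for text: String) async throws -> Data {
        throw URLError(.notConnectedToInternet)
    }

    // Stub: the real version hands the audio to AVAudioPlayer.
    func play(_ audio: Data) {}

    func speak(_ briefing: String) async {
        if let audio = try? await fetchElevenLabsAudio(for: briefing) {
            play(audio)                                   // normal path: ElevenLabs
        } else {
            let utterance = AVSpeechUtterance(string: briefing)
            utterance.voice = AVSpeechSynthesisVoice(language: "en-CA")
            synthesizer.speak(utterance)                  // offline, on-device fallback
        }
    }
}
```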

Scanning: AVFoundation's AVCaptureMetadataOutput handles real-time QR detection. We also built a QR generator that produces printable codes — including personalized variants with allergen parameters baked into the URL — for live demo use.
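A bare-bones version of that capture pipeline (error handling and permission checks trimmed for brevity):

```swift
import AVFoundation

final class QRScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()
    var onCode: ((String) -> Void)?

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))
        let output = AVCaptureMetadataOutput()
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.qr]   // must be set after addOutput(_:)
        session.startRunning()
    }

    // Called on every frame in which a QR code is recognized.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        if let qr = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
           let payload = qr.stringValue {
            onCode?(payload)   // the decoded ClipCheck URL
        }
    }
}
```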

Design system: Custom card components with material backgrounds, a consistent SF Symbol icon language, trust-level color coding (green #22C55E, amber #F59E0B, red #EF4444), and spring-based animations throughout.
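The trust-level color mapping is a simple threshold function over the gauge score (the 80/50 cutoffs here are illustrative; the hex values are the design system's):

```swift
import SwiftUI

// Map a 0–100 trust score to the design system's traffic-light palette.
func trustColor(for score: Int) -> Color {
    switch score {
    case 80...:
        return Color(red: 0x22/255, green: 0xC5/255, blue: 0x5E/255) // green #22C55E
    case 50..<80:
        return Color(red: 0xF5/255, green: 0x9E/255, blue: 0x0B/255) // amber #F59E0B
    default:
        return Color(red: 0xEF/255, green: 0x44/255, blue: 0x44/255) // red #EF4444
    }
}
```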

Challenges we ran into

The dual-file trap. The Xcode project uses PBXFileSystemSynchronizedRootGroup to auto-compile Swift files from a directory — but we had two copies of the submission files in different locations. We spent hours editing code that compiled successfully but was never actually loaded at runtime. Debugging this required reading the raw project.pbxproj to discover that only one of the two declared sync groups was assigned to the build target.

Gemini prompt engineering. Getting Gemini to return consistently parseable output across different restaurant types and violation profiles required significant iteration. The model would sometimes return Markdown formatting, sometimes skip fields, sometimes hallucinate restaurant-specific menu items it couldn't know. We settled on a strict label-based format (e.g., WEATHER_TIP: ...) with explicit instructions to avoid Markdown, and built a robust fallback system that generates intelligent recommendations from the raw inspection data when the API is unavailable.

Personalization without persistence. The Reactiv challenge asks how to personalize with no user history. Our solution — encoding allergens and dietary preferences directly in the QR code URL — is elegant but required careful coordination between URL generation, parameter parsing, the dietary selector UI, and the AI prompt pipeline. Every component needed to gracefully handle the presence or absence of these parameters.
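The parsing side of that coordination is small; a sketch with illustrative parameter names (`allergens`, `diet`) and URL scheme:

```swift
import Foundation

// Per-scan personalization with zero stored data: preferences ride in
// the QR code's URL and vanish when the Clip closes.
struct ScanContext {
    let allergens: [String]
    let diet: String?

    init(url: URL) {
        let items = URLComponents(url: url, resolvingAgainstBaseURL: false)?
            .queryItems ?? []
        // Absent parameters degrade to an empty/default context.
        allergens = items.first { $0.name == "allergens" }?
            .value?.split(separator: ",").map(String.init) ?? []
        diet = items.first { $0.name == "diet" }?.value
    }
}

// Example invocation URL (scheme and path are illustrative):
// clipcheck://restaurant/123?allergens=peanut,shellfish&diet=vegetarian
```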

Accomplishments that we're proud of

Four-signal contextual personalization with zero stored data. Weather, time, allergens, and violation patterns combine to produce genuinely unique recommendations for every scan. Two people scanning the same restaurant at different times of day with different allergens get meaningfully different advice. This directly answers Reactiv's core challenge.

Real data, real restaurants. Every restaurant in ClipCheck is a real establishment in Waterloo Region or Toronto with real inspection records. During our demo, judges can look up the same restaurants on the Region of Waterloo website and verify the data matches.

Comprehensive fallback architecture. If the weather API is down, we fall back to hardcoded Waterloo conditions. If Gemini is slow or unreachable, we generate assessments directly from inspection records. If ElevenLabs fails, AVSpeechSynthesizer takes over. If the camera doesn't work, manual URL entry and demo cards are ready. The app never breaks — it gracefully degrades.

A novel App Clip category. Commerce, transit, food ordering — every App Clip use case has been done. Consumer food safety at the point of decision is genuinely new.

What we learned

App Clips are underexplored. The constraints (no persistent storage, no login, 15 MB, URL-invoked) sound limiting, but they force a design discipline that produces better experiences. Every feature in ClipCheck delivers value because there's no room for bloat.

Context replaces accounts. The conventional approach to personalization requires user profiles and history. ClipCheck proves that environmental context (weather, time, location) combined with a single URL parameter can produce personalization that feels just as relevant — and far less invasive.

AI works best with structured context. Giving Gemini raw data and saying "analyze this" produces generic output. Giving it structured inspection history alongside weather, time, and dietary signals — and asking for specific labeled fields — produces recommendations that feel genuinely tailored and useful.

What's next for ClipCheck

Provincial scale. Every province in Canada publishes restaurant inspection data in some form. Ontario alone has 90,000+ food premises. ClipCheck's architecture — URL-invoked, data-driven, AI-analyzed — scales to any jurisdiction that publishes inspection records.

Real App Clip deployment. Moving from the Reactiv simulator to a production App Clip target with Apple's App Clip framework, associated domains, and Smart App Banners would make ClipCheck work directly from an NFC tap or a visual code scan in the camera app — no app needed at all.

Live data ingestion. Replacing the bundled JSON with a lightweight backend that pulls from municipal open data APIs in real time would keep inspection records current without app updates.

Expanded context signals. CoreLocation for true proximity-based nearby alternatives. HealthKit integration (with permission) to pull allergen profiles automatically. Crowd-sourced menu data to make dish-level recommendations even more specific.
