Contextual – SensAI Hackathon Submission
Inspiration
We’re building toward a world where AI is ambient, not another screen you have to check. Every meaningful moment in life happens in context—where you are, what you’re doing, what you care about. But today’s AI still waits behind prompts, apps, and interfaces.
We wanted to flip that: What if AI met you exactly where you are? What if it whispered the right thing at the right time—based on movement, place, intention, and identity?
With the brand-new Meta Glasses SDK (released Wednesday!) and a weekend inside Frontier Tower, that vision became something we could prototype.
⸻
What it does
Contextual is a proactive, location-aware AI concierge for Meta Smart Glasses.
It delivers just-in-time micro-assistance by understanding:
• Where you are (parking lots, grocery stores, airports, gyms, campuses)
• What you’re doing (walking, entering, exiting, pausing)
• Your personal loops (reminders, shopping items, habits, goals)
• Your trajectory through physical space
Example use case: You pull into the grocery store parking lot → Contextual whispers: “Hey, here’s your list — want to add anything before you head in?”
As you walk inside, it filters by aisle and guides you hands-free.
It’s AI that feels alive, contextual, and with you — not waiting to be asked.
⸻
How we built it
We built 100% of Contextual on-site during the hackathon — starting from an empty Xcode template. No prewritten app, no imported code, no legacy architecture. Everything you see in this demo was designed, engineered, and assembled here at SensAI.
- Integrated the brand-new Meta Glasses SDK (released Wednesday)
We were among the first developers to touch this SDK. We integrated:
• Voice input → iOS app
• Whisper audio output → glasses
• Real-time event triggers
• Hands-free conversational loops
Building a working glasses → app → AI → glasses pipeline in under 48 hours was a major feat.
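To make that concrete, here is a minimal sketch of the app-side half of that loop, using only Apple's public Speech and AVFoundation APIs. The glasses-specific transport is omitted (the SDK is days old), and the `VoiceLoop` type and `ask` callback are illustrative names rather than SDK symbols.

```swift
import Speech
import AVFoundation

// Illustrative sketch: transcribe captured speech, ask the model, speak the reply.
// The glasses act as the active audio route; SDK-specific wiring is omitted.
// Assumes speech-recognition authorization has already been granted.
final class VoiceLoop {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let synthesizer = AVSpeechSynthesizer()

    /// `ask` stands in for the model call: (transcript, reply handler).
    func handle(audioURL: URL, ask: @escaping (String, @escaping (String) -> Void) -> Void) {
        let request = SFSpeechURLRecognitionRequest(url: audioURL)
        _ = recognizer?.recognitionTask(with: request) { result, _ in
            guard let result, result.isFinal else { return }
            ask(result.bestTranscription.formattedString) { reply in
                let utterance = AVSpeechUtterance(string: reply)
                utterance.rate = 0.45  // slightly slow, soft "whisper" pacing
                self.synthesizer.speak(utterance)
            }
        }
    }
}
```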
- Built a new iOS architecture from scratch during the event
We created:
• A LocationService for live GPS + geofencing (sketched below)
• Motion activity + micro-transition detection
• A Context Fusion Layer to interpret movement + environment
• A trigger engine for contextual events
• A voice-first interface optimized for wearables
Every file and subsystem originated on-site.
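As a rough sketch of that location layer: the geofencing and motion pieces map directly onto standard Core Location and Core Motion calls. Class and callback names below are illustrative, not the actual source.

```swift
import CoreLocation
import CoreMotion

// Illustrative LocationService: geofence transitions plus coarse motion states.
final class LocationService: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let activityManager = CMMotionActivityManager()

    /// Called with (regionIdentifier, didEnter) on every geofence transition.
    var onRegionEvent: ((String, Bool) -> Void)?
    /// Called with a coarse label ("walking", "stationary") on activity changes.
    var onActivity: ((String) -> Void)?

    override init() {
        super.init()
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()
    }

    /// Register a circular geofence, e.g. a grocery store parking lot.
    func watch(_ identifier: String, center: CLLocationCoordinate2D, radius: CLLocationDistance) {
        let region = CLCircularRegion(center: center, radius: radius, identifier: identifier)
        region.notifyOnEntry = true
        region.notifyOnExit = true
        locationManager.startMonitoring(for: region)
    }

    /// Stream coarse motion states for micro-transition detection (walking → pausing).
    func startMotionUpdates() {
        guard CMMotionActivityManager.isActivityAvailable() else { return }
        activityManager.startActivityUpdates(to: .main) { [weak self] activity in
            guard let activity else { return }
            if activity.walking { self?.onActivity?("walking") }
            else if activity.stationary { self?.onActivity?("stationary") }
        }
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        onRegionEvent?(region.identifier, true)
    }

    func locationManager(_ manager: CLLocationManager, didExitRegion region: CLRegion) {
        onRegionEvent?(region.identifier, false)
    }
}
```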
- Designed and built the “Little Loops” engine
A lightweight memory system that captures micro-intents like: “Add olive oil… remind me to stretch… get vitamin D.”
It tags and triggers these loops automatically based on place and motion.
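A minimal sketch of how such a loop store might look, assuming a simple tag-matching model (`Loop`, `LoopEngine`, and the tag names are illustrative choices, not the shipped code):

```swift
// Illustrative "Little Loops" model: capture a micro-intent, then surface it
// when the current place/motion context matches its tags.
struct Loop {
    let phrase: String      // e.g. "Add olive oil"
    let placeTag: String?   // e.g. "grocery"; nil acts as a wildcard
    let motionTag: String?  // e.g. "entering"; nil acts as a wildcard
}

final class LoopEngine {
    private var loops: [Loop] = []

    func capture(_ phrase: String, place: String? = nil, motion: String? = nil) {
        loops.append(Loop(phrase: phrase, placeTag: place, motionTag: motion))
    }

    /// Loops whose tags match the current context.
    func triggered(place: String, motion: String) -> [Loop] {
        loops.filter {
            ($0.placeTag == nil || $0.placeTag == place) &&
            ($0.motionTag == nil || $0.motionTag == motion)
        }
    }
}
```

Capturing “Add olive oil” with a grocery tag and later calling `triggered(place: "grocery", motion: "entering")` is what lets the parking-lot whisper surface the list at the right moment.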
- Rapid backend scaffolding
We added a minimal datastore for contextual memories and retrieval based on geospatial cues.
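In production this sits behind the hosted datastore, but the retrieval logic reduces to a distance filter. A minimal in-memory sketch (type and function names are illustrative):

```swift
import CoreLocation

// Illustrative geospatial retrieval: contextual memories filtered by distance
// from the user's current location.
struct ContextualMemory {
    let text: String
    let latitude: Double
    let longitude: Double
}

/// Memories within `meters` of the user's current location.
func memories(near user: CLLocation, within meters: CLLocationDistance,
              from store: [ContextualMemory]) -> [ContextualMemory] {
    store.filter { memory in
        let point = CLLocation(latitude: memory.latitude, longitude: memory.longitude)
        return point.distance(from: user) <= meters
    }
}
```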
- Semantic AI orchestration
Sensor data → meaning → whisper. Our system translates physical-world signals into semantic prompts for the model, making AI feel like a quiet, observant companion.
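As a sketch of that translation step, a context snapshot can be folded into a model prompt along these lines (structure and wording are illustrative, not the exact production prompt):

```swift
// Illustrative prompt assembly: physical-world signals become a short,
// structured instruction for the model.
struct ContextSnapshot {
    let place: String         // e.g. "grocery store parking lot"
    let motion: String        // e.g. "just parked"
    let openLoops: [String]   // e.g. ["Add olive oil", "Get vitamin D"]
}

func whisperPrompt(for context: ContextSnapshot) -> String {
    """
    You are a quiet, proactive concierge speaking through smart glasses.
    The user is at: \(context.place), currently: \(context.motion).
    Open intents: \(context.openLoops.joined(separator: "; ")).
    Reply with one short sentence: the single most helpful whisper right now.
    """
}
```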
⸻
Accomplishments that we’re proud of
• Building an entire iOS app from scratch on-site in two days
• Integrating a Meta SDK that was released two days before the hackathon
• Creating a working prototype of ambient, contextual AI for real-world use
• Crafting a new interaction model: AI that moves with you instead of waiting for input
• Developing a reusable geospatial intelligence layer for future world experiences
• Delivering a magical whisper-based demo that feels like the future
⸻
What we learned
• AI becomes exponentially more powerful when it stops waiting to be prompted
• Wearables, not phones, are the natural home for contextual AI
• People want frictionless experiences, not more apps
• The intelligence of an assistant comes from timing, not volume
• Quietness is a design principle
• Context isn’t just metadata; it’s the new interface
⸻
What’s next for Contextual
• Deep integration with Meta Glasses spatial anchors
• On-device micro-models for faster contextual inference
• A general-purpose Context Graph to learn patterns over time
• Retail aisle-level experiences with partner stores
• Travel + airport navigational whispering
• University campus mode
• Festival and theme park versions (in partnership with world apps)
• Open Contextual as a developer platform for ambient AI
Ultimately, Contextual becomes the operating system for real life — an AI that understands the world you’re walking through and supports you without pulling you out of it.
Built With
- avfoundation
- combine
- core-location/motion
- github
- meta-smart-glasses-sdk
- openai-api
- supabase/firebase
- swift/swiftui
- userdefaults/sqlite
- xcode-16
