Inspiration
Inspired by a 17-year-old who deployed Stockfish into the Meta glasses, we instead implemented a facial-recognition multi-agent AI system that enables you to rizz up your crush.
What it does
Imagine this: You're talking to your crush and you don't know what to say. You start sweating, your knees weaken, your stomach drops. But, FEAR NOT because Rizz Khalifa has entered the chat (literally inside your glasses).
With one click, AI agents are instantly deployed straight into your ears, feeding you lines. Here's how it works:
- Facial recognition identifies who you are talking to (yes, it remembers and saves your situationship's face).
- Our speaker diarization models analyze who's actually talking, so it won't accidentally tell you to say "haha that's so funny" when you're the one who just made the joke (we've all been there).
- Agents search your conversation history and user-generated knowledge bases for context.
- A sentiment analyzer rates your conversation (so you know if you're winning or friend-zoned :/).
- Our web search agents grab up-to-date info on any topic you have no clue about. She mentions Taylor Swift's latest album? You're now a die-hard Swiftie.
- The best part? It speaks directly into your ear through the Meta Ray-Bans, so only you can hear it.
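The diarization gate and retrieval step above can be sketched in a few lines. This is a minimal illustration, not the actual codebase: `Segment`, `suggest_line`, and the tiny knowledge dict are all hypothetical stand-ins for the real diarization, facial-recognition, and RAG components.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Segment:
    speaker: str  # "user" or an identified face/voice label
    text: str

# Stand-in for the RAG knowledge base (illustrative only).
KNOWLEDGE = {
    "taylor swift": "Her latest album dropped recently; mention a favorite track.",
}

def suggest_line(segment: Segment, user_label: str = "user") -> Optional[str]:
    # Diarization gate: never coach the user on their own words.
    if segment.speaker == user_label:
        return None
    # Naive retrieval: match topics mentioned in the other speaker's line.
    for topic, context in KNOWLEDGE.items():
        if topic in segment.text.lower():
            return context
    return "Ask a follow-up question."

print(suggest_line(Segment("crush", "Have you heard Taylor Swift's new album?")))
print(suggest_line(Segment("user", "haha that's so funny")))
```

The key design point from the write-up is the early return: suggestions are only generated for segments the diarizer attributes to someone other than the wearer.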
Rizz Khalifa is the ultimate C2C SaaS, because everyone deserves a chance (even CS kids).
How we built it
- Backend: FastAPI, LangChain, LangGraph, Supabase, Groq, VADER Sentiment
- Models: text-embedding-3-small, llama-3.3-70b-versatile, whisper-large-v3, gpt-4o-mini
- Frontend: Flutter, Dart
- Features: Voice diarization, facial recognition, RAG, real-time low-latency multi-agent system, sentiment analysis
- Tools: Cursor
Challenges we ran into
Speech-to-text models have severe hallucination issues, a well-documented problem with Whisper. These hallucinations interfered with our downstream models' outputs, so a lot of time was spent filtering out hallucinated words.
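One common mitigation is pattern-filtering transcript segments against phrases Whisper is known to hallucinate on silence or noise (e.g. "Thanks for watching"). A minimal sketch, with the phrase list as an assumption rather than the project's actual filter:

```python
import re

# Illustrative patterns; Whisper often emits these on silence or music.
HALLUCINATION_PATTERNS = [
    r"thanks? for watching",
    r"don'?t forget to subscribe",
    r"^\s*you\s*$",  # a lone "you" is a frequent silence artifact
]

def filter_hallucinations(segments):
    """Drop transcript segments matching known hallucination patterns."""
    clean = []
    for text in segments:
        if any(re.search(p, text, re.IGNORECASE) for p in HALLUCINATION_PATTERNS):
            continue
        clean.append(text)
    return clean

print(filter_hallucinations(["Hey, how's it going?", "Thanks for watching!", "you"]))
# -> ["Hey, how's it going?"]
```

A real pipeline would likely combine this with confidence thresholds or voice-activity detection rather than string matching alone.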
Accomplishments that we're proud of
Deploying AI agents that work directly with the Meta Ray-Ban glasses is a product that has yet to be made. Due to limitations of the glasses, video feeds could only be extracted from live streams, making this an extremely difficult challenge.
What we learned
- Establishing a connection between a mobile app and an LLM
- Building a server that enables both the mobile app and LLMs to read from and write to Supabase
What's next for RizzKhalifa Glasses
Rewriting the codebase to allow customizable LLMs to be run sequentially or in parallel is the next task for RizzKhalifa. This would create a platform that enables other developers to easily build custom LLM workflows for the Meta Ray-Bans.
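The "run models in parallel" idea can be sketched with stub backends and a thread pool. The two stub models are assumptions for illustration; real backends would wrap Groq or OpenAI API clients behind the same callable interface.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical pluggable backends: any callable taking a prompt works.
def model_a(prompt: str) -> str:
    return f"[a] {prompt.upper()}"

def model_b(prompt: str) -> str:
    return f"[b] {prompt[::-1]}"

def run_parallel(models, prompt):
    """Fan one prompt out to all registered models and collect results in order."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(m, prompt) for m in models]
        return [f.result() for f in futures]

print(run_parallel([model_a, model_b], "hi"))
# -> ['[a] HI', '[b] ih']
```

Keeping the model interface to a plain callable is what would let third-party developers drop in their own LLMs without touching the orchestration layer.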