Inspiration
Over 280 million people suffer from depression globally. What's even more alarming? Millions more don't even realize they're emotionally struggling. Emotional literacy is the missing link. The same applies to astronauts: prolonged isolation, confinement, and separation from loved ones can lead to anxiety, depression, sleep problems, and homesickness. Their sleep is disrupted because the ISS's rapid orbit produces 16 sunrises and sunsets every day, throwing off the body clock. Many more factors pile on top: environmental stressors, communication delays, workload, and exhaustion.
What it does
Emotion Overlay via AR: The glasses analyze the facial micro-expressions and voice tone of conversation partners, projecting subtle visual cues (e.g., color patterns) that indicate the detected mood.
Mirror Feedback Mode: Users receive gentle nudges about their own emotional state, detected through micro muscle tension, speech patterns, and blink rate.
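The Mirror Feedback trigger could be sketched roughly like this. Everything here is illustrative: the signal names, thresholds, and the `mirror_feedback` helper are assumptions for the sketch, not our actual on-device logic.

```python
# Hypothetical sketch of the Mirror Feedback Mode trigger.
# Signal names and thresholds are illustrative assumptions only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WearerSignals:
    blink_rate_per_min: float  # elevated blinking as a rough stress proxy
    speech_rate_wpm: float     # rushed speech as a rough arousal proxy
    jaw_tension: float         # normalized 0..1 micro muscle tension

def mirror_feedback(signals: WearerSignals) -> Optional[str]:
    """Return a gentle nudge string, or None if the wearer seems calm."""
    stress_score = 0
    if signals.blink_rate_per_min > 25:  # resting rate is roughly 15-20
        stress_score += 1
    if signals.speech_rate_wpm > 180:    # conversational pace is ~120-150
        stress_score += 1
    if signals.jaw_tension > 0.7:
        stress_score += 1
    # Nudge only when at least two signals agree, to avoid false alarms.
    if stress_score >= 2:
        return "You seem tense - try a slow breath before responding."
    return None
```

Requiring two of three signals to agree is one simple way to keep nudges "gentle" rather than noisy.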
How we built it
We wrote the code in Python, using Gemini Pro to help fix small errors along the way. DeepFace handles face detection and emotion recognition, and Matplotlib renders the emotion graph.
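A minimal sketch of that pipeline, assuming DeepFace's `analyze` API (which returns a list with one dict per detected face, each containing an "emotion" score map) and a Matplotlib bar chart; file names like "frame.jpg" are placeholders, and the helper names are our own:

```python
# Sketch of the emotion pipeline: DeepFace scores a frame,
# Matplotlib plots the result. Helper names are illustrative.
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt

def normalize_scores(emotions: dict) -> dict:
    """Rescale raw per-emotion scores so they sum to 1.0."""
    total = sum(emotions.values())
    return {k: v / total for k, v in emotions.items()}

def plot_emotions(emotions: dict, out_path: str = "emotions.png") -> None:
    """Save a bar chart of per-emotion probabilities to disk."""
    scores = normalize_scores(emotions)
    plt.figure(figsize=(6, 3))
    plt.bar(list(scores.keys()), list(scores.values()))
    plt.ylabel("probability")
    plt.title("Detected emotions")
    plt.tight_layout()
    plt.savefig(out_path)
    plt.close()

def analyze_frame(img_path: str) -> dict:
    """Run DeepFace emotion analysis on one image.

    DeepFace downloads model weights on first use, so the import is
    kept lazy here; analyze() returns a list of per-face dicts.
    """
    from deepface import DeepFace
    result = DeepFace.analyze(img_path=img_path, actions=["emotion"])
    return result[0]["emotion"]

# Usage (requires deepface installed and an image on disk):
# plot_emotions(analyze_frame("frame.jpg"))
```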
Challenges we ran into
Developing AR emotion glasses presents a complex fusion of hardware and psychological hurdles, primarily the "Hidden Face" problem: cameras mounted on the frame have a distorted, close-up perspective of the wearer's facial muscles, making traditional facial recognition models ineffective. Beyond the physical constraints of managing thermal output and battery life in a slim form factor, the system must overcome the subjectivity of human expression, as micro-expressions are easily misinterpreted by AI across different cultural contexts or lighting conditions. These technical difficulties are further compounded by significant privacy and ethical concerns, as the real-time "psychological scanning" of bystanders without consent risks social alienation and sensitive biometric data leaks.
Accomplishments that we're proud of
This project could revolutionize mental health culture by normalizing empathy training; schools, corporations, and families could adopt it at scale. As a startup, it could readily expand into telehealth integration and workplace wellness markets.
What we learned
Through the development of AR emotion glasses, we learned that human connection is too nuanced for sensors alone; we discovered that while AI can track muscle movements, it often misses the "why" behind an emotion, requiring a shift from simple detection to providing contextual social "nudges." We realized that privacy is the ultimate feature, as the technology is only viable if both the wearer and the observer feel psychologically safe, leading us to prioritize local, on-device processing over cloud storage. Finally, we learned that hardware ergonomics are non-negotiable, as even the most powerful emotional insights are useless if the device is too heavy or runs too hot for daily social interaction.
What's next for AR Emotion Detector Glasses
AI-Assisted Social Coaching: Gamifies empathy by suggesting real-time empathetic responses in daily conversations.