Yume: Dream in color. Wake in clarity.
🌙 What inspired us
We kept noticing the same gap: the people who need emotional support most are exactly the ones who can't access it. Journaling requires insight you don't have yet. Therapy requires language you haven't found. Meditation requires stillness that feels impossible.
Meanwhile, every night, the brain is doing the emotional work the waking mind avoids — processing stress, grief, and unresolved feeling during REM sleep. That data has always existed. No tool had ever tried to read it.
Yume started from a simple question: what if it could?
✨ What we built
Yume is a biosensor sticker and companion app that captures emotional patterns during sleep and surfaces them each morning in a way that's gentle, honest, and actionable. The sticker reads brainwave activity at the temples. EEG data is processed on-device overnight, converting raw neural signals into an emotional landscape — not dream narration, but emotional texture. How you actually felt, not what happened.
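To make the pipeline concrete, here is a minimal sketch of the kind of mapping the overnight processing could perform. Yume was prototyped in Figma, so this is purely illustrative: the function name `emotional_texture`, the theta/beta band-power inputs, and the thresholds are all our invented assumptions, not the actual on-device algorithm.

```python
# Illustrative sketch only. A real implementation would compute band
# powers from a full night of raw EEG on-device; here we assume two
# pre-computed band-power values (hypothetical inputs).

def emotional_texture(theta: float, beta: float) -> tuple[str, str]:
    """Map a theta/beta band-power ratio to a (feeling, color) pair.

    A high theta-to-beta ratio is treated here as a proxy for calm,
    diffuse processing; a low ratio as heightened arousal. Both the
    proxy and the thresholds are invented for illustration.
    """
    ratio = theta / beta if beta > 0 else float("inf")
    if ratio >= 2.0:
        return ("settled", "deep blue")
    if ratio >= 1.0:
        return ("stirring", "amber")
    return ("charged", "crimson")

# A night's signal reduces to one feeling + color for the Morning
# Reveal, rather than a narrative of the dream itself.
print(emotional_texture(theta=4.2, beta=1.8))  # → ('settled', 'deep blue')
```

The point of the sketch is the shape of the output: emotional texture as a small, legible summary rather than raw neural data.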
Users wake up to a Morning Reveal: a feeling, a color, and one optional prompt. They can go deeper if they want — or just absorb it and move on.
We also designed three delivery modes (Gentle, Standard, Deep) so users control exactly how much they see, and when. Because emotional data isn't like a step count — it can surface things people aren't ready to face, and we took that seriously.
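The delivery modes amount to a content filter on the Morning Reveal. The sketch below is a hypothetical rendering of that idea: the mode names come from the writeup, but which fields each mode exposes is our assumption for illustration.

```python
# Hypothetical sketch: each delivery mode whitelists which parts of
# the Morning Reveal the user sees. Field assignments are invented.

REVEAL_FIELDS = {
    "gentle":   ["color"],                       # just a hue, nothing to unpack
    "standard": ["color", "feeling"],            # feeling plus color
    "deep":     ["color", "feeling", "prompt"],  # adds the optional prompt
}

def morning_reveal(night: dict, mode: str) -> dict:
    """Return only the fields the user's chosen mode permits."""
    allowed = REVEAL_FIELDS[mode]
    return {k: v for k, v in night.items() if k in allowed}

night = {
    "color": "amber",
    "feeling": "stirring",
    "prompt": "What felt unfinished yesterday?",
}
print(morning_reveal(night, "gentle"))  # → {'color': 'amber'}
```

Structuring the modes as a filter over one underlying record keeps the sensitive data intact on-device while letting the user, not the app, decide how much of it surfaces.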
📝 What we learned
Designing for emotional safety is harder than designing for engagement. Every decision — how a word is phrased, when a prompt appears, how much data is shown — carries real weight when the subject is someone's inner life.
We also learned that people deeply want this. 85% of our 27 survey respondents said they'd want insight into emotional patterns in their dreams. The demand is there. The tool just didn't exist yet.
🎯 Challenges we faced
The hardest design challenge was information delivery: how do you surface something as sensitive as subconscious emotional data without overwhelming or destabilizing the user? The delivery modes system was our answer — but getting the language and UX right took many iterations.
Privacy was also a core constraint from day one. Neural data can reveal mental health conditions, trauma history, and emotional vulnerabilities — often before the user is aware of them. We committed early to local on-device processing, full user data ownership, and zero monetization of emotional data.
👩🏻‍💻 How we built it
We used Figma for UI/UX design and prototyping, and Google Forms to run our user research survey.
🤍 A special thanks
We would like to give a huge thank you to Figma for hosting and Joanna Chen for all the support! We would also like to give a shoutout to our family and friends for their moral support and for helping us with our research! :)
With love, Cindy Chiang & Trinah Maulion ⋆˙⟡♡
Built With
- figma