Inspiration
We track almost everything about our health: steps, sleep, heart rate. But we spend 80% of waking hours in conversation with people, and we measure nothing about it. Not the quality. Not whether we were present. Not whether the person across from us felt heard. The loneliness epidemic is about more than just being alone. It’s about relationships where conversations seem superficial or where people talk but don’t truly connect. That gap is what inspired us.
What it does
Conversense makes the quality of human conversation visible. Using only your phone's existing sensors and an Apple Watch, it tracks four real human senses (interoception, chronoception, audition, and social-emotional sensing) and translates them into a single ambient experience. At its center is a living hourglass that responds to conversation quality in real time. After every session it surfaces one insight, one pattern, one moment worth remembering. No recordings. No words captured. Ever.
How we built it
We started with research. Secondary sources pointed to a clear crisis: loneliness doubling since the 1980s, employees leaving jobs over poor communication, and a generation that is always connected but rarely present.
We then surveyed 25 participants to understand how people perceive their own conversations. 60% didn't notice a relationship drifting until it was too late. 80% couldn't describe their own conversational patterns without someone else pointing them out first.
That confirmed the core problem. People don't lack the desire to connect. They lack the mirror.
From there we mapped what a phone could actually detect without invasive hardware. We designed the full experience in Figma around three real human stories.
Challenges we ran into
The biggest challenge was bringing the hourglass to life. We had a clear vision of sand moving, responding, breathing with the conversation, but translating that into a prototype was harder than we expected. We tried Figma Make, hit walls, tried again, and eventually scrapped everything and rebuilt it from scratch in Figma.
Measuring time perception was another puzzle. How do you ask someone how long a conversation felt without breaking the moment? We went through several ideas before landing on a simple visual: letting people select how the conversation felt rather than type a number.
We also spent a lot of time debating which metrics actually mattered. There were so many signals we could surface. We kept going back and forth until we landed on three that felt honest and human: safety, connection, and equity.
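The prototype lives in Figma, not code, but the way those three metrics could feed the hourglass can be sketched. This is a purely illustrative Python sketch under our own assumptions: the equal weighting, the thresholds, and the `hourglass_flow` state names are hypothetical, not anything the team shipped.

```python
def conversation_quality(safety: float, connection: float, equity: float) -> float:
    """Combine three normalized signals (each 0.0-1.0) into one quality score.

    Equal weighting is an illustrative choice: no single signal dominates.
    """
    for name, value in (("safety", safety), ("connection", connection), ("equity", equity)):
        if not 0.0 <= value <= 1.0:
            raise ValueError(f"{name} must be in [0, 1], got {value}")
    return (safety + connection + equity) / 3


def hourglass_flow(quality: float) -> str:
    """Map the score to a coarse state a sand animation could respond to."""
    if quality >= 0.7:
        return "flowing"    # sand moves smoothly: the conversation is alive
    if quality >= 0.4:
        return "trickling"  # intermittent flow: partial presence
    return "stalled"        # sand barely moves: disconnection


print(hourglass_flow(conversation_quality(0.9, 0.8, 0.6)))  # flowing
```

Collapsing everything to one score and three coarse states mirrors the design goal above: one thing to feel per screen, not a dashboard of charts.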
Accomplishments that we're proud of
We built something that genuinely doesn't exist yet. There are tools for sales calls and customer service conversations, but nothing built purely for the people in your life. That felt worth being proud of.
The sand visualization was something we weren't sure was even possible. Getting it to feel alive, not mechanical, took a lot of attempts. When it finally worked it was a real moment for the team.
We're also proud of the emotional precision in the copy. And honestly, we're proud of what we left out. It would have been easy to fill every screen with charts and metrics. Instead every screen has one thing to feel, one thing to read, one thing to do.
What we learned
We learned that safeguards are not a feature you add at the end. Every decision we made around what the app listens to, what it stores, and what it never touches shaped the entire design. Getting that right early changed how we thought about everything else.
We learned Figma Make the hard way. It taught us a lot about what AI tools can and can't do right now.
We also learned that prototyping is never really done. Every version taught us something the previous one couldn't. The app you see now looks nothing like where we started and that's a good thing.
But the biggest lesson was about storytelling. We spent as much time on the three human stories as we did on the screens themselves. And every time we presented it through a real person, a real moment, a real conversation that had gone quiet, people got it immediately. The story was the design.
What's next for Conversense
The natural next step for Conversense isn't in your pocket. It's in your room. Imagine a small ambient object that sits on your coffee table, your desk, your bedside. No screen. No notifications. Just a soft glow that responds to the quality of the conversation happening around it.
Built With
- claude
- figjam
- figma
- figmamake
- figmaslides