Inspiration

Around 50% of autistic individuals and many people with ADHD experience alexithymia, a reduced ability to identify and name emotions. Most mental health apps open with "how are you feeling?" For someone with alexithymia, that question goes nowhere. We wanted to start somewhere more accessible: the body. Soma asks where you feel it, not what you feel.

What it does

Soma is a mobile-first PWA for neurodivergent individuals who struggle to name their emotions. Users tap an interactive body silhouette to mark where they notice physical sensations, then select descriptor qualities for each region: whether it feels tight or loose, heavy or light, still or spreading. Soma sends that sensation map to an AI model, which returns two or three emotional hypotheses framed as possibilities rather than conclusions. Sessions are saved so users can track patterns over time.
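To make the flow concrete, here is a minimal sketch of what a session payload might look like. The type and field names are illustrative, not Soma's actual schema:

```typescript
// Hypothetical shape of a Soma session payload (names are illustrative).
type Quality = "tight" | "loose" | "heavy" | "light" | "still" | "spreading";

interface SensationRegion {
  region: string;       // e.g. "chest", "stomach", "shoulders"
  qualities: Quality[]; // descriptor qualities the user selected
}

interface SensationMap {
  regions: SensationRegion[];
  recordedAt: string; // ISO timestamp, persisted with the session
}

// Example: a user marks a tight, heavy chest and a spreading sensation
// in the stomach. This object is what gets sent to the AI model.
const session: SensationMap = {
  regions: [
    { region: "chest", qualities: ["tight", "heavy"] },
    { region: "stomach", qualities: ["spreading"] },
  ],
  recordedAt: new Date().toISOString(),
};
```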

How we built it

Next.js 14 with App Router, Tailwind CSS, Supabase for auth and persistence, and an AI model accessed through server-side API routes. The body silhouette is an interactive SVG with pointer event handling. A hard-coded crisis safety layer runs before every AI call, independent of the model. Deployed on Vercel.
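The safety layer runs as a plain pre-check before the model is invoked, so it cannot be affected by model behavior. A minimal sketch of the idea, assuming it scans user-entered text against a hard-coded pattern list (the patterns and message here are illustrative, not Soma's actual lists):

```typescript
// Minimal sketch of a hard-coded crisis check that runs before every AI
// call. Pattern list and helpline message are illustrative placeholders.
const CRISIS_PATTERNS: RegExp[] = [
  /suicid/i,
  /self[- ]harm/i,
  /hurt myself/i,
];

interface SafetyResult {
  safe: boolean;
  message?: string; // shown instead of the AI response when a pattern matches
}

function crisisCheck(userText: string): SafetyResult {
  for (const pattern of CRISIS_PATTERNS) {
    if (pattern.test(userText)) {
      return {
        safe: false,
        message: "If you're in crisis, please reach out to a local helpline.",
      };
    }
  }
  return { safe: true };
}
```

In the API route, a non-safe result returns immediately, so the model is never called on that input and the crisis response never depends on the model.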

Challenges we ran into

Mobile compatibility took more time than expected. Getting the SVG body map to behave consistently across iOS Safari, Android Chrome, and desktop required significant debugging around pointer events, tap target sizing, and viewport scaling. iOS Safari's PWA quirks around safe-area insets and touch handling were the biggest sink.
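Part of the viewport-scaling debugging comes down to mapping a pointer event's client coordinates into the SVG's viewBox space. In the browser `getScreenCTM()` can do this, but doing the math by hand behaves more predictably when CSS scales the SVG. A sketch of that mapping (interface and function names are illustrative):

```typescript
// Map a pointer event's client coordinates into SVG viewBox coordinates,
// so taps land on the same body region regardless of rendered size.
interface Rect { left: number; top: number; width: number; height: number; }
interface ViewBox { x: number; y: number; width: number; height: number; }

function clientToViewBox(
  clientX: number,
  clientY: number,
  bounds: Rect,     // from svgElement.getBoundingClientRect()
  viewBox: ViewBox, // the SVG's viewBox attribute
): { x: number; y: number } {
  // Normalize to [0, 1] within the rendered element,
  // then scale into viewBox units.
  const nx = (clientX - bounds.left) / bounds.width;
  const ny = (clientY - bounds.top) / bounds.height;
  return {
    x: viewBox.x + nx * viewBox.width,
    y: viewBox.y + ny * viewBox.height,
  };
}
```

For example, a tap at the center of a 100x100 px rendered SVG with a `0 0 200 400` viewBox maps to viewBox point (100, 200).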

Accomplishments that we're proud of

Shipping a working end-to-end loop in under 24 hours. The AI output framing landed well in testing; hypotheses feel grounded in the user's actual input rather than generic. The crisis safety layer is fully independent of the AI model, which was a deliberate and important design decision.

What we learned

The hardest part was the language. In a tool for this population, word choice matters more than in most products. "This sometimes accompanies X" reads completely differently than "you are feeling X." We also learned to test on iOS Safari first.

What's next for Soma

Longitudinal pattern analysis, where the AI identifies recurring connections between specific sensations and emotional states across sessions. A guided interoception check-in mode. An anonymous session option for users who prefer not to create an account. Eventually, a shareable summary users can bring to therapy sessions.

Built With

Next.js, Tailwind CSS, Supabase, Vercel