Inspiration
While researching human sensory experience for this brief, one of our team members came across a medical case involving proprioception, the sense that tells your brain where your body is in space. That led us to somatoparaphrenia, a condition in which someone looks at their own limb and genuinely doesn't recognize it as their own. The limb is physically there, no amputation, no injury, but the brain has stopped claiming it. Most people have never heard of it, and that's exactly what drew us to it. What struck us was that the condition came up in academic contexts, in classes and presentations, but never as something anyone was actively designing for in the real world. That gap felt worth closing.
What it does
Veira is a clinical AR therapy system for people with somatoparaphrenia. It pairs AR glasses with a companion mobile app to guide patients through progressive body-ownership exercises grounded in the rubber hand illusion. The glasses overlay a prosthetic rendering onto the patient's real limb, and through repeated guided motor tasks the brain is gradually encouraged to reclaim it. Sessions are structured in phases, each unlocked by the patient's clinician when they're ready to progress. The app tracks every session, logs neurological metrics, and gives the patient a clear picture of their recovery over time. If distress occurs during a session, the glasses recognize it and the app guides the patient through breathing support before returning them to where they left off. And because recovery can feel isolating, Veira includes a community space where patients can connect with others navigating the same condition, so no one has to go through it alone.
How we built it
We designed the full system in Figma: the AR glasses interface and the companion mobile app. We then built a working interactive prototype using Figma Make. The prototype runs a live use case showing a complete session: pairing, guided motor tasks, a distress moment, breathing support, and a clinician-gated phase transition. Everything in the demo is functional.
Challenges we ran into
The hardest part was designing for something you can't fully simulate. AR glasses at this fidelity don't exist yet, and somatoparaphrenia is rare enough that there is very little established treatment to reference. We had to make clinical design decisions without clinical backgrounds, balancing medical credibility against something a real patient could actually use on their own. Designing the relationship between the clinician and the patient inside the app was also genuinely difficult: getting that power dynamic right, where the clinician has meaningful control without the patient feeling passive in their own recovery, took a lot of iteration.
Accomplishments that we're proud of
We're proud that the prototype actually works: the full session flow (pairing, guided tasks, distress support, breathing, and phase transitions) is functional and interactive. We're also proud of the physical-to-AR continuity detail: the real arm casing mirrors the AR rendering at each phase, so the patient is never jarred when the glasses come off. That wasn't in the brief; it came from genuinely sitting with what it would feel like to be Noah. And honestly, we're proud of choosing this condition at all. It would have been easy to design for something more familiar. Somatoparaphrenia is rare, underserved, and invisible. That's exactly why it needed this.
What we learned
We learned how much the brain can be retrained through repeated multisensory experience. The rubber hand illusion gave us the scientific foundation: if the brain can be tricked into claiming a fake hand through synchronized touch and vision, it can also be guided back toward claiming its own. But one of the most important things we realized during the design process goes beyond the screen. In our phase system, the AR glasses render different prosthetic overlays: a robotic arm in early phases, a more lifelike prosthetic sleeve as the patient progresses. We built in a physical counterpart: the case on the patient's real arm mirrors what the AR is showing at each phase, so when they take the glasses off they're not suddenly confronted with something unfamiliar. The real and the virtual stay close enough that the brain isn't shocked. The progression is deliberate: the AR leads, the physical follows, and over time the gap between what Noah sees and what is actually there closes. That felt like the most human insight we had in this whole project.
What's next for Veira
The immediate next step is clinical validation: Veira needs to be tested with neurologists and occupational therapists to stress-test the phase structure and session design against real treatment protocols. From there, the AR hardware itself needs to exist: purpose-built glasses that can deliver the rendering fidelity the therapy requires. We also want to expand the community layer of the app, connecting patients across geographies who are navigating the same condition, because right now most people with somatoparaphrenia have never met anyone else who has it. Long term, the same system could extend to related conditions like asomatognosia and xenomelia, anywhere the brain's sense of body ownership has broken down and needs a structured path back.
Built With
- figma
- figma-make
