Through our research, we discovered that many of the limitations experienced by people with Parkinson’s disease can be overcome with perceptual tricks. While walking on a flat surface is often staggered and interrupted, the continuous motion of climbing a staircase, or even the illusion of one, can enable a person with Parkinson’s to walk with a regular gait. And while many people have difficulty extending an arm to interact with an object, the introduction of a moving target provides an active stimulus that prompts a response.

Click here for the inspiration behind our project (3:45 - 5:05).

We thought: What if we take this painted staircase illusion and apply it anywhere in the world?

What it does

Amviewlate (a riff on "ambulate" and "view") assists with two key problems:

Freezing of gait, an ambulatory disturbance that results in slow, shuffled walking

Dyskinesia, the impairment of voluntary movement, particularly without stimuli to trigger reflexes

Amviewlate projects visual cues in augmented reality to facilitate walking and interacting with objects. A staircase illusion helps simulate the act of climbing stairs, enabling continuous motion. A visual cue guides the hand to an intended target. All actions are governed by voice control software to enable full agency in users with limited gesture control. Through Amviewlate, we hope to restore mobility, independence, and confidence in our users.
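As a sketch of the kind of voice control described above, recognized phrases could be mapped to AR cues with a simple keyword dispatcher. The phrases and action names below are illustrative, not the app's actual identifiers:

```java
import java.util.Map;

// Hypothetical sketch: map recognized speech phrases to AR cues.
// Keywords and action names are illustrative, not Amviewlate's real ones.
public class VoiceCommandDispatcher {
    private static final Map<String, String> COMMANDS = Map.of(
            "stairs", "SHOW_STAIRCASE_ILLUSION",
            "catch", "SHOW_MOVING_TARGET",
            "stop", "HIDE_ALL_CUES");

    /** Returns the AR action for a recognized phrase, or null if no keyword matches. */
    public static String dispatch(String recognizedPhrase) {
        String phrase = recognizedPhrase.toLowerCase();
        for (Map.Entry<String, String> entry : COMMANDS.entrySet()) {
            if (phrase.contains(entry.getKey())) {
                return entry.getValue();
            }
        }
        return null;
    }
}
```

Matching on keywords rather than exact phrases keeps the interaction forgiving for users whose speech may also be affected.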

How we built it

For this project, we quickly set up an AR application using Google Cardboard and a marker recognizer library called Vuforia. The next challenge was to integrate speech recognition into the app. There were no free Android speech recognizer assets available on Unity, so we had to build one ourselves by integrating Android plug-ins into the Unity project. Finally, we set up the whole scene in the application to provide a compelling experience for people with Parkinson's disease.
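For illustration, the marker-driven triggering could be modeled as a lookup from Vuforia image-target names to the cue rendered at that spot. The marker and effect names here are hypothetical, not the demo's actual identifiers:

```java
import java.util.Map;

// Hypothetical sketch: each Vuforia image target placed in a space names
// the AR cue to render when it is recognized. Names are illustrative.
public class MarkerEffectMap {
    private static final Map<String, String> EFFECTS = Map.of(
            "hallway_marker", "STAIRCASE_ILLUSION",
            "corner_marker", "STOMP_ANIMATION",
            "table_marker", "MOVING_TARGET");

    /** Returns the effect for a recognized marker, or "NONE" if unmapped. */
    public static String effectFor(String markerName) {
        return EFFECTS.getOrDefault(markerName, "NONE");
    }
}
```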

Challenges we ran into

We ran into several challenges while developing Amviewlate. It was important to us to incorporate voice controls to help people with Parkinson’s (who have trouble with precise gesture controls) navigate the AR experience. However, connecting the Android speech recognizer with Unity and exporting to a phone proved challenging.

Even after building the speech recognition library onto the phone, we realized we couldn’t trigger the AR effects from voice alone: the phone needed a preceding cue before it would listen for certain words or phrases.
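A minimal sketch of that gating logic, assuming a hypothetical wake phrase: utterances are ignored until the cue is heard, and the next utterance is then treated as a command.

```java
// Hypothetical sketch of the preceding-cue gating described above:
// utterances are ignored until a wake phrase arms the gate, then one
// command is accepted. The wake phrase is illustrative.
public class WakeWordGate {
    private static final String WAKE_PHRASE = "hey amviewlate";
    private boolean armed = false;

    /** Feed each recognized utterance; returns a command to execute, or null. */
    public String onUtterance(String utterance) {
        String text = utterance.toLowerCase();
        if (!armed) {
            if (text.contains(WAKE_PHRASE)) {
                armed = true; // now listening for a command
            }
            return null;
        }
        armed = false; // one command per wake cue
        return text;
    }
}
```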

Building the 3D AR effects was also challenging: none of us were skilled at 3D modeling, so finding and adjusting 3D models required some trial and error.

Accomplishments that we're proud of

Our team hacked together a myriad of libraries and technologies to create a unique experience, bringing powerful speech recognition and augmented reality technology to any smartphone via Google Cardboard.

Through guerrilla research of academic papers and testimonials about Parkinson’s disease, as well as paper prototyping of the AR interactions, our team broke down and addressed specific problems unique to Parkinson’s patients. This research guided the three key interactions built into the final AR prototype: the stair illusion for walking steadily on flat surfaces, the stomping animation for making sharp turns, and the ball-catching illusion for grabbing objects.

What we learned

We learned about the rich research literature on assistive technology for Parkinson’s disease. We found several videos and papers describing specific visual cues (such as the staircase illusion) which help with motor control for people with Parkinson’s. While developing the app, we learned which of the cues best lent themselves to being recreated with augmented reality.

We also learned how to apply AR and speech recognition tools to an Android app.

What's next for Amviewlate

While Google Cardboard has the benefit of using lightweight, affordable technology (a smartphone and a mobile headset), its spatial recognition capabilities are limited. In our demo, we got around this by using the Vuforia library to trigger AR effects based on image targets placed throughout a space. However, headsets with spatial recognition would enable us to generate the AR effects dynamically, using computer vision to recognize paths, passageways, and obstacles in an environment. Luckily, better spatial recognition is within reach with the next generation of AR headsets, such as the HoloLens or Metavision.

We are also eager to validate and iterate on our AR prototype through consultations with medical specialists as well as user testing with people with Parkinson’s disease.
