Inspiration

Our project was inspired by the false assumption that immersive experiences are available to everyone, when in practice they are accessible mainly to able-bodied people. The prevailing assumption within the world of immersive technology is that every user has a full range of motion. This leads to a lack of integration of assistive technology for individuals with different levels of mobility.

Controllers demand the use of limbs, hands, and fingers, which is not an option for people affected by quadriplegia. Our goal is to establish a new standard in the field, one that uses binary control, so that people with different abilities have access to WebXR.

What it does

Our project derives a binary input function from the mechanics of sip-and-puff -- technology created in the 1960s for individuals who are unable to use their limbs and who therefore use their mouths to "sip" or "puff" air into a device (usually a straw or tube) linked to hardware or software that enables them to carry out day-to-day tasks: https://www.youtube.com/watch?v=Bhj5vs9P5cw
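In the browser, that entire input surface reduces to a single on/off signal plus timing. Below is a minimal sketch, assuming the spacebar as a stand-in for the pressure switch (many switch interfaces present themselves to the browser as ordinary keyboard, mouse, or gamepad input):

```javascript
// Treat one key as a stand-in for a sip-and-puff switch and time each pulse.
let onSince = null;

window.addEventListener('keydown', (event) => {
  if (event.code === 'Space' && onSince === null) {
    onSince = performance.now(); // signal switched ON
  }
});

window.addEventListener('keyup', (event) => {
  if (event.code === 'Space' && onSince !== null) {
    const durationMs = performance.now() - onSince; // signal switched OFF
    onSince = null;
    console.log(`binary pulse lasted ${durationMs.toFixed(0)} ms`);
  }
});
```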

How we built it

We used JavaScript and A-Frame to create the VR experience, along with Maya for reducing poly count and editing models, Rhino for building free models to scale, and Google Drive for notes and a slide presentation. All models are integrated into A-Frame.
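As an illustration of that integration, here is a minimal A-Frame scene; the A-Frame version and the model path models/room.gltf are placeholders, not our actual assets:

```html
<!-- Minimal A-Frame scene with one imported model (paths are placeholders). -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-assets>
        <!-- glTF exported after poly-count reduction in Maya -->
        <a-asset-item id="room" src="models/room.gltf"></a-asset-item>
      </a-assets>
      <a-entity gltf-model="#room"></a-entity>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```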

Challenges we ran into

We were challenged to learn more efficient ways of executing our tasks and to design input methods that treat breathing as the baseline trigger for interaction. We also had to learn how to integrate binary input mechanics into VR in a way that is independent of any particular assistive technology. Converting files between formats has been a recurring challenge throughout.

Accomplishments that we're proud of

We explored the possibilities of creating access to VR for people with different physical abilities and of introducing a change within the field of VR for the greater good. We built a functioning first version of an A-Frame component, "binary-controls", which will be released to the A-Frame registry to coexist with more common input controls such as gamepad, keyboard, or hand controllers.

We defined three basic timed binary input sequences as sufficient mechanics to navigate and interact within VR (a code sketch follows this list):

- Click Event to dispatch a click (click): on - [0ms - 500ms] - off
- Skip/Tab Event to change focus (long-click): on - [500ms - 1000ms] - off
- Toggle Event to change input mode interaction<>locomotion (click&hold): on - [3000ms - 4000ms] - off

Slides: https://docs.google.com/presentation/d/16rS1c6x0khgrrDCbxteArcZk8oBHtkEyH-wlqHWvQLA/edit?usp=sharing
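Here is a minimal sketch of the timing logic behind those three sequences, written as an A-Frame component. This is a simplified reconstruction, not the published "binary-controls" source: the component name, the event names (binaryclick, binaryskip, binarytoggle), and the spacebar stand-in for the sip-and-puff switch are all placeholders.

```javascript
// Classify one on/off pulse into the three timed sequences defined above.
AFRAME.registerComponent('binary-controls-sketch', {
  init: function () {
    this.onSince = null;

    window.addEventListener('keydown', (event) => {
      if (event.code === 'Space' && this.onSince === null) {
        this.onSince = performance.now(); // switch ON
      }
    });

    window.addEventListener('keyup', (event) => {
      if (event.code !== 'Space' || this.onSince === null) { return; }
      const held = performance.now() - this.onSince; // ms the switch was ON
      this.onSince = null;

      if (held <= 500) {
        this.el.emit('binaryclick');   // Click: dispatch a click on the focused entity
      } else if (held <= 1000) {
        this.el.emit('binaryskip');    // Skip/Tab: move focus to the next entity
      } else if (held >= 3000 && held <= 4000) {
        this.el.emit('binarytoggle');  // Toggle: switch interaction <-> locomotion
      }
    });
  }
});
```

An entity carrying the component can then react with ordinary listeners, e.g. el.addEventListener('binarytoggle', ...), to swap between focus traversal and movement.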

What we learned

Navigating VR in the embodiment of a quadriplegic user gives you a new sense of the time and determination it takes to execute simple tasks that able-bodied users are likely to overlook. The demo makes you more patient and empathetic, but it also opens up opportunities for social inclusion for quadriplegic users in VR.

What's next for AccessibleLocomotionWebXR

After a few performance and animation updates, we will publish the component on the A-Frame registry. We will test the component with quadriplegic users in collaboration with EqualEntry, hosts of A11YNYC. A further projection for Accessible Locomotion XR is an AR component… We are currently creating a model for the VR experience. Eventually, we hope that the methods we introduce here are adopted across a broad range of XR experiences and create an inclusive environment for locomotion.

Built With

a-frame, javascript, maya, rhino
