The Problem

There is currently a dual problem when user-testing VR experiences. First, users find it difficult to give quick, location-based feedback while they are in VR, and user researchers lack the tools to receive that feedback. Second, people in the accessibility and differently-abled community often cannot give feedback at all, which cuts development teams off from this community.

Inspiration

To make VR experiences more accessible, we need to hear from people with different needs through user testing. However, user testing for VR is difficult because most studies borrow methods from screen-based products, such as filling out a survey after the VR session or asking users to recall the experience afterwards. This ignores the isolating nature of VR, which makes it hard for researchers to understand the context behind a user's note.

Therefore, our goal is to create a tool that lets user testers provide timely feedback on VR projects through annotation, in an efficient and accessible way.

The Solution

Our project lets users give feedback directly in VR and lets user researchers view that feedback asynchronously on a web-based dashboard. We've intentionally targeted a three-pronged approach:

  • Lightweight: We've created a lightweight Unity package that allows any VR team to import our assets into their projects.
  • Inclusive: User testers can use a variety of accessibility layers, including speech-to-text voice input, keyboard input, and VR screen reader support.
  • Insightful: User researchers can view a screenshot of the moment each piece of feedback was created, alongside the feedback itself in text form.

This focus allows us to make sure our tool is quick to import and accessible to use.
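
To give a sense of how the drop-in integration is meant to feel, here is a minimal sketch (with hypothetical class and method names, not our actual package API) of the kind of component a team could add to their scene after importing the package:

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch: a drop-in component that pairs a tester's note with a
// screenshot and the tester's position at that moment.
public class VRFeedbackAnnotator : MonoBehaviour
{
    // Called by whichever input layer produced the note (voice, keyboard, ...).
    public void SubmitFeedback(string note)
    {
        StartCoroutine(CaptureAndStore(note));
    }

    private IEnumerator CaptureAndStore(string note)
    {
        // Wait until rendering finishes so the screenshot matches what the
        // tester was actually looking at when they left the note.
        yield return new WaitForEndOfFrame();

        Texture2D shot = ScreenCapture.CaptureScreenshotAsTexture();
        byte[] png = shot.EncodeToPNG();
        Destroy(shot);

        Debug.Log($"Captured note \"{note}\" at {transform.position} " +
                  $"({png.Length} bytes of screenshot)");
        // In the real tool, the note, screenshot, and location are queued for
        // the researcher dashboard.
    }
}
```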

How We Built It

After coming up with a first round of ideas related to accessibility, we interviewed the XR accessibility interest group as well as RealityHack speakers and mentors. We gradually realized that there is a gap between VR development teams and the user community, especially the accessibility community.

With this feedback, we used Figma/FigJam to iterate on ideas and Unity 2020 LTS (C#) to prototype. We also used Python to implement a simple WebSocket server that communicates bidirectionally with the VR application that has our plugin imported.
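
As a rough illustration of the plugin side of that bidirectional link, the sketch below uses .NET's built-in ClientWebSocket to send annotation payloads to the server and listen for messages coming back. The endpoint URL and the JSON message schema here are assumptions for illustration, not our exact protocol:

```csharp
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using UnityEngine;

// Hypothetical sketch of the Unity-side client that talks to the Python
// WebSocket server.
public class FeedbackSocketClient : MonoBehaviour
{
    // Assumed endpoint for illustration; not our actual server address.
    [SerializeField] private string serverUrl = "ws://localhost:8765";

    private ClientWebSocket socket;

    private async void Start()
    {
        socket = new ClientWebSocket();
        await socket.ConnectAsync(new Uri(serverUrl), CancellationToken.None);
        _ = ReceiveLoop(); // listen for messages pushed from the dashboard/server
    }

    // Send one annotation as a JSON payload (the schema is an assumption).
    public async Task SendAnnotation(string note, Vector3 position)
    {
        string json = JsonUtility.ToJson(new AnnotationMessage
        {
            note = note,
            x = position.x, y = position.y, z = position.z,
            timestamp = DateTime.UtcNow.ToString("o")
        });
        var bytes = new ArraySegment<byte>(Encoding.UTF8.GetBytes(json));
        await socket.SendAsync(bytes, WebSocketMessageType.Text, true,
                               CancellationToken.None);
    }

    // Minimal receive loop so the server can talk back to the headset.
    private async Task ReceiveLoop()
    {
        var buffer = new byte[4096];
        while (socket.State == WebSocketState.Open)
        {
            WebSocketReceiveResult result = await socket.ReceiveAsync(
                new ArraySegment<byte>(buffer), CancellationToken.None);
            if (result.MessageType == WebSocketMessageType.Close) break;
            Debug.Log("Server: " + Encoding.UTF8.GetString(buffer, 0, result.Count));
        }
    }

    [Serializable]
    private class AnnotationMessage
    {
        public string note;
        public float x, y, z;
        public string timestamp;
    }
}
```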

Challenges

We ran into a number of issues, as any hackathon project does. In particular, we had trouble networking between the applications and getting voice recognition to work in the noisy surroundings of the hackathon space.

We also pivoted our idea many times, from a tool that tackles accessibility in existing VR projects to a way for development teams to interact directly with the accessibility community. Each of these ideas gradually led us to the final one.

Accomplishments

At the end of the hackathon, we are proud to have engaged with many members of the accessibility and development communities. We also ended up with a working integration of the multiple systems, from the VR application running on an Oculus Quest to the dashboard and server running on a separate computer.

What We Learned

We learned many new technologies, especially around accessibility. This includes supporting multiple input sources (i.e., a keyboard source, a voice input source, and voice decision functions).
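
Conceptually, the pattern we converged on looks something like the interface below. This is a simplified sketch of the idea rather than our exact code: each input layer delivers finished note text through a common contract, so the annotation flow doesn't need to care whether it came from a keyboard or from voice.

```csharp
using System;

// Simplified sketch of a shared contract for feedback input sources
// (keyboard, voice, etc.).
public interface IFeedbackInputSource
{
    // Raised when the source has a finished piece of feedback text.
    event Action<string> OnNoteEntered;

    void StartListening();
    void StopListening();
}
```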

What's Next

We want to add more accessibility layers to our tool, including haptic technology and support for different types of hearing impairments. Overall, we are interested in providing new input sources for users who cannot type on a keyboard or cannot use voice input.
