abilityVR is an innovative initiative designed to transform orientation and mobility training for individuals who are blind or visually impaired. By utilizing the immersive capabilities of Virtual Reality, abilityVR recreates realistic environments enriched with spatial audio to provide users with a safe and controlled platform for practicing essential navigation skills. This approach addresses the limitations of traditional training methods, offering a scalable and accessible solution that empowers users to gain confidence and independence.

With a focus on user-centric design and advanced auditory-driven experiences, abilityVR serves individuals who have been blind from birth as well as those who lost their sight later in life. Beyond improving mobility skills, this project highlights the potential of VR to create effective, engaging, and inclusive assistive technologies.


Inspiration

The idea for abilityVR sparked during a seemingly ordinary moment. One of our team members observed a group of blind individuals crossing a busy road, guided by their mentor. Each person had their hand on the shoulder of the person ahead, relying entirely on touch and auditory cues to navigate safely. This scene made us wonder: "Can we use VR to enhance the mobility training experience for blind individuals?" The answer lay in leveraging Spatial Sound to create an innovative solution.

While traditional training methods for the blind are effective, they are constrained by high costs, limited accessibility, and an inability to replicate diverse real-world environments. These barriers can lead to reduced confidence, social isolation, and limited opportunities in education, employment, and recreation.

We believe that XR isn't just about creating stunning visuals—it's about designing experiences that enhance our surroundings. This project aims to empower blind individuals by providing them with immersive VR-based training that prepares them for real-world tasks in a safe, controlled environment before facing them directly.

To ensure the effectiveness of this solution, we visited the Blind People Association in our city. We learned about the challenges they face, the tools they use, and their daily routines. We also explored their experiential zone, Vision-in-the-Dark, a pitch-dark environment designed to simulate blindness for sighted individuals. This experience highlighted the critical role of sound in their navigation and interactions, inspiring us to prioritize auditory-driven solutions in abilityVR.

Case Study


What it does

abilityVR leverages Meta’s Spatial Audio SDK to create an immersive, auditory-driven experience for blind users, guiding them through various environments using sound cues. The project also integrates haptic feedback to indicate interactions with objects within the virtual space, enhancing the user's experience.

We have developed three key modules to facilitate learning and mobility training:

Environment Module

This module immerses users in a virtual forest environment, where they can experience the sounds of animals and birds placed throughout the scene. The user must follow these sounds to locate and identify different animals and birds, helping them enhance their auditory navigation skills.

Mobility Module

Focused on road safety training, this module simulates a bustling urban environment with sounds of traffic, restaurants, playgrounds, and pedestrians, mimicking a busy city street. The primary goal is to teach users how to safely cross the road at a zebra crossing, with auditory cues and haptic feedback guiding them through the process.

Experience Zone

In this interactive experience, users are introduced to a drum-playing simulation, where they can hear the spatial sound of drums and cymbals. The aim is to help users learn the rhythm and play along within the VR environment, enhancing their auditory coordination.

The app also includes an onboarding module to introduce users to the application’s features. Each module contains clear instructions to guide users through the experience, including how to interact with the spatial sound and haptic cues, ensuring a fully immersive and intuitive training session.


How we built it

abilityVR was developed using the following tools and technologies to create an immersive and interactive VR experience:

  • Unity: The core platform for developing the VR experience.

  • C#: The primary programming language used for scripting the logic and interactions within the application.

  • Blender: Used for 3D modeling and creating assets.

  • Blender Kit: Used to download character models and vehicle assets.

  • Meta Quest Audio SDK: The key tool for creating spatial audio, ensuring realistic sound positioning that guides users through their environment.

  • Meta Quest Interaction SDK: Used to enable user interaction within the VR environment, including controller-based navigation and interaction with objects.

  • Meta Haptic Studio: Used to create dynamic haptic feedback that corresponds with different sounds and interactions.
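To give a sense of how these pieces fit together, here is a minimal sketch of how a spatialized sound cue can be wired up in Unity. It uses only Unity's standard AudioSource API and assumes the Meta spatializer plugin has been selected in the project's audio settings; the class and field names below are illustrative, not taken from abilityVR's codebase:

```csharp
using UnityEngine;

// Sketch: attach a looping, fully spatialized sound to an object
// (e.g. a bird in the forest scene) so the user can locate it by ear.
[RequireComponent(typeof(AudioSource))]
public class SpatialSoundCue : MonoBehaviour
{
    [SerializeField] private AudioClip cueClip;            // e.g. a bird call
    [SerializeField] private float maxHearingDistance = 25f;

    private void Awake()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.clip = cueClip;
        source.loop = true;
        source.spatialBlend = 1f;       // fully 3D positioning, not 2D
        source.spatialize = true;       // hand off to the configured spatializer plugin
        source.rolloffMode = AudioRolloffMode.Logarithmic;
        source.maxDistance = maxHearingDistance;
        source.Play();
    }
}
```

A component like this would be attached to each animal or bird in the forest scene, with the audio listener following the user's headset so sounds pan and attenuate as the user turns and moves.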


Accomplishments that we're proud of

We are particularly proud of three key outcomes achieved during the development of abilityVR:

  1. Successfully integrating spatial audio and haptic feedback to create a highly immersive VR experience.
  2. Creating a VR experience that is fully accessible to individuals who are blind or visually impaired. This accomplishment is especially significant as it demonstrates how VR technology can be adapted to meet the unique needs of blind users, empowering them to engage with and learn in a virtual space.
  3. The most important aspect of this project is that it serves as a proof of concept, showcasing how Spatial Audio and haptic feedback can be leveraged to create meaningful and effective VR experiences for blind individuals. By integrating these technologies, we've demonstrated that immersive training systems can be developed to support the independence and mobility of blind users.

abilityVR aims to address the significant challenges blind individuals face in navigating and interacting with their environment. Although their auditory capabilities are the same as those of sighted individuals, relying on hearing and tactile cues alone, without structured training, can be insufficient for confident mobility and independence. By using VR technology, abilityVR creates immersive, accessible training experiences that simulate realistic environments where users can practice and develop essential mobility skills safely and consistently.


What we learned

  1. Spatial Audio Integration in Unity:
    We gained valuable experience integrating spatial audio within Unity using Meta’s Spatial Audio SDK. Understanding how to position and implement sound in 3D space was crucial in creating an immersive experience, and it significantly enhanced our ability to guide users through virtual environments using only auditory cues.

  2. Haptics Creation Using Meta Haptic Studio:
    We learned how to design and implement custom haptic feedback using Meta Haptic Studio. This technology allowed us to create dynamic tactile sensations that correspond to specific interactions within the VR environment, deepening the sense of immersion and improving user engagement.

  3. Collaboration and Adaptability During Rapid Development:
    Throughout the project, we learned the importance of collaboration and adaptability. Working in a team environment with diverse skill sets and ideas helped us overcome challenges and rapidly iterate on solutions. Being flexible and open to feedback ensured we could refine the project within tight development cycles.

  4. The Importance of Audio Cues in Daily Life:
    This project highlighted the significant role audio cues play in our daily lives, especially for individuals who are blind or visually impaired. It reinforced the idea that sound is not just an enhancement but a vital tool for navigating and interacting with the world, emphasizing the importance of considering auditory feedback in accessibility-focused designs.
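As an example of the second point, a clip authored in Meta Haptic Studio can be played back roughly as follows. This is a sketch based on Meta's Haptics SDK for Unity; the clip asset, tag name, and class name are our own placeholders:

```csharp
using Oculus.Haptics;
using UnityEngine;

// Sketch: play a Meta Haptic Studio clip on the right controller when
// the user's hand touches a virtual object. Names are illustrative.
public class TouchHaptics : MonoBehaviour
{
    [SerializeField] private HapticClip touchClip; // exported from Meta Haptic Studio
    private HapticClipPlayer player;

    private void Awake()
    {
        player = new HapticClipPlayer(touchClip);
    }

    private void OnTriggerEnter(Collider other)
    {
        // "PlayerHand" is a hypothetical tag on the user's hand/controller collider.
        if (other.CompareTag("PlayerHand"))
        {
            player.Play(Controller.Right); // tactile pulse matched to the interaction sound
        }
    }

    private void OnDestroy()
    {
        player.Dispose();
    }
}
```

Pairing each such haptic clip with the corresponding spatial sound is what lets touch and hearing reinforce each other during training.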


What's Next for abilityVR

In the current version of abilityVR, we have developed one experience for each module to demonstrate the power of spatial sound and haptic feedback. Moving forward, we plan to expand the project with additional experiences, including public transportation, a beach area, training with global and safety sounds, and games such as find the ball.


Special Note

This project is specifically developed to assist blind individuals by providing a training experience that prepares them for real-world situations. Since the focus is on blind users, the core component of this experience is spatial sound. We aim to leverage spatial audio to make VR experiences accessible to blind people, offering them a truly immersive training tool.

We are submitting this project under the Hobbies and Skillbuilding track. The primary goal of this track is to bring ‘regular’ people into the ecosystem, and we believe this project is an excellent way to integrate blind individuals into this space. By offering an accessible VR training experience, we aim to bridge the gap and empower blind users to confidently engage with technology.

While the experience does include some visual elements, sound is the true hero of this project. To fully appreciate the impact of this technology, we encourage everyone to try out the APK and experience it firsthand. When doing so, we kindly ask that you try our Blind Mode to better understand how spatial sound and haptic feedback drive navigation and interaction within the VR environment. This approach will give you a clearer understanding of how these sensory elements are crucial to making the experience both accessible and engaging for blind individuals.

Please find the following quick actions for easy testing:

  1. Blind Mode: Start button (☰) on the left controller.
  2. Skip Intro: B button on the right controller.
  3. Menu Selection: Long-press the Grip button (3 seconds).
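The quick actions above could be wired up with the OVRInput API from Meta's XR SDK roughly as follows. This is a minimal sketch, not abilityVR's actual code; the class name and the handler methods are placeholders, and only the button mappings come from the list above:

```csharp
using UnityEngine;

// Sketch of the quick-action bindings listed above, using OVRInput
// from the Meta XR Core SDK. Handler methods are placeholders.
public class QuickActions : MonoBehaviour
{
    private const float LongPressSeconds = 3f;
    private float gripHeldTime;

    private void Update()
    {
        // 1. Blind Mode: Start (☰) button on the left controller.
        if (OVRInput.GetDown(OVRInput.Button.Start))
            ToggleBlindMode();

        // 2. Skip Intro: B button on the right controller.
        if (OVRInput.GetDown(OVRInput.Button.Two, OVRInput.Controller.RTouch))
            SkipIntro();

        // 3. Menu Selection: hold the grip button for 3 seconds.
        if (OVRInput.Get(OVRInput.Button.PrimaryHandTrigger))
        {
            gripHeldTime += Time.deltaTime;
            if (gripHeldTime >= LongPressSeconds)
            {
                SelectMenuItem();
                gripHeldTime = 0f; // reset so the action fires once per long press
            }
        }
        else
        {
            gripHeldTime = 0f;
        }
    }

    private void ToggleBlindMode() { /* placeholder */ }
    private void SkipIntro()       { /* placeholder */ }
    private void SelectMenuItem()  { /* placeholder */ }
}
```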
