1. Motivation & Inspiration

Music is often seen as an art form accessible to everyone, yet much of it is designed for those who can hear. Surprisingly to some, many deaf people do enjoy music: they feel the rhythm through vibrations, experience lyrics as a form of poetry, and develop their own ways of interpreting melody. There is also a practice known as "song signing," where songs are translated into sign language through expressive movements. However, the music industry largely overlooks the deaf community, producing music predominantly for hearing audiences, and the accessibility efforts that do exist are sometimes criticized by the deaf community for misrepresenting or oversimplifying the deaf experience, losing the true essence and nuances of the music.

To truly make music inclusive for deaf people, we believe it is essential that they are included in the production process from the start. This inspired us to create a platform where deaf and hearing people can collaborate in music creation. Our goal is to make music composition a shared experience, where deaf creators are not just included but are integral contributors, shaping the music beyond traditional boundaries.

2. What It Does

Our platform is a collaborative music composition tool that allows deaf and hearing people to create music together using XR (Extended Reality). It enables users to visualize, feel, and shape music in a shared space that transcends traditional musical notes.

When a traditional producer creates a melody on a piano, our system visualizes it through spatial representations paired with corresponding vibrations, letting users feel the music in an abstract form that resonates with their experience. After the initial composition, deaf and hearing users collaborate on modifying the music by adjusting the visual shapes and experiencing the resulting changes through haptic feedback.

Hearing producers can experience how deaf listeners perceive the music through vibrations, while deaf producers use gestural interactions to convey their ideas and modify the music by altering the visuals. For example, opening their arms over a section they want to make more intense increases its intensity accordingly. This helps deaf users, who may find it difficult to understand or express music through traditional notation, engage with the music intuitively. By giving both groups an interface for expressing their mental models through abstract feelings, our platform creates a new, shared language for music composition.
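As a rough illustration of the note-to-visual-and-haptic mapping described above, here is a minimal Python sketch. The `Note` and `VisualElement` classes and every mapping constant are our own illustrative assumptions, not the platform's actual code:

```python
from dataclasses import dataclass

# Hypothetical types: the real system works on live piano input,
# but a MIDI-like note is a convenient stand-in for this sketch.

@dataclass
class Note:
    pitch: int      # MIDI-style pitch, 0-127
    velocity: int   # how hard the key was struck, 0-127
    onset: float    # seconds from the start of the sequence

@dataclass
class VisualElement:
    height: float      # spatial height encodes pitch
    size: float        # size encodes intensity
    haptic_amp: float  # vibration amplitude sent to the haptic device, 0-1

def note_to_visual(note: Note, max_height: float = 2.0) -> VisualElement:
    """Map one played note to a spatial element plus a vibration level."""
    return VisualElement(
        height=max_height * note.pitch / 127,
        size=0.05 + 0.25 * note.velocity / 127,  # floor so soft notes stay visible
        haptic_amp=note.velocity / 127,
    )
```

The key design choice this sketch reflects is that the same source value (velocity) drives both the visual size and the vibration amplitude, so what a hearing user sees and what a deaf user feels stay in sync.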

3. How We Built It

We used XR technology to create an interactive and immersive space where both deaf and hearing creators can engage with music in their own ways. Here is how we implemented the system:

  • Hand Tracking: We used an interaction SDK to track hand movements, allowing us to capture the interaction with piano tiles and gestural controls.

  • Visual Representations: Musical notes are converted into visual elements that can be manipulated spatially, providing an abstract representation of musical qualities like intensity and rhythm.

  • Haptic Feedback: We used a haptic SDK to simulate the vibrations created through the music sequence, allowing deaf users to experience the music through tactile sensations.

  • Gestural Interactions: We incorporated complex gestures that modify the visual representations, which are then mapped back to alter the musical elements. For instance, users can stretch a visual element to enlarge it, which increases the intensity of the corresponding sound, facilitating intuitive adjustments.
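The gesture-to-music round trip in the last bullet could be sketched as below. The function names, the resting hand span, and the clamping range are hypothetical choices for illustration, not our SDK's API:

```python
# Hypothetical sketch of the "stretch to intensify" gesture: the distance
# between the two tracked hands scales the selected visual element, and the
# element's new size is then mapped back onto the sound's gain.

def hand_span_to_scale(left_x: float, right_x: float,
                       rest_span: float = 0.4) -> float:
    """Hands wider apart than the resting span scale the element up,
    hands closer together scale it down; clamped to a safe range."""
    span = abs(right_x - left_x)
    return max(0.25, min(4.0, span / rest_span))

def size_to_gain(size: float, base_size: float = 0.1) -> float:
    """Map a visual element's size back to an audio gain (1.0 = unchanged),
    so visual edits and musical edits remain two views of the same value."""
    return size / base_size
```

For example, hands 0.8 m apart against a 0.4 m resting span would double the element's scale, and doubling its size would double the gain.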

4. Our Initial Focus & First Achievements

Since we don't fully understand the experience of deaf users, we decided to start our prototype with the simplest but most fundamental feature: intensity. Our first version of the prototype focused on enabling users to communicate intensity within a music sequence. By visualizing intensity through the size and timing (proximity) of spheres, we made it possible for deaf users to convey how they feel the music should evolve. This approach allowed us to lay the foundation for a shared language between deaf and hearing creators. We're proud of how this initial version successfully facilitated communication about music in a way that feels natural to both groups.
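A minimal sketch of this sphere mapping, under assumed names and scaling constants (the real prototype lays the spheres out in XR space, not as a list of dictionaries):

```python
# Each note becomes a sphere whose radius encodes intensity and whose
# position along one axis encodes when it sounds, so closely spaced
# spheres read as a denser, more urgent passage.

def sequence_to_spheres(notes, speed=0.5, min_r=0.02, max_r=0.2):
    """notes: list of (onset_seconds, intensity) pairs, intensity in 0..1.
    Returns one sphere per note with a position and a radius."""
    spheres = []
    for onset, intensity in notes:
        spheres.append({
            "x": onset * speed,                             # timing -> proximity
            "radius": min_r + (max_r - min_r) * intensity,  # intensity -> size
        })
    return spheres
```

Enlarging a sphere and re-reading its radius through the inverse of this mapping is how an edit made visually by a deaf user becomes an intensity change a hearing user can hear.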

5. Challenges We Ran Into

One of the biggest challenges we faced was understanding the different ways deaf people experience and imagine music. As we are not deaf users ourselves, we had to imagine these experiences, which added another layer of complexity. Every individual's experience is unique, making it difficult to create a universal interface that works for everyone. Additionally, since our platform integrates gestural interactions and abstract visualizations, we had to create our own design standards, as there were no existing guidelines for this type of interaction. Finding the right balance between simplicity and expressiveness was a constant challenge.

6. What We Learned

Through building this prototype, we learned that music is more than just sound—it's an experience that can be felt, seen, and shared in many forms. The process taught us the importance of involving users from the community we are designing for. We gained valuable insights into how abstract concepts like rhythm and intensity can be communicated in ways that go beyond auditory experiences, and how important it is to create a shared language that resonates with both deaf and hearing people.

7. Possible Future Plans

Moving forward, we may conduct interviews or studies to gather insights directly from the deaf community to ensure that our tool aligns with their preferences and needs. We also see the potential for this platform to be scaled beyond just deaf and hearing users. Our vision is to create an inclusive music composition tool that can be used by novices without a background in music, allowing them to explore and create music through intuitive and abstract concepts. We believe that, with more development and user feedback, our platform can become a powerful tool for creative expression accessible to everyone.
