Inspiration

Our team spent a significant amount of time exploring project ideas that would meaningfully support an underserved community. We began by listing shared interests such as sports, gardening, and music, and then examined the challenges different communities might face when engaging with those activities.

We noticed that many music and accessibility tools focus heavily on captions or visualizers, but fewer focus on haptics as the primary experience. We wanted to create something centered on the Deaf and hard-of-hearing community, not as an afterthought, but as the core user.

This idea became VYBE: an app that transforms live audio into meaningful vibration patterns so Deaf users can feel the beat, tempo, and intensity of music in real time.

What it does

VYBE listens to audio through an iPhone and converts sound into real-time vibration patterns.

Bass --> stronger, deeper vibrations

Tempo --> vibration pulse speed

Volume Changes --> vibration intensity shifts

VYBE analyzes the captured sound and converts it into meaningful haptic feedback based on the song's rhythm and tempo, allowing Deaf users to experience rhythm, beat, and energy without hearing a single note.
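As an illustration of the volume-to-intensity mapping described above, here is a minimal sketch. The function name, the decibel scale, and the -60 dB silence floor are our assumptions for illustration, not VYBE's exact implementation.

```swift
import Foundation

/// Maps a measured audio power level in decibels (roughly -60 dB for
/// near-silence up to 0 dB for full scale) to a normalized haptic
/// intensity in 0...1. The -60 dB floor is an assumed tuning value.
func hapticIntensity(fromDecibels db: Float, floor: Float = -60) -> Float {
    guard db > floor else { return 0 }          // below the floor: no vibration
    return min(1, (db - floor) / -floor)        // linear ramp from floor to 0 dB
}
```

A quieter passage at -30 dB would therefore drive the haptic motor at about half strength, while a full-scale beat drives it at maximum.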

How we built it

We built VYBE as an iOS app using Swift and SwiftUI, with a focus on real-time performance and accessibility. Throughout development, we iterated frequently with guidance, refining both the audio input pipeline and the haptic feedback system.

Core Technologies:

SwiftUI --> user interface and app structure

AVFoundation --> capturing live microphone audio and monitoring audio levels

Core Haptics --> generating vibration feedback

Audio-to-Vibration Pipeline

At a high level, the app works as follows:

  1. Capture live audio from the device microphone
  2. Measure the real-time audio level (amplitude)
  3. Map changes in audio intensity to haptic strength
  4. Trigger vibration feedback continuously on the device
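The four steps above can be sketched roughly as follows. The `HapticDriver` type and `rmsLevel` helper are hypothetical names chosen for illustration; a real app would also configure an `AVAudioSession`, request microphone permission, and handle haptic engine resets.

```swift
import Foundation
#if canImport(AVFoundation) && canImport(CoreHaptics)
import AVFoundation
import CoreHaptics

/// Sketch of the capture-to-vibration loop (illustrative, not VYBE's actual code).
final class HapticDriver {
    private let audioEngine = AVAudioEngine()
    private var hapticEngine: CHHapticEngine?
    private var player: CHHapticAdvancedPatternPlayer?

    func start() throws {
        // 1. Prepare a continuous haptic event we can modulate in real time.
        //    Core Haptics caps continuous events at 30 s; a real app would loop it.
        hapticEngine = try CHHapticEngine()
        try hapticEngine?.start()
        let event = CHHapticEvent(eventType: .hapticContinuous,
                                  parameters: [],
                                  relativeTime: 0,
                                  duration: 30)
        let pattern = try CHHapticPattern(events: [event], parameters: [])
        player = try hapticEngine?.makeAdvancedPlayer(with: pattern)
        try player?.start(atTime: 0)

        // 2. Tap the microphone and measure the amplitude of each buffer.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { [weak self] buffer, _ in
            guard let data = buffer.floatChannelData?[0] else { return }
            let samples = Array(UnsafeBufferPointer(start: data, count: Int(buffer.frameLength)))
            let level = rmsLevel(samples)

            // 3. Map the audio level to haptic strength and push it to the player.
            let intensity = CHHapticDynamicParameter(parameterID: .hapticIntensityControl,
                                                     value: level,
                                                     relativeTime: 0)
            try? self?.player?.sendParameters([intensity], atTime: 0)
        }

        // 4. Once the engine runs, vibration continuously tracks the live audio.
        try audioEngine.start()
    }
}
#endif

/// Root-mean-square level of one audio buffer: 0 for silence, ~1 for full scale.
func rmsLevel(_ samples: [Float]) -> Float {
    guard !samples.isEmpty else { return 0 }
    return (samples.reduce(0) { $0 + $1 * $1 } / Float(samples.count)).squareRoot()
}
```

Using RMS per buffer smooths out single-sample spikes, so the vibration follows the perceived loudness of the music rather than momentary noise.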

Challenges we ran into

We faced a few challenges when building VYBE.

One of the main challenges was incorporating haptic feedback using Core Haptics. Implementing this framework was difficult because it required understanding how advanced vibration patterns work on iOS and ensuring that the haptics responded consistently to live audio input.

Another challenge involved testing. Haptics cannot be tested in the iOS Simulator, so we had to pair a physical iPhone with Xcode in order to properly test and debug vibration behavior.

Accomplishments that we're proud of

We are proud of building a working app that successfully converts audio input into real-time vibration feedback, but more importantly, we are proud that accessibility was at the core of our project from the very beginning rather than something added later. This project pushed us to think deeply about inclusive design and the responsibility that comes with creating technology for underserved communities.

Throughout the process, we learned new technical skills while also learning from each other through research, experimentation, and collaboration. These shared experiences shaped how we approached problem solving and accessibility, and they are lessons that will continue to influence the way we build technology moving forward.

What we learned

Through this project, we learned the importance of accessibility-first design and how haptic feedback can serve as a powerful alternative to sound. We also gained hands-on experience with real-time audio input, Core Haptics, and iOS development, all of which were new to most of us.

What's next for VYBE

We plan to expand VYBE by improving vibration accuracy, adding customization options for different music styles, and further exploring Apple Watch integration. Our goal is to continue refining VYBE into a more immersive and accessible way to experience music.

Built With

Swift, SwiftUI, AVFoundation, Core Haptics