Inspiration
Imagine standing in a crowd at a concert. The bass is in your chest, the crowd is electric, and you are completely immersed. Now imagine you cannot feel any of that. The music plays, everyone around you moves, but the sensation is inaccessible to you.
By 2050, nearly 2.5 billion people are projected to have some degree of hearing loss, and more than 700 million will require hearing rehabilitation. Existing assistive solutions, such as vibrating vests and other specialist hardware, are expensive, bulky, and isolating. We wanted to ask a different question: what if you could feel sound through something already in your pocket?
What it does
Haptikos is a real-time haptic audio experience delivered through any iPhone. Users scan a QR code and their phone instantly joins a live audio session. As music plays, the browser analyses the sound and translates distinct frequency bands into unique haptic patterns on the device:
- Rumble - deep, sustained sub-bass (20-80Hz), like a kick drum in your chest
- Tap - punchy bass hits (80-250Hz)
- Pulse - rhythmic mid-range texture (250Hz-2kHz)
- Snap - crisp high-mid transients (2-8kHz), like a snare crack
A bass guitar feels different from a hi-hat. Music has texture, not just vibration.
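As a rough sketch, the band-to-recipe mapping above could look like this (the function and type names are ours, not from the project's code):

```typescript
// Hypothetical sketch of the frequency-band-to-recipe mapping.
type Recipe = "Rumble" | "Tap" | "Pulse" | "Snap";

function recipeForFrequency(hz: number): Recipe | null {
  if (hz >= 20 && hz < 80) return "Rumble";    // deep, sustained sub-bass
  if (hz >= 80 && hz < 250) return "Tap";      // punchy bass hits
  if (hz >= 250 && hz < 2000) return "Pulse";  // rhythmic mid-range texture
  if (hz >= 2000 && hz <= 8000) return "Snap"; // crisp high-mid transients
  return null; // outside the mapped ranges
}
```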
How we built it
- Audio analysis. The browser's WebAudio API runs a Fast Fourier Transform (FFT) at 60 frames per second, splitting incoming audio into five distinct frequency bands. When an energy spike is detected in a band, a HapticEvent object is generated.
- Real-time transport. Events are relayed via Socket.IO to a Node.js server, then forwarded to connected iPhones. End-to-end latency sits under 50ms, the gold standard at which haptic feedback feels synchronous with the music rather than like an echo.
- Haptic rendering. On the iPhone we built a custom Core Haptics engine. The four haptic recipes (Rumble, Tap, Pulse, Snap) are tuned to map each frequency range to a distinct tactile sensation, preserving the character of the original sound.
- Dashboard. An audio visualiser displays concentric rings representing sub-bass, mids, and highs in real time, so the experience is both felt and seen.
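The audio-analysis step can be sketched as a pure function over AnalyserNode-style magnitude bins, where each bin covers sampleRate / fftSize Hz. This is our illustration, not the team's code, and the fifth band's range (above 8kHz) is an assumption:

```typescript
// Band boundaries in Hz; the "highs" range above 8kHz is assumed.
const BANDS: [string, number, number][] = [
  ["sub-bass", 20, 80],
  ["bass", 80, 250],
  ["mids", 250, 2000],
  ["high-mids", 2000, 8000],
  ["highs", 8000, 20000],
];

// Average magnitude (0-255) per band, from a getByteFrequencyData() buffer.
function bandEnergies(
  mags: Uint8Array,
  sampleRate: number,
  fftSize: number
): Map<string, number> {
  const binHz = sampleRate / fftSize; // Hz covered by each FFT bin
  const out = new Map<string, number>();
  for (const [name, lo, hi] of BANDS) {
    const first = Math.max(0, Math.floor(lo / binHz));
    const last = Math.min(mags.length - 1, Math.ceil(hi / binHz));
    let sum = 0;
    for (let i = first; i <= last; i++) sum += mags[i];
    out.set(name, sum / (last - first + 1)); // mean magnitude in the band
  }
  return out;
}
```

A spike in any band's energy would then trigger a HapticEvent for that band's recipe.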
Challenges we ran into
- Latency. Keeping the pipeline under 50ms required careful optimisation of the FFT window size, Socket.IO event batching, and Core Haptics scheduling on the iOS side.
- Haptic vocabulary. Designing recipes that felt meaningful, not just generic buzzes, required many tuning iterations. The difference between a bass guitar and a snare must be immediately perceptible.
- Network reliability. Public venue Wi-Fi is unpredictable. We built a manual IP-entry fallback so the demo would survive a QR scan failure.
- Frequency separation. Muddy or compressed audio makes band isolation difficult. We tuned our energy-spike detection thresholds to be robust across different music genres.
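One way to make spike thresholds robust across genres, sketched here as an assumption rather than the team's actual tuning, is to compare each band's energy to a running baseline instead of a fixed absolute level:

```typescript
// Hypothetical adaptive spike detector: fires when a band's energy
// exceeds its own running baseline by a fixed ratio, so quiet acoustic
// tracks and heavily compressed mixes use the same relative threshold.
class SpikeDetector {
  private baseline = 0;

  constructor(private ratio = 1.5, private smoothing = 0.9) {}

  update(energy: number): boolean {
    if (this.baseline === 0) {
      this.baseline = energy; // warm-up: seed baseline, never spike
      return false;
    }
    const isSpike = energy > this.baseline * this.ratio;
    // Exponential moving average tracks the mix's overall loudness.
    this.baseline = this.smoothing * this.baseline + (1 - this.smoothing) * energy;
    return isSpike;
  }
}
```

With one detector per frequency band, a sudden kick drum registers as a spike even in a loud mix, while a steady bassline does not.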
Accomplishments that we're proud of
- A fully working, end-to-end demo (real music, real phone, real haptics) built in a single hackathon.
- Sub-50ms latency achieved without native code on the server side.
- Four distinct, perceptually meaningful haptic recipes that preserve the character of sound, not just its presence.
- An accessible entry point: any iPhone, any audio source, no specialist hardware required.
What we learned
We learned that accessibility and delight are not opposites. The same technology that helps someone with hearing loss feel a fire alarm also makes a concert feel richer for everyone. Designing for the edges of human experience often produces the most universally valuable tools. We also deepened our understanding of the WebAudio API, real-time socket architectures, and the surprisingly nuanced art of haptic design, where timing, intensity, and waveform shape combine to produce sensation rather than just notification.
What's next for Haptikos
- Semantic classification. Teaching the app to distinguish music from safety-critical sounds (fire alarms, doorbells, sirens), providing safety alongside soul.
- Apple Watch integration. Extending haptics to the wrist for a more seamless, hands-free experience.
- Spotify connection. Direct integration so any stream triggers haptics automatically, without routing audio through a browser.
- Broader audio source support. Unlike Apple's built-in features (limited to Apple Music), Haptikos targets any audio source.
- Polished iOS application. Moving from hackathon prototype to a published app available to the deaf and hard-of-hearing community worldwide.
Built With
- avfoundation
- corehaptics
- express5
- fft
- javascript
- node.js
- socket.io
- swift
- swiftui
- typescript
- webaudioapi