Inspiration

We wanted to build a world where everyone can experience sound, even those who cannot hear. Many deaf or hard-of-hearing people miss critical warnings, a simple knock at the door, or a baby crying. As computer science students, we built Feel-IT: a smart, wearable device that turns specific, important sounds into labeled vibrations, letting the wearer feel and understand their surroundings.

What it does

Feel-IT is a two-part system: an iPhone app "brain" and an ESP32 wristband "feeler."

  • The iPhone app uses Apple's SoundAnalysis machine learning framework to listen for and classify over 100 sounds in real time.
  • The user opens the app and selects only the sounds they care about (e.g., "Baby Crying," "Fire Alarm," "Doorbell") from a pre-defined list.
  • When the app detects a selected sound with greater than 50% confidence, it sends the human-readable name plus the confidence (e.g., "Fire Alarm:99") via Bluetooth Low Energy.
  • The ESP32 band receives the text, displays it on the screen, and triggers a sharp, distinct "Triple Click" vibration using a professional DRV2605L haptic driver. It intelligently ignores all unselected sounds to prevent alert fatigue.
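The "Fire Alarm:99" message format above implies a simple label-plus-confidence payload. A minimal sketch of how the wristband side could parse it, written as portable C++ (the function name `parseAlert` and the -1 sentinel are ours, not from the actual firmware):

```cpp
#include <string>
#include <utility>

// Split a BLE payload like "Fire Alarm:99" into a label and a confidence
// percentage. Returns confidence -1 when no ':' separator is present.
std::pair<std::string, int> parseAlert(const std::string& payload) {
    std::size_t sep = payload.rfind(':');
    if (sep == std::string::npos) {
        return {payload, -1};  // no confidence attached
    }
    std::string label = payload.substr(0, sep);
    int confidence = std::stoi(payload.substr(sep + 1));
    return {label, confidence};
}
```

Using `rfind` rather than `find` keeps labels that themselves contain a colon intact, since only the last field is numeric.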

Feel-IT System Diagram

How we built it

  • iOS App (Swift): We used AVAudioEngine for audio capture, SoundAnalysis for ML classification, SwiftUI for the sound selection UI, and CoreBluetooth to manage the BLE client.
  • ESP32 Wearable (C++): We used a LILYGO T-Display for the screen and CPU. We used the Adafruit_DRV2605 library to control the haptic driver over I2C. The C++ code runs a BLEDevice server that listens for incoming string data from the app.

Challenges we ran into

This was our first time working with advanced BLE and system-level audio services.

  • XPC Crash: Apple's SoundAnalysis service repeatedly crashed when the keyboard appeared! We fixed this by building a resilient SoundBridge that gracefully handles AVAudioSession interruptions.
  • Silent BLE Failure: Data wouldn't send, with no error reported. We learned that, in our setup, BLE writes had to run on the main thread; the fix was a single, crucial line: DispatchQueue.main.async.
  • C++ Data Corruption: The ESP32 received garbage text. The culprit was a dangling pointer: ch->getValue() returns a temporary, so the pointer from c_str() went stale before we used it. We fixed it by copying the data into a stable Arduino String before processing.
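The data-corruption fix boils down to taking an owned copy of the characteristic's value before the temporary it came from is destroyed. A portable C++ sketch of the pattern (the `FakeCharacteristic` stub is ours; on the real ESP32 the copy went into an Arduino String instead of std::string):

```cpp
#include <string>

// Stub standing in for the BLE library's characteristic: getValue()
// returns a temporary std::string, so a raw pointer taken from it
// (getValue().c_str()) dangles as soon as the full expression ends.
struct FakeCharacteristic {
    std::string stored;
    std::string getValue() const { return stored; }
};

// SAFE pattern: copy the value into an owned string first, then process.
std::string readAlert(const FakeCharacteristic& ch) {
    std::string copy = ch.getValue();  // owned copy in stable storage
    return copy;                       // safe to use after this call
}
```

The buggy version (`const char* p = ch.getValue().c_str();`) compiles cleanly, which is why the corruption only showed up at runtime in the serial logs.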

Accomplishments that we're proud of

  • We built a fully functional, end-to-end accessibility device that connects an iPhone app to custom hardware.
  • We successfully debugged and fixed complex, low-level bugs in C++, Swift, and BLE communication.
  • We created an intelligent filter (the SoundStore) that prevents "alert fatigue" and makes the device genuinely useful.
  • We integrated a professional haptic driver (DRV2605L) instead of just a simple buzzer.
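The alert-fatigue filter described above amounts to a set-membership check combined with the confidence threshold. A minimal sketch in C++ (the real SoundStore is Swift; `SoundFilter` and its method names here are illustrative):

```cpp
#include <set>
#include <string>

// Stand-in for the app's selection filter: an alert is forwarded to the
// wristband only when the user explicitly enabled that sound AND the
// classifier's confidence clears the 50% threshold.
class SoundFilter {
    std::set<std::string> selected_;
public:
    void select(const std::string& sound) { selected_.insert(sound); }
    bool shouldAlert(const std::string& sound, int confidencePct) const {
        return confidencePct > 50 && selected_.count(sound) > 0;
    }
};
```

Filtering on the phone, before anything goes over BLE, also keeps the radio quiet for the sounds the user never asked about.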

What we learned

  • Multithreading is critical: BLE operations must be on the main thread, while audio analysis must be on a background thread.
  • State management is key: We learned to manage the ESP32's state (Ready vs. Listening) by sending commands from the iPhone.
  • How to debug hardware: We diagnosed C++ type-casting errors, pointer bugs, and library-specific issues by reading serial logs and isolating problems.
  • User Experience > Technology: Our best feature isn't detecting sounds; it's ignoring the ones the user doesn't care about.
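The Ready-vs-Listening state management mentioned above can be sketched as a two-state transition function. The command strings "START" and "STOP" below are illustrative, not the actual protocol:

```cpp
#include <string>

// Sketch of the wristband's two-state model: commands sent from the
// iPhone move it between Ready (idle) and Listening (alerts forwarded).
enum class BandState { Ready, Listening };

BandState apply(BandState s, const std::string& cmd) {
    if (cmd == "START") return BandState::Listening;
    if (cmd == "STOP")  return BandState::Ready;
    return s;  // unknown commands leave the state unchanged
}
```

Keeping the transition logic this small made it easy to reason about what the band should do when a message arrived mid-alert.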

What's next for Feel-IT

Our next step is to build a "Teach Mode" for custom sounds (like a specific microwave beep) and design a 3D-printed case to make it a true, everyday wearable.
