Inspiration

We were inspired by the communication challenges faced by people with speech or hearing impairments. We wanted to create a simple, affordable tool that translates basic hand gestures into readable text in real time.

What it does

LUMA is a wearable glove that detects finger bends and hand movement, recognizes the gesture, and sends it as text to our custom MIT App Inventor mobile app over Bluetooth.

How we built it

We built the gesture-detection hardware from flex sensors, an accelerometer, an Arduino, and a Bluetooth module. We programmed the gesture-matching logic in C++ and designed a clean, responsive app interface in MIT App Inventor.
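The core of the gesture-matching logic is comparing each flex-sensor reading against a calibrated bend threshold and mapping the resulting finger pattern to a label. A minimal sketch of that idea (the threshold, finger patterns, and gesture names here are illustrative, not LUMA's actual gesture set):

```cpp
#include <array>
#include <string>

// Hypothetical calibrated threshold: analog readings above this
// value mean the finger is considered "bent".
constexpr int BEND_THRESHOLD = 600;

// Map five flex-sensor readings (thumb..pinky) to a gesture label.
// The patterns below are example placeholders.
std::string matchGesture(const std::array<int, 5>& flex) {
    std::array<bool, 5> bent{};
    for (std::size_t i = 0; i < flex.size(); ++i)
        bent[i] = flex[i] > BEND_THRESHOLD;

    // Open hand: no fingers bent.
    if (!bent[0] && !bent[1] && !bent[2] && !bent[3] && !bent[4]) return "HELLO";
    // Fist: all fingers bent.
    if (bent[0] && bent[1] && bent[2] && bent[3] && bent[4]) return "YES";
    // Index and middle extended, rest bent.
    if (bent[0] && !bent[1] && !bent[2] && bent[3] && bent[4]) return "PEACE";
    return "UNKNOWN";
}
```

On the Arduino, the readings would come from `analogRead()` and the matched label would be written to the Bluetooth module's serial link for the app to display.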

Challenges we ran into

Calibrating sensor values for consistent gesture recognition was difficult. Bluetooth instability, noisy sensor readings, and synchronizing timing between the glove and app were also major challenges.
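A standard way to tame noisy analog sensor readings is a moving-average filter over the last few samples. A sketch of that approach (the window size is an assumed tuning parameter, not a value from our build):

```cpp
#include <cstddef>

// Moving-average filter to smooth noisy flex-sensor readings.
// WINDOW is an assumed tuning parameter; larger windows smooth
// more but respond to gestures more slowly.
template <std::size_t WINDOW = 8>
class Smoother {
public:
    // Feed one raw sample, get back the average of the samples seen
    // so far (up to the last WINDOW of them).
    int update(int sample) {
        sum_ += sample - buf_[idx_];   // replace oldest sample in the running sum
        buf_[idx_] = sample;
        idx_ = (idx_ + 1) % WINDOW;
        if (count_ < WINDOW) ++count_;
        return static_cast<int>(sum_ / static_cast<long>(count_));
    }

private:
    int buf_[WINDOW] = {0};
    std::size_t idx_ = 0;
    std::size_t count_ = 0;
    long sum_ = 0;
};
```

Running each flex sensor's readings through a filter like this before thresholding makes gesture recognition far less sensitive to single-sample spikes.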

Accomplishments that we're proud of

We built a fully working hardware + app system in limited time. LUMA reliably recognizes multiple gestures and displays them instantly on a smartphone, creating a smooth user experience.

What we learned

We learned about sensor calibration, gesture mapping, Bluetooth communication, and cross-platform hardware/software integration. We also came to appreciate the importance of clean UX design.

What's next for LUMA

We plan to add more gestures, integrate text-to-speech, train a machine learning model for dynamic sign recognition, and convert LUMA into a compact wireless smart glove.
