Inspiration
What if there were a way to bridge the divide between nonverbal individuals with autism and neurotypical people? Between paralyzed patients and healthcare professionals?
This was the inspiration for DigiSpeak, a motion-sensing glove that assigns meaning to movement. Following similar ASL-detection projects like SignGlove, Sign Aloud, and other human-integrated designs, this project aims to empower nonverbal individuals to express their needs. We also sought a hackathon-friendly project that would combine our interests as aspiring electrical, computer, and biomedical engineers.
What it does
Our project reads the user's finger movements and displays their associated meaning in a window on a connected computer.
Our current word library consists of:
- Peace: bent thumb
- Gig’em: bent middle
- No: bent index
- Wave: side-to-side hand motion
For example, the user makes a thumbs-up, and the word "Gig'em" displays onscreen.
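The lookup described above can be sketched as a simple threshold classifier. The sensor names, threshold value, and frame format here are assumptions for illustration; the real thresholds would come from calibrating the glove's flex sensors.

```python
# Hypothetical analog reading above which a finger counts as bent.
BEND_THRESHOLD = 600

# Word library from the gesture list above (single-finger gestures only).
WORD_LIBRARY = {
    "thumb": "Peace",
    "middle": "Gig'em",
    "index": "No",
}

def classify(readings):
    """Map one frame of flex-sensor readings to a word, or None if no match."""
    bent = [finger for finger, value in readings.items() if value >= BEND_THRESHOLD]
    if len(bent) == 1:
        return WORD_LIBRARY.get(bent[0])
    return None

print(classify({"thumb": 720, "index": 310, "middle": 280}))  # Peace
```

The "Wave" gesture would need the gyroscope data as well, so it is omitted from this sketch.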
How we built it
Hardware
- Wove flex sensors into the glove and attached a gyroscope to the back of the hand
- Wired the sensors and gyroscope to an Arduino
- Fit the Arduino and breadboard into a custom 3D-printed case
Software
- Used C++ to read data from the sensors and log it to a CSV file
- Processed the data in Python to recognize finger flexion/extension and palm orientation
- Displayed the output onscreen using Tkinter
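The Python processing step above can be sketched as reading the logged CSV and summarizing the readings per finger. The column names and file layout are assumptions; the actual format depends on the Arduino logging code.

```python
import csv
import io

# Illustrative stand-in for the CSV file written by the Arduino sketch.
SAMPLE_LOG = """thumb,index,middle
512,300,295
730,310,280
715,305,290
"""

def max_per_finger(csv_text):
    """Return the peak analog reading seen for each finger column."""
    reader = csv.DictReader(io.StringIO(csv_text))
    peaks = {}
    for row in reader:
        for finger, value in row.items():
            peaks[finger] = max(peaks.get(finger, 0), int(value))
    return peaks

print(max_per_finger(SAMPLE_LOG))  # {'thumb': 730, 'index': 310, 'middle': 295}
```

In the real pipeline a reading like this would be fed to the gesture classifier and then to the Tkinter window.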
Challenges we ran into
We ran into several hardware issues early on. We first attempted to make our own flex sensors, but the DIY sensors weren't detecting flexion reliably. We also discovered our gyroscope was fried. Ordering new parts resolved both issues and let us start hardware assembly in under a week.
On the software side, we had trouble processing the raw data from the flex sensors. We ended up using Python to format and calibrate the incoming readings.
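One common way to calibrate raw flex readings, which this sketch assumes (the writeup doesn't specify the team's exact method), is to record each sensor's rest and fully-bent values once per session and scale every reading into a shared [0, 1] range. The numeric values here are illustrative, not measured.

```python
def calibrate(raw, rest, bent):
    """Normalize a raw reading to 0.0 (straight) .. 1.0 (fully bent)."""
    span = bent - rest
    if span == 0:
        return 0.0  # degenerate calibration; avoid division by zero
    scaled = (raw - rest) / span
    return min(1.0, max(0.0, scaled))  # clamp sensor noise outside the range

print(calibrate(raw=610, rest=400, bent=800))  # 0.525
```

Normalizing this way lets one bend threshold work for every sensor, even when the DIY sensors have different resistance ranges.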
Accomplishments that we're proud of
This being our first hackathon, we're proud to have come together as a team to create a fully working project. Specifically, we're proud of taking a simple motion, like bending a finger, and translating it into language.
What we learned
As a team, we learned more about Arduinos, C++, and circuitry. Specifically, we learned how to integrate microcontrollers, sensors, and software across multiple applications.
What's next for DigiSpeak
- Expand finger flexion/extension detection to include ASL signs
- Apply a machine-learning approach with a recurrent neural network to increase scope and accuracy
- Add a Bluetooth connection to a mobile app for users