Inspiration
We were inspired to create Gesture by a deep desire to make the world a more inclusive place. We believe that everyone deserves to be heard and understood, regardless of the way they communicate. It's painful to watch someone struggle to communicate and be unable to help them. We knew that we had to do something to change that.
What it does
Gesture takes an image as input and classifies the user's hand gesture as a letter of the ASL alphabet. This simple technology is profound in its reach: it can help bridge the communication gap for people who are mute, deaf, or hard of hearing. Our goal for the future is an app that automatically strings recognized letters into full sentences, making communication even easier.
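The classification step can be sketched as follows. Note that this is an illustrative sketch, not our actual Roboflow pipeline: the score vector and letter ordering are assumptions. ASL fingerspelling has 26 letters, but J and Z involve motion, so a single-image classifier typically covers the 24 static letters.

```python
import string

# The 24 static ASL fingerspelling letters (J and Z require motion).
STATIC_LETTERS = [c for c in string.ascii_uppercase if c not in ("J", "Z")]

def predict_letter(scores):
    """Return the ASL letter with the highest classifier score.

    `scores` is a list of 24 confidence values, one per static letter,
    as a trained image classifier would emit for a hand photo.
    """
    best = max(range(len(scores)), key=lambda i: scores[i])
    return STATIC_LETTERS[best]

# Example: a score vector peaking at index 0 maps to the letter "A".
scores = [0.0] * 24
scores[0] = 0.9
print(predict_letter(scores))  # → A
```

A full app would feed each camera frame through the trained model to produce the score vector, then display the predicted letter to the user.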
How we built it
Building Gesture wasn't easy: it required long hours and a lot of hard work. We used Roboflow and Python to build our machine learning model, training it on a dataset of nearly 500 images that we photographed and labeled by hand, a painstaking and time-consuming process. We also had to learn Swift and new machine-learning training techniques to make our project a reality.
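Before training, a small hand-labeled dataset like ours needs to be split into training and validation sets. The sketch below shows one simple way to do that; the file names, the 80/20 ratio, and the seed are illustrative assumptions, not our exact setup.

```python
import random

def split_dataset(filenames, train_frac=0.8, seed=42):
    """Shuffle labeled image files and split them into train/validation.

    A fixed seed makes the split reproducible across runs, which matters
    when a dataset is small (~500 images) and results vary with the split.
    """
    files = list(filenames)
    random.Random(seed).shuffle(files)
    cut = int(len(files) * train_frac)
    return files[:cut], files[cut:]

# Hypothetical file names encoding the labeled letter for each photo.
images = [f"asl_{letter}_{i}.jpg" for letter in "ABC" for i in range(10)]
train, val = split_dataset(images)
print(len(train), len(val))  # → 24 6
```

Tools like Roboflow can manage this split for you after upload, but doing it explicitly in code keeps the held-out images fixed when retraining, which helped us compare models fairly after we had to start over.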
Challenges we ran into
Our team faced many challenges during the hackathon, some of which nearly derailed our project. Our initial teammates left, leaving just two of us, each with only a few months of programming experience, to complete the project on our own. We also had to start over after training on poor hand-gesture data, a devastating setback. Our biggest challenge was being unable to add sentence-building functionality to the app before the deadline, despite working for almost 18 hours straight. We were heartbroken, but we knew that this was only the beginning.
Accomplishments that we're proud of
Despite the challenges we faced, we're incredibly proud of what we've accomplished. We created a working model that can identify ASL hand signals, and we built something with a significant use case that can help those who can't speak be better understood. The feeling of accomplishment and pride we felt when we saw our app successfully identify a hand gesture was indescribable. It was a validation of all the hard work and dedication we put into this project.
What we learned
The last 24 hours taught us a lot about ourselves and what we're capable of achieving. We learned Swift and new machine-learning training skills, and we learned the importance of perseverance and dedication. We also learned the value of teamwork and the power of a common goal. We realized that we can accomplish anything we set our minds to, even when it seems impossible.
What's next for Gesture
Our next steps for Gesture include turning our algorithm into a mobile iOS application, seeking funding for our efforts, and expanding our machine-learning model beyond single letters to full ASL words, live-video functionality, and more. We're determined to make Gesture the best it can be, and we know that this is only the beginning of what's possible.
At Gesture, we believe that every voice deserves to be heard. We hope that our project inspires others to build technology that can make the world a more inclusive and understanding place. Together, we can change lives and create a brighter future - one sign at a time.