One of our colleagues has a hearing disability, and we often see him struggling to communicate, which led us to the idea of a sign language interpreter.
What it does
It recognizes sign language gestures and outputs them as text and audio.
How we built it
We built an Android application in Android Studio using Kotlin, and used Android's built-in text-to-speech engine to speak each message received from the Leap Motion sensor via Firebase.
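The core of the app is turning an incoming gesture label into text that can be displayed and spoken. The sketch below is a minimal illustration of that step, assuming the Leap Motion side writes recognized gesture labels (such as "hello" or "thank_you") to Firebase; the label names and phrase table are hypothetical, not our actual data.

```kotlin
// Hypothetical table mapping gesture labels (as written to Firebase by the
// Leap Motion side) to the phrases the app displays and speaks.
val phrases = mapOf(
    "hello" to "Hello!",
    "thank_you" to "Thank you.",
    "help" to "I need help."
)

// Convert an incoming gesture label into output text; unknown labels fall
// back to the raw label so no message is silently dropped.
fun phraseFor(gestureLabel: String): String =
    phrases[gestureLabel.lowercase()] ?: gestureLabel
```

In the real app, this lookup runs inside the Firebase listener callback, and the resulting string is passed to Android's `TextToSpeech.speak(...)` as well as shown on screen.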
Challenges we ran into
We had difficulties setting up the Firebase Realtime Database on both the Leap Motion sensor side and the Android side. It was a challenging part of the project, which we managed to overcome through research and help from mentors and the UNIHACK team.
Accomplishments that we're proud of
Being able to build a mobile app that receives messages from the Leap Motion sensor and says them out loud. The application was close to what we envisioned, and we are very proud of the final product.
What we learned
Firebase Realtime Database was something we all had to learn during this project. Working in a software development team was also more challenging than we expected: different members had different roles, and it was initially difficult for everyone to have something to do. We also learned to present our product in both a technical and a business sense so that it appeals on all fronts.
What's next for UNIHACK-19-Team30
We will continue to enhance our project and potentially take it to the next level by adding features such as speech-to-text and AI that assembles recognized words into sentences. We also aim to take this project toward becoming a startup, as we are all passionate about its potential to help others in the same situation as our friend without large costs to them.