We were inspired to create this product by the idea of breaking language barriers between all people. By all people we mean not only people who speak different languages, but also people who are deaf or hard of hearing.

What it does

Our application listens to the person in front of you and renders a 3D speech bubble containing a live transcription of what they are saying. We also added an optional feature aimed at people who know sign language: the app overlays a pair of 3D arms, and when it identifies certain keywords, the arms perform the corresponding gestures as animations.
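The live-transcription half of this can be sketched with Apple's Speech framework. This is a minimal, hedged sketch of the general approach (not our exact implementation): microphone audio is streamed into `SFSpeechRecognizer`, and each partial transcription is handed to a callback that would update the speech bubble. Permission prompts and error handling are omitted for brevity.

```swift
import Speech
import AVFoundation

// Sketch: stream mic audio to SFSpeechRecognizer and forward each
// transcription update to a callback (e.g. the speech-bubble renderer).
final class SpeechListener {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()

    func start(onText: @escaping (String) -> Void) throws {
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        // Feed raw audio buffers from the microphone into the request.
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()
        // Partial results arrive repeatedly as the person keeps talking.
        recognizer?.recognitionTask(with: request) { result, _ in
            if let result = result {
                onText(result.bestTranscription.formattedString)
            }
        }
    }
}
```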

How I built it

We built this application for iOS devices using Swift, together with libraries that gave us access to new technologies such as augmented reality and computer vision, in order to create a pleasant experience for the user.
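Placing the transcription into the 3D scene can be sketched with SceneKit, which ARKit renders through. The function below is a hedged illustration, not our exact code; `bubblePosition` is a hypothetical placeholder for wherever the speaker was located in the AR session.

```swift
import ARKit
import SceneKit
import UIKit

// Sketch: turn a recognized phrase into a 3D text node ("speech bubble")
// positioned in the AR scene near the speaker.
func makeSpeechBubble(text: String, at bubblePosition: SCNVector3) -> SCNNode {
    let textGeometry = SCNText(string: text, extrusionDepth: 0.5)
    textGeometry.font = UIFont.systemFont(ofSize: 8)
    textGeometry.firstMaterial?.diffuse.contents = UIColor.black

    let node = SCNNode(geometry: textGeometry)
    node.scale = SCNVector3(0.002, 0.002, 0.002) // shrink font-point units to meters
    node.position = bubblePosition

    // Keep the bubble facing the camera as the user moves around.
    node.constraints = [SCNBillboardConstraint()]
    return node
}
```

The returned node would be added as a child of the AR scene's root node so it stays anchored in world space.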

Challenges I ran into

We ran into several challenges, from training the model to recognize hand gestures correctly, to getting the augmented-reality elements to appear in the desired place in the 3D environment. We also faced user-experience issues while trying to make the app feel as smooth as possible.

Accomplishments that I'm proud of

We are proud of combining several new and powerful tools into a complex and innovative app in such a short period of time. We're also excited about what our app could be used for.

What I learned

In this hackathon we learned a lot about teamwork, time management, and organization. We also gained experience with new technologies such as the Vision and AR libraries available to iOS developers.

What's next for SpeechBubble

We plan to add more languages to the set available for translation, and to add more arm animations.

Built With

Swift, ARKit, Vision