Inspiration

Chinmayee had been wanting to work on a project like this at university for some time, so when we came to BrickHack and saw how many sign language users were around, we figured it was the perfect place to actually do it!

What it does

The AR app built for the Vuzix glasses has two functions. On the main screen, it can detect sign language and display its guess as to what is being signed. When the user swipes to the next screen, they can speak with another glasses wearer and see a live translation of what is being said.

How I built it

We built this project using the Vuzix glasses, a TensorFlow model running on Google Compute Engine, and the Google Translate API.
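For context, here's a minimal sketch of what the glasses-to-server pipeline could look like, assuming the glasses stream JPEG frames over a websocket to a TensorFlow classifier on the Compute Engine instance. The model path, label set, input size, and port below are placeholder assumptions, not our actual values.

```python
# Sketch of a websocket server that receives JPEG frames from the glasses,
# runs a TensorFlow classifier, and replies with the predicted sign.
# Model path, labels, and input size are illustrative assumptions.
import asyncio
import io

import numpy as np
import tensorflow as tf
import websockets
from PIL import Image

MODEL = tf.keras.models.load_model("sign_model")   # assumed SavedModel directory
LABELS = ["hello", "thanks", "yes", "no"]           # assumed label set
INPUT_SIZE = (224, 224)                             # assumed model input size


def classify(jpeg_bytes: bytes) -> str:
    """Decode one JPEG frame and return the most likely sign label."""
    image = Image.open(io.BytesIO(jpeg_bytes)).convert("RGB").resize(INPUT_SIZE)
    batch = np.expand_dims(np.asarray(image, dtype=np.float32) / 255.0, axis=0)
    probabilities = MODEL.predict(batch, verbose=0)[0]
    return LABELS[int(np.argmax(probabilities))]


async def handle_client(websocket):
    # Each binary message is one camera frame; send back the model's guess.
    async for frame in websocket:
        await websocket.send(classify(frame))


async def main():
    async with websockets.serve(handle_client, "0.0.0.0", 8765):
        await asyncio.Future()  # run until cancelled


if __name__ == "__main__":
    asyncio.run(main())
```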

Challenges I ran into

We ran into a few challenges. The version of the glasses we were using does not have Google Play Services support, so we could not directly use the Translate API or Firebase from the device. We also ran into network issues when trying to stream video data.
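The workaround was to route translation through the server instead of the glasses. A hedged sketch of calling the Cloud Translation v2 REST endpoint directly, assuming an API key is available on the Compute Engine instance (the environment variable name and language codes are placeholders):

```python
# Sketch: call the Cloud Translation REST API from the server instead of
# the on-device client, since the glasses lack Google Play Services.
# GOOGLE_API_KEY and the language codes are placeholder values.
import os

import requests

TRANSLATE_URL = "https://translation.googleapis.com/language/translate/v2"
API_KEY = os.environ["GOOGLE_API_KEY"]  # assumed to be set on the server


def translate(text: str, target_language: str = "es") -> str:
    """Translate text with the Cloud Translation v2 REST endpoint."""
    response = requests.post(
        TRANSLATE_URL,
        params={"key": API_KEY},
        json={"q": text, "target": target_language, "format": "text"},
    )
    response.raise_for_status()
    return response.json()["data"]["translations"][0]["translatedText"]


if __name__ == "__main__":
    print(translate("hello", target_language="fr"))
```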

Accomplishments that I'm proud of

Everything! The project, the demo, and the team <3.

What I learned

We all learned a lot, whether it was machine learning, AR development, or websocket work :).

What's next for Babel AR

We plan to continue this at university with a 3D camera and better training data!
