Our mission is to foster a culture of understanding, a culture where people of diverse backgrounds get to truly connect with each other. But how can we reduce the barriers that exist today and make the world more inclusive? Our solution is to bridge the communication gap between people of different races and cultures, and between people of different physical abilities.

What we built

In 36 hours, we created a mixed reality app that allows everyone in the conversation to communicate using their most comfortable method:

You want to communicate using your mother tongue? Your friend wants to communicate using sign language? Your aunt is hard of hearing and she wants to communicate without that back-and-forth frustration?

Our app enables everyone to do that.

How we built it

VRbind takes in speech and converts it into text using the Bing Speech API. That text is then translated into your mother tongue using the Google Translate API and played back as speech through the built-in speaker on the Oculus Rift. We also provide a platform where the user can communicate using sign language: gestures are detected with the Leap Motion controller and interpreted as English text. That text is likewise translated into your mother tongue and played back as speech through the Oculus Rift.
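The flow above can be sketched as a simple pipeline. This is a minimal illustration only: the function bodies below are hypothetical stand-ins for the real Bing Speech, Google Translate, and Rift audio calls (none of the names or signatures come from the actual project, which was written in C# for Unity).

```python
# Sketch of the VRbind pipeline. All functions are hypothetical placeholders
# standing in for the real cloud services used in the project.

def speech_to_text(audio):
    # Stand-in for the Bing Speech API (speech recognition).
    return audio["transcript"]

def sign_to_text(gesture):
    # Stand-in for Leap Motion gesture detection interpreted as English text.
    return gesture["meaning"]

def translate(text, target_lang):
    # Stand-in for the Google Translate API; a toy dictionary for illustration.
    toy = {("hello", "es"): "hola"}
    return toy.get((text, target_lang), text)

def text_to_speech(text):
    # Stand-in for synthesis played through the Oculus Rift's built-in speaker.
    return f"<audio:{text}>"

def relay_speech(audio, listener_lang):
    """Spoken input -> recognized text -> translated text -> spoken output."""
    return text_to_speech(translate(speech_to_text(audio), listener_lang))

def relay_sign(gesture, listener_lang):
    """Signed input -> English text -> translated text -> spoken output."""
    return text_to_speech(translate(sign_to_text(gesture), listener_lang))
```

Both input modes converge on the same translate-then-synthesize path, which is what lets every participant use their preferred method while hearing the conversation in their own language.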

Challenges we ran into

Our program runs in Unity, so the main challenge was adapting all of our APIs to work in C#.

Accomplishments that we are proud of

We are proud that we completed all the essential features we set out to implement, and that we successfully troubleshot the problems we ran into throughout the competition.

What we learned

We learned how to code in C#, as well as how to select, implement, and integrate different APIs on the Unity platform.

What's next for VRbind

Emotion analysis of the person you are speaking with, based on their face, voice, and body language.
