I (Shane) wanted to create a product with social impact, but I also wanted to build a software project that used advanced technology, preferably AR/VR. While brainstorming on the bus to VandyHacks, it hit me: use AR/VR to help the legally deaf.

What it does

SensesAid is a mobile app that lets the deaf see, in near real time, what the person in front of them is saying. Users can choose to view the world in 2D or in 3D Augmented Reality, or disengage visually with Virtual Reality while still seeing the captions.

How I built it

We built the app in Android Studio, used Google Speech Recognition for speech-to-text with custom algorithms to enhance performance, and used Vuforia with Google Cardboard integration for Augmented and Virtual Reality.
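The writeup doesn't detail the custom speech-to-text enhancements, but one common piece of a live-captioning pipeline is merging the recognizer's streaming partial hypotheses with finalized utterances so the caption view stays stable. A minimal sketch, assuming a hypothetical `TranscriptMerger` helper fed by Android's `SpeechRecognizer` callbacks (`onPartialResults` / `onResults`):

```java
// Hypothetical helper for live captions: the recognizer re-sends the
// whole in-progress hypothesis with each partial result, so we only
// keep the latest one, appended after the utterances already final.
public class TranscriptMerger {
    private final StringBuilder finalized = new StringBuilder();
    private String partial = "";

    // Would be called from onPartialResults(): replace the
    // in-progress tail with the newest hypothesis.
    public void onPartial(String hypothesis) {
        partial = hypothesis;
    }

    // Would be called from onResults(): lock in the finished
    // utterance and clear the partial buffer.
    public void onFinal(String utterance) {
        if (finalized.length() > 0) finalized.append(' ');
        finalized.append(utterance);
        partial = "";
    }

    // Text to render in the caption TextView.
    public String display() {
        if (partial.isEmpty()) return finalized.toString();
        if (finalized.length() == 0) return partial;
        return finalized + " " + partial;
    }
}
```

Keeping finalized text separate from the volatile partial hypothesis prevents the caption from flickering backwards when the recognizer revises a word mid-utterance.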

Challenges I ran into

Vuforia is badly underdocumented, and many different pieces of software had to be integrated. Not to mention that Google Cardboard was not originally meant for AR.

Accomplishments that I'm proud of

We're proud of shipping our first VR/AR application, gaining experience in Unity, and learning more about the Android OS.

What I learned

What's next for SensesAid
