We saw how difficult it is for blind and deaf people to interact with the world. We wanted to make their lives easier by building an iOS app and an Android app that help them communicate with the people around them.

What it does

We built two versions of the app: a text-to-sign-language translator that helps blind people communicate with deaf people, and an ML app that detects a person's sign language and transcribes the signs into written English that the general public can understand.

How we built it

As a team, we each specialized in different things: Sahith was proficient in Xcode, so he built the iOS app; Yash, proficient in Android Studio and the Firebase backend, built the Android side; and Tanish built our web structure and CSS. Together, our services help the community in several ways. We used Firebase to link our app and website, while our second app offers additional ways to help people.

Challenges we ran into

Matching each detected sign gesture to the correct character.
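The matching step can be sketched as picking the highest-scoring class from the model's output and accepting it only above a confidence threshold. This is a minimal illustration, not the project's actual code; the label set and threshold are hypothetical:

```python
# Hypothetical sketch: map a sign classifier's per-class scores to a character.
# LABELS and the 0.6 threshold are illustrative, not from the actual app.
LABELS = ["A", "B", "C", "D"]

def sign_to_char(scores, threshold=0.6):
    """Return the best-matching character, or None if the model is unsure."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    if scores[best] < threshold:
        return None  # ambiguous sign: better to skip than to guess wrong
    return LABELS[best]

print(sign_to_char([0.05, 0.85, 0.05, 0.05]))  # confident match: "B"
print(sign_to_char([0.30, 0.30, 0.20, 0.20]))  # too ambiguous: None
```

Rejecting low-confidence frames is one simple way to reduce the wrong-character problem described above.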

Accomplishments that we're proud of

Solving this issue and producing two fully working apps on iOS and Android.

What we learned

We learned how important Firebase is for pushing information to the website on the fly; our iOS app relies on it because it transcribes signs in real time.
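The real-time pattern can be illustrated with a tiny in-memory stand-in: each time the app writes a new transcribed character, every registered listener (such as the website) is notified immediately. This is only a sketch of the idea; the project itself uses Firebase, not this class:

```python
# Illustrative stand-in for the real-time sync pattern (the project uses
# Firebase for this). Writers push updates; subscribers are notified at once.
class RealtimeStore:
    def __init__(self):
        self.transcript = ""
        self.listeners = []

    def on_change(self, callback):
        """Register a callback to run whenever the transcript changes."""
        self.listeners.append(callback)

    def append_char(self, char):
        """Append a transcribed character and notify all subscribers."""
        self.transcript += char
        for callback in self.listeners:
            callback(self.transcript)

store = RealtimeStore()
received = []
store.on_change(received.append)  # the website "subscribes" to the transcript
store.append_char("H")
store.append_char("I")
print(received)  # each write was pushed to the subscriber immediately
```

Firebase's Realtime Database provides the same publish/subscribe behavior across devices, which is why it suits a live transcriber.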

What's next for Aid-Ware

Correcting the grammar of the person doing the hand signs, which could act like an "auto-correct" for deaf people.
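One simple way this planned auto-correct step could start is by snapping each transcribed word to the closest entry in a vocabulary. This sketch uses Python's standard-library `difflib`; the vocabulary and cutoff are hypothetical, not part of the project:

```python
# Hypothetical sketch of the planned "auto-correct": snap a transcribed word
# to the closest known word. VOCAB and the 0.6 cutoff are illustrative.
from difflib import get_close_matches

VOCAB = ["hello", "help", "world", "thanks"]

def autocorrect(word, cutoff=0.6):
    """Return the closest vocabulary word, or the input if nothing is close."""
    matches = get_close_matches(word.lower(), VOCAB, n=1, cutoff=cutoff)
    return matches[0] if matches else word

print(autocorrect("helo"))   # close to a known word, so it gets corrected
print(autocorrect("xyzzy"))  # no close match, so it is left unchanged
```

A full grammar-correcting version would need sentence-level context, but word-level snapping like this is a common first pass.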
