Navigation apps available today are not designed for people with physical disabilities.

What it does

It helps people with multiple disabilities (deaf-blind, blind and non-verbal, or all three) navigate on their own.

How we built it

We plan to use the HERE APIs to build an app paired with a smart glove embedded with sensors. The glove converts sign language into text, which serves as input for the app, and the app then helps the user choose the right direction.
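As a minimal sketch of the routing side, the snippet below builds a pedestrian route request for the HERE Routing API v8. The endpoint and parameter names follow HERE's public documentation; the coordinates and `YOUR_API_KEY` placeholder are illustrative, and the glove integration around this call is our assumption.

```python
from urllib.parse import urlencode

# HERE Routing API v8 endpoint (per HERE's public documentation).
ROUTING_ENDPOINT = "https://router.hereapi.com/v8/routes"

def build_route_request(origin, destination, api_key):
    """Return the request URL for a walking route between two (lat, lon) pairs."""
    params = {
        "transportMode": "pedestrian",               # walking directions for the user
        "origin": f"{origin[0]},{origin[1]}",
        "destination": f"{destination[0]},{destination[1]}",
        "return": "polyline,actions,instructions",   # turn-by-turn actions for cues
        "apiKey": api_key,
    }
    return f"{ROUTING_ENDPOINT}?{urlencode(params)}"

# Example: a route request between two points in Bengaluru (illustrative values).
url = build_route_request((12.9716, 77.5946), (12.9352, 77.6245), "YOUR_API_KEY")
```

The app would send this request and read the returned turn-by-turn actions, which the glove can then translate into non-visual, non-auditory cues.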

Challenges we ran into

Understanding the APIs, scoping the problem statement, and figuring out how to deliver directional cues to a user without relying on sound or vision.
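One way to deliver cues without sound or vision is haptics. This sketch maps routing actions to vibration patterns on the glove; the action names, motor labels, and pulse counts are all our assumptions for illustration, not part of any HERE API.

```python
# Hypothetical mapping from routing actions to glove vibration commands,
# expressed as (motor, pulse_count) pairs.
VIBRATION_PATTERNS = {
    "turnLeft":  [("left_motor", 1)],    # one pulse on the left side
    "turnRight": [("right_motor", 1)],   # one pulse on the right side
    "continue":  [("both_motors", 1)],   # single pulse = keep going straight
    "arrive":    [("both_motors", 3)],   # three pulses = destination reached
}

def cue_for_action(action):
    """Map a routing action name to a list of (motor, pulse_count) commands."""
    # Two pulses on both motors signals an unrecognized action.
    return VIBRATION_PATTERNS.get(action, [("both_motors", 2)])
```

For example, a "turnLeft" action from the route response would trigger a single pulse on the left-hand motor, so the user feels the turn direction directly.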

Accomplishments that we're proud of

We came up with an idea that can potentially solve the problems disabled people face in self-navigation.

What we learned

We learned how to use the HERE APIs and how to plan a project accordingly.

What's next for Medusa

After its first hackathon, team Medusa plans to polish the idea further and prepare for future hackathons.
