Web app that detects ASL gestures through a Leap Motion controller and outputs the corresponding letters.
According to a 2005 study by Gallaudet University, two to four out of every 1,000 people in the United States are functionally deaf. Many deaf people use sign language because spoken languages can be difficult to understand and use. We decided to build a project that makes deaf people's lives easier and more interactive.
How we built it
IBM Bluemix (Node-RED, Text to Speech, IBM Watson IoT)
GitHub (team collaboration)
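As a rough sketch of how these pieces can fit together, the snippet below formats a detected letter as a device event and publishes it to IBM Watson IoT over MQTT, using the platform's standard client-ID and `iot-2/evt/.../fmt/json` topic conventions. The org ID, device type, device ID, auth token, and event name are placeholders, and the `mqtt` npm package is assumed; this is illustrative, not our exact integration.

```javascript
// Sketch: publish a detected ASL letter to IBM Watson IoT as a device event.
// ORG_ID, DEVICE_TYPE, DEVICE_ID, and the auth token below are placeholders.

// Watson IoT expects device client IDs of the form d:<org>:<type>:<device>.
function watsonClientId(orgId, deviceType, deviceId) {
  return `d:${orgId}:${deviceType}:${deviceId}`;
}

// Device events are published on topics of the form iot-2/evt/<eventId>/fmt/json.
function eventTopic(eventId) {
  return `iot-2/evt/${eventId}/fmt/json`;
}

// Wrap the detected letter in the JSON envelope Watson IoT devices send ({ d: ... }).
function letterPayload(letter) {
  return JSON.stringify({ d: { letter: letter, ts: Date.now() } });
}

// Hypothetical publish helper; needs the `mqtt` npm package and real credentials.
function publishLetter(orgId, deviceType, deviceId, token, letter) {
  const mqtt = require('mqtt'); // npm install mqtt
  const client = mqtt.connect(
    `mqtts://${orgId}.messaging.internetofthings.ibmcloud.com`,
    {
      clientId: watsonClientId(orgId, deviceType, deviceId),
      username: 'use-token-auth', // Watson IoT's fixed username for token auth
      password: token,
    }
  );
  client.on('connect', () => {
    client.publish(eventTopic('gesture'), letterPayload(letter), () => client.end());
  });
}
```

On the Bluemix side, a Node-RED flow can subscribe to the same event, pass the letter through the Text to Speech node, and play the result.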
Challenges we ran into
Connecting a third-party service to IBM Watson Internet of Things
Classifying similar gestures using the Leap Motion API
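One simple way to approach the gesture-classification problem is nearest-neighbor matching on a small feature vector, such as which of the five fingers the Leap Motion reports as extended. The templates below are simplified illustrations (not a full ASL model), and the mapping from Leap frames to feature vectors is only sketched in a comment; this is an assumption-laden sketch, not our production classifier.

```javascript
// Sketch: tell similar gestures apart by nearest-neighbor matching on a
// 0/1 feature vector of extended fingers [thumb, index, middle, ring, pinky].
// These letter templates are simplified for illustration.
const TEMPLATES = {
  B: [0, 1, 1, 1, 1], // thumb folded, four fingers extended
  D: [0, 1, 0, 0, 0], // index finger only
  W: [0, 1, 1, 1, 0], // index, middle, and ring fingers
  Y: [1, 0, 0, 0, 1], // thumb and pinky
};

// Hamming distance between two equal-length 0/1 feature vectors.
function distance(a, b) {
  let d = 0;
  for (let i = 0; i < a.length; i++) d += a[i] === b[i] ? 0 : 1;
  return d;
}

// Return the template letter closest to the observed finger pattern.
function classify(fingers) {
  let best = null;
  let bestDist = Infinity;
  for (const [letter, template] of Object.entries(TEMPLATES)) {
    const d = distance(fingers, template);
    if (d < bestDist) {
      bestDist = d;
      best = letter;
    }
  }
  return best;
}

// With the Leap JavaScript API, the feature vector can be derived from a
// frame's fingers, each of which exposes an `extended` boolean:
//   const fingers = frame.hands[0].fingers.map(f => (f.extended ? 1 : 0));
```

Letters whose finger patterns are nearly identical (for example fist shapes that differ only in thumb position) would need extra features, such as palm orientation or fingertip positions, to separate reliably.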