Machine Learning Sign Language Application

This is a web app built with React and Flask that makes full use of the IBM Watson machine learning API.

It uses a custom machine learning model, trained on over 1600 images of the American Sign Language alphabet, to let users translate sign language gestures they are unfamiliar with into text.
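As a rough illustration of how such a pipeline might consume Watson's output: the Visual Recognition `classify()` call returns a JSON result containing scored classes, and the app would pick the highest-scoring letter. This is a minimal sketch based on Watson's documented response shape, not the project's actual code; the function name and sample data are hypothetical.

```python
# Hypothetical helper: extract the most confident ASL letter from a
# Watson Visual Recognition classify() response. The response shape
# follows Watson's documented output; all names here are illustrative.

def top_letter(watson_response: dict) -> str:
    """Return the class label with the highest confidence score."""
    classes = watson_response["images"][0]["classifiers"][0]["classes"]
    best = max(classes, key=lambda c: c["score"])
    return best["class"]

# Truncated example of the response structure:
sample = {
    "images": [{
        "classifiers": [{
            "classes": [
                {"class": "A", "score": 0.12},
                {"class": "B", "score": 0.81},
            ]
        }]
    }]
}

print(top_letter(sample))  # B
```

The Flask backend would pass the uploaded image to the trained classifier and return the predicted letter to the React frontend for display.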

In the future, this project can be extended to include many more gestures beyond the alphabet characters, and to interpret dynamic gestures involving movement in addition to static gestures.

#CANSOFCOM
