Sign language is currently recognized with cameras and computer vision. Unfortunately, this means that deaf users need to have that camera-and-algorithm setup everywhere they go in order to communicate with someone who does not know sign language. We want to flip that setup and give ASL users the power themselves, by using our armband to recognize their signs and synthesize them into speech.
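
As a rough illustration of the idea, here is a minimal sketch of the recognize-and-synthesize pipeline we have in mind. The window size, feature choice, classifier, and the `pyttsx3` text-to-speech step are all assumptions for illustration, not the actual implementation; real input would come from the armband's sensor stream.

```python
# Minimal sketch: classify windows of armband sensor data into signs,
# then speak the recognized word aloud. All names, sizes, and the
# placeholder training data below are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
import pyttsx3

WINDOW = 50  # samples per gesture window (assumed)

def features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel features: mean absolute value and variance."""
    return np.concatenate([np.abs(window).mean(axis=0), window.var(axis=0)])

# Placeholder training data: 8-channel EMG-like windows for two signs.
rng = np.random.default_rng(0)
X = np.array([features(rng.normal(loc=l, size=(WINDOW, 8)))
              for l in (0.0, 1.0) for _ in range(20)])
y = ["hello"] * 20 + ["thanks"] * 20

clf = SVC().fit(X, y)

def recognize_and_speak(window: np.ndarray) -> str:
    """Classify one window of sensor data and synthesize it as speech."""
    sign = clf.predict(features(window).reshape(1, -1))[0]
    engine = pyttsx3.init()
    engine.say(sign)
    engine.runAndWait()
    return sign
```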
