Estimates place the number of native ASL signers in the U.S. between 250,000 and 500,000, yet there has long been a shortage of ASL instruction in U.S. schools and colleges due to a lack of certified teachers. As a result, many Deaf and Hard-of-Hearing students face challenges in the classroom and in everyday life: difficulties with hearing aids and lip reading, social isolation, language deficiency, and a lack of support and empathy from others. Seeing this problem, we decided to build this hand-gesture recognition AI to help everyone learn sign language.

Aim & Scope

With this product, we aim to:

  • Bridge the education gap for Deaf and Hard-of-Hearing students.
  • Help address the shortage of sign language instructors and interpreters.
  • Provide sign language learning resources for all students.

Code Explanation

  • The first cell imports the core libraries.
  • The next six cells load the training and testing CSV files (pixel values of hand-gesture images) from Google Drive.
  • The data is then organized and prepared for training.
  • Additional libraries are imported and the neural network is built.
  • The network is trained on the prepared data.
  • The trained model predicts the gesture class for new images.
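The CSV-loading and preparation steps above can be sketched roughly as follows. The column layout (one label column followed by flattened grayscale pixel columns, as in the Sign Language MNIST dataset) and the helper names are assumptions, since the actual notebook cells aren't shown:

```python
import io
import numpy as np

# Hypothetical CSV in Sign Language MNIST style: first column is the
# gesture label, remaining columns are flattened grayscale pixels.
csv_text = "label,pixel1,pixel2,pixel3\n0,0,128,255\n1,255,64,0\n"

def load_gesture_csv(text):
    """Parse a label+pixels CSV into (X, y), scaling pixels to [0, 1]."""
    data = np.genfromtxt(io.StringIO(text), delimiter=",", skip_header=1)
    data = np.atleast_2d(data)
    y = data[:, 0].astype(int)   # gesture class labels
    X = data[:, 1:] / 255.0      # normalize pixel intensities
    return X, y

def one_hot(y, num_classes):
    """Turn integer labels into one-hot rows for softmax training."""
    out = np.zeros((y.size, num_classes))
    out[np.arange(y.size), y] = 1.0
    return out

X, y = load_gesture_csv(csv_text)
Y = one_hot(y, num_classes=2)
```

In the real notebook the CSV would be read from the mounted Google Drive path rather than an in-memory string, but the normalization and one-hot encoding steps are the same idea.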


We believe this product can improve the educational experience for everyone, especially Deaf and Hard-of-Hearing students, and that it will become a convenient, helpful, and easy-to-use learning tool.

*Also created by Fujia Wang, who couldn't join Devpost for some reason.
