Inspiration
Tools like Google Translate, Duolingo, and Babbel exist to help language learners practice their target language and receive instant feedback on their progress. However, no such tool exists for providing feedback to students learning American Sign Language. Technology has not yet advanced to the point where a fully-fledged American Sign Language machine translation system is possible.
As someone who has taken several ASL classes, I've observed that this technological limitation makes it very time-consuming for instructors to provide detailed, individualized feedback to each student. It also limits the number of "skill and drill" practice exercises an ASL instructor can assign. A student may learn a sign incorrectly and internalize the error before they have the opportunity to receive feedback.
What it does
Simon Signs is an educational MR app to help students practice ASL vocabulary words through instant, individualized feedback. This is not meant as a substitute for professionally taught ASL classes, but rather an asynchronous vocabulary practice tool. A 3D teacher avatar, Simon, demonstrates the proper way to sign a word. The student copies the sign in a sort of "Simon Says" game, while paying close attention to proper form. If their form is incorrect, the discrepancies will be highlighted on Simon's hands, and the student can try again with this new feedback in mind.
How we built it
While we didn't build a prototype, we imagine this relying heavily on the Pico Hand Pose Generator Script. The hand pose configuration provided by this interface is remarkably detailed, and seems tailor-made to track several critical parameters of ASL signs:
- Handshape: How the hand joints are curled, flexed, and abducted, and how bones are positioned relative to each other
- Location: Where the hands are located relative to the signer's body
- Orientation: Which way the palms are facing relative to the signer
Each of these parameters is important for identifying whether a student has signed a word correctly.
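To make the idea concrete, here is a minimal sketch of how a static pose check might work: compare a student's tracked joint values against a reference pose within a tolerance, and report the joints that are off so they can be highlighted on Simon's hands. Everything here (`JointState`, `find_discrepancies`, the tolerance values) is illustrative and hypothetical, not part of the Pico SDK.

```python
# Hypothetical sketch of static pose matching; names and thresholds are
# assumptions, not the Pico Hand Pose Generator Script's actual API.
from dataclasses import dataclass

@dataclass
class JointState:
    curl: float       # 0.0 = fully extended, 1.0 = fully curled
    abduction: float  # spread from the neighboring finger, in degrees

def find_discrepancies(student, reference, curl_tol=0.15, abd_tol=10.0):
    """Return names of joints whose curl or abduction is outside tolerance."""
    wrong = []
    for name, ref in reference.items():
        obs = student[name]
        if (abs(obs.curl - ref.curl) > curl_tol
                or abs(obs.abduction - ref.abduction) > abd_tol):
            wrong.append(name)
    return wrong

# Toy example: a fist-like handshape keeps the fingers fully curled.
reference_fist = {
    "index":  JointState(curl=0.9, abduction=0.0),
    "middle": JointState(curl=0.9, abduction=0.0),
}
student_attempt = {
    "index":  JointState(curl=0.4, abduction=0.0),   # not curled enough
    "middle": JointState(curl=0.88, abduction=2.0),  # close enough
}
print(find_discrepancies(student_attempt, reference_fist))  # -> ['index']
```

The returned joint names are exactly what the app would need in order to highlight the incorrect parts of the sign on the avatar.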
Challenges we ran into
There are a few gaps in the Pico hand tracking API that would need to be closed to recognize the wide range of ASL signs:
- Tracking the distance between joints on separate hands
- A way to define movement within a single sign, perhaps by defining pose "keyframes"
- Handling occlusion, where one hand or joint blocks the camera's view of another
- Recognizing signs made close to the body
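The "keyframes" idea above can be sketched briefly: model a moving sign as an ordered list of static poses, and accept the sign only if the student's tracked frames pass through every keyframe in order. The `matches` helper stands in for whatever static-pose check the hand-tracking API would provide; all names here are illustrative.

```python
# Hypothetical sketch of keyframe-based movement recognition; the data
# model and function names are assumptions for illustration only.
def matches(observed, keyframe, tol=0.1):
    """True if every tracked value is within tolerance of the keyframe."""
    return all(abs(observed[j] - keyframe[j]) <= tol for j in keyframe)

def track_sign(frames, keyframes, tol=0.1):
    """True if the stream of frames hits each keyframe in order."""
    idx = 0
    for frame in frames:
        if idx < len(keyframes) and matches(frame, keyframes[idx], tol):
            idx += 1
    return idx == len(keyframes)

# Toy example: a sign whose index-finger curl goes open -> closed -> open.
keyframes = [{"index_curl": 0.0}, {"index_curl": 1.0}, {"index_curl": 0.0}]
observed = [{"index_curl": 0.02}, {"index_curl": 0.5},
            {"index_curl": 0.97}, {"index_curl": 0.05}]
print(track_sign(observed, keyframes))  # -> True
```

A real implementation would also need timing constraints and path checks between keyframes, but ordered pose matching captures the core of the gap we identified.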
What's next for Simon Signs
We'd need to consult with ASL teaching professionals to develop lesson plans/levels for Simon Signs. Linguists, psychologists, and education specialists would also have valuable insights into how we could best gamify this practice tool.