The inspiration for this project came from the dedication and hard work of the Deaf community and of the family and friends of its members. Seeing members of the Deaf community discuss their experiences learning ASL, and watching them use platforms such as social media to teach ASL to hearing people, inspired us to create a tool for community members who are beginning to learn ASL. In addition to individuals living with hearing disabilities, we were inspired by the stories of their families and their efforts to learn ASL. We were also inspired by Deaf African Americans who have shared information and knowledge about Black American Sign Language (BASL). Because we found no well-established ASL learning tool that combined learning capabilities with the inclusion of BASL, we felt that ASL Helper could meet that demand.
What it does
ASL Helper serves as an ASL learning tool for individuals interested in developing their sign language skills. It provides several services, including translating words to sign language, teaching and quizzing sign language, and text-to-speech capabilities.
How we built it
We built the bot logic using Azure’s Bot Framework Composer. We integrated Composer with LUIS, Azure’s language understanding service, to process user inputs and control the flow of conversation with the bot. With LUIS connected, we tested the bot in the Bot Framework Emulator. Once we were sure the bot was functioning appropriately, we published it to a web app. We then created a bot channel registration and linked it to the web app’s endpoint, and tested the web chat on the bot channel registration to make sure the bot worked properly through the user interface. Once we confirmed the UI was good, we attached a Direct Line Speech channel. Then we created a static HTML website that uses the Direct Line Speech credentials to speak to the bot. Finally, we hosted that site in Azure Blob Storage.
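As a rough sketch of the final step, a static page along these lines can embed Bot Framework Web Chat and connect it to the bot over Direct Line Speech. The region and authorization token below are placeholders, not our actual credentials; in practice they come from the Direct Line Speech channel configured on the bot channel registration:

```html
<!DOCTYPE html>
<html>
  <body>
    <div id="webchat" style="height: 100%"></div>
    <!-- Bot Framework Web Chat CDN bundle (includes speech support) -->
    <script src="https://cdn.botframework.com/botframework-webchat/latest/webchat.js"></script>
    <script>
      (async () => {
        // createDirectLineSpeechAdapters returns both the chat adapter and
        // the speech ponyfill Web Chat uses for voice input and output.
        const adapters = await window.WebChat.createDirectLineSpeechAdapters({
          fetchCredentials: async () => ({
            // Placeholder values for illustration only.
            region: 'westus',
            authorizationToken: '<YOUR_AUTHORIZATION_TOKEN>'
          })
        });

        // Render the chat UI into the page, wired up for speech.
        window.WebChat.renderWebChat(
          { ...adapters },
          document.getElementById('webchat')
        );
      })();
    </script>
  </body>
</html>
```

A page like this can be uploaded as-is to Azure Blob Storage static website hosting, since everything runs client-side.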
Challenges we ran into
While building out the bot logic in Bot Framework Composer was straightforward, we had many difficulties publishing and deploying the bot. It was hard to fill in all the publishing fields with the appropriate keys and names. Upon further investigation, we also found that the deployment process for Composer is new and still being worked on, so getting the bot deployed on Azure was definitely the toughest challenge we faced.
Accomplishments that we're proud of
We are proud that we created a genuinely useful assistant that can have an immediate impact on helping people learn and communicate in sign language.
What we learned
Through our research on ASL learning, the Deaf community, and the BASL dialect, we have learned how diverse and passionate the Deaf community is. ASL is a wonderful language that is unfortunately not well known, as is BASL. By creating this project, we have learned that there are opportunities for people in technology to build tools that support both of these communities in their endeavors to learn ASL and that highlight the diversity within the Deaf community. We have also learned about the difficulty of creating a project of this caliber. While ASL Helper may seem intuitive in practice, creating a project of this scale required a great deal of research, feedback, and hard work. In addition to learning about the work required, however, we also learned how rewarding it is to create a tool like ASL Helper. We envision a future in which, regardless of background, individuals can use ASL Helper to fluently communicate in ASL and BASL.
To test our bot, there are two different user interfaces people can use: a web chat and a website hosting the web chat. The website hosting the web chat is slower to respond but has speech capabilities. To test the web chat, click here. To test the website hosting the web chat, click here.
Also, when first loading the bot, it may take a moment to start up or may need an input such as "..." before it starts working.
Envisioned Use Cases for Education
We envision that this bot will help students who are deaf better adapt to their daily lives. In terms of immediate effects on the student, the speech->text->sign language pipeline lets deaf students follow along with lectures and lessons in class through immediate sign language translations.
Deaf students can also communicate with non-deaf students using our bot's translate and speech functions. For example, a non-deaf student may say something; the bot catches it and shows the deaf student both the word and the sign language equivalent of what was said. The deaf student can then type a response and use the speak functionality to reply, creating a seamless conversation through the bot's functionality.
The bot's easy-to-use UI can also help young deaf students and kids learn sign language quickly in childhood. For non-deaf people who want to learn sign language, our bot likewise provides an easy avenue to do so, which would improve the overall experience for deaf people as more people become able to communicate with them in sign language.
Finally, this bot also aims to improve the overall education experience for users of other dialects of ASL, such as BASL, by shining a light on BASL and teaching it. Just as a student should not be isolated because of their accent, a student should not be isolated by their dialect of ASL. We believe that raising awareness of these dialects will improve the lives of students who use them, as it allows people to recognize that these dialects exist and to adjust their interactions to a student's choice of dialect appropriately.
What's next for ASL Helper
We see a lot of future potential for ASL Helper and many changes that could improve the bot. The first thing we would like to do is reach out to organizations with online sign language libraries and see if we could link our bot to them. That way, our bot's translations would be more accurate and it would have a larger library available for translating, quizzing, and teaching sign language. We would also like to seek feedback from people knowledgeable in BASL in order to develop the bot in a way that better integrates BASL.
Because of time constraints and the nature of the pandemic, we were not able to test our bot with people who are deaf and receive appropriate feedback. In the future, we would like to try it out with deaf users, see their responses, and adjust the bot according to their feedback.
Finally, we would like to improve childhood learning by adding pictures to the bot that show the word being taught or translated. For example, if "apple" is the word being translated, we would show a picture of an apple in addition to the sign language. That way, children can visualize the words they are learning.
We would like to acknowledge that ASL Helper would not be possible without several resources. In particular, we would like to thank Nakia Smith and Carolyn McCaskill for their BASL user videos, and Bill Vicars for our American Sign Language videos.