Inspiration

Deaf people have their own language of communication, sign language, and many of them read and write faster in sign writing than in written English. Deaf people do not have access to many of the facilities that hearing people take for granted. Over 5% of the world's population, or 466 million people, has disabling hearing loss (432 million adults and 34 million children), and the natural language of around 500,000 deaf people in the US and Canada is American Sign Language (ASL). For many people who have been profoundly deaf from a young age, signing is their first language, so they learn to read and write English as a second language. As a result, many deaf people have below-average reading abilities for English text and prefer to communicate using sign language. A comprehensive approach to enabling people who cannot sign to communicate in sign language would clearly require a general-purpose speech-to-sign-language converter. This in turn requires solving the following problems:

  1. Automatic speech to text conversion (speech Recognition).
  2. Automatic translation of English text into a suitable representation of sign language.
  3. Display of this representation as a sequence of Signs using computer graphics techniques.

For now, our focus is on solving the second problem.

The most commonly used sign representation is International SignWriting, developed by Sutton Movement Writing. Its advantages:

  1. It aims to enable writing every sign or signed sentence of every country.
  2. Because the writing is pictorial, the natural shapes and movements of signs can be shown realistically.
  3. Facial expressions and body movements can be depicted, too.

SignWriting example

Using International SignWriting, deaf people can write and read any sign language in the world, and most feel very comfortable with it. The main problem is that there is no computerized translator that converts English sentences to SignWriting.

What it does

AVA is an experimental system that translates English sentences to SignWriting. The system takes an English sentence from the user, translates it to SignWriting in a 1D, computer-understandable form using machine translation, then converts it to 2D and displays the result. For example, AVA translates the sentence "i got it" to ASL SignWriting (1D) and then renders it as the sentence "i got it" in SignWriting (2D, human-understandable).

How we built it

Dataset: First of all, we gathered English–SignWriting documents from different sources such as the SignWriting Wikipedia and the SignBank website, then extracted English–1D SignWriting sentence pairs and stored them in a file, data.txt. A computer can easily process 1D SignWriting. The pairs have the format: English sentence – SignWriting (see the data set format example).

Note: for this demo we prepared only 1,095 pairs of short sentences.
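A minimal sketch of how the pairs in data.txt could be loaded for training. The tab-separated layout, the `load_pairs` helper name, and the sample 1D (FSW-style) string below are assumptions for illustration, not the exact format we committed:

```python
def load_pairs(path="data.txt"):
    # Assumed format: one pair per line, the English sentence and its
    # 1D SignWriting string separated by a tab. English is lowercased
    # to reduce vocabulary size for the seq2seq model.
    pairs = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            english, signwriting = line.split("\t", 1)
            pairs.append((english.lower(), signwriting))
    return pairs
```

Each pair then feeds the encoder (English side) and decoder (SignWriting side) during training.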

Model Creation in PyTorch

We built a sequence-to-sequence neural network model for translating English to SignWriting by following this tutorial. The code we used to create and save our model (encoder, decoder) is available in this Jupyter notebook.
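The encoder/decoder pair can be sketched as below, in the style of the PyTorch seq2seq tutorial we followed. Sizes and class names here are illustrative, not our exact notebook code:

```python
import torch
import torch.nn as nn

class EncoderRNN(nn.Module):
    """Reads the English sentence one token at a time into a hidden state."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        self.embedding = nn.Embedding(input_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size)

    def forward(self, token, hidden):
        embedded = self.embedding(token).view(1, 1, -1)
        output, hidden = self.gru(embedded, hidden)
        return output, hidden

class DecoderRNN(nn.Module):
    """Emits 1D SignWriting tokens from the encoder's final hidden state."""
    def __init__(self, hidden_size, output_size):
        super().__init__()
        self.embedding = nn.Embedding(output_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size)
        self.out = nn.Linear(hidden_size, output_size)

    def forward(self, token, hidden):
        embedded = torch.relu(self.embedding(token).view(1, 1, -1))
        output, hidden = self.gru(embedded, hidden)
        return self.out(output[0]), hidden
```

Training loops over the sentence pairs, feeding the encoder's final hidden state to the decoder, exactly as in the tutorial.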

Deploying the Model using Flask

We deployed our model using Flask (app.py) and created two HTML files, index.html and 2d.html. The user types an English sentence in index.html and clicks the translate button; the model generates the 1D ("formal") SignWriting translation, which is not very human-readable. To see the 2D form, the user clicks the "view in 2D" button, which opens the second file (2d.html) and shows the result in 2D, the writing form that most deaf people use daily. In 2d.html we used the JavaScript code developed by Steve Slevinski to convert the 1D output of our model to 2D.
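A minimal sketch of how app.py might wire the model to the two pages. The `translate_sentence` helper is hypothetical shorthand for running the saved encoder/decoder; the route names are assumptions:

```python
from flask import Flask, render_template, request

app = Flask(__name__)

def translate_sentence(sentence):
    # Hypothetical helper: tokenize the input, run the saved
    # encoder/decoder, and return the 1D SignWriting string.
    raise NotImplementedError

@app.route("/", methods=["GET", "POST"])
def index():
    translation = None
    if request.method == "POST":
        translation = translate_sentence(request.form["sentence"])
    return render_template("index.html", translation=translation)

@app.route("/2d")
def view_2d():
    # 2d.html uses Steve Slevinski's JavaScript to render 1D -> 2D.
    return render_template("2d.html", fsw=request.args.get("fsw", ""))

if __name__ == "__main__":
    app.run(debug=True)
```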

Challenges we ran into

1. Gathering the data set was very challenging; we spent a lot of time understanding SignWriting, finding sources, extracting sentences from those sources, changing their format (2D to 1D), and preparing the data set for training. 2. When creating the model and deploying it with Flask we faced a few small challenges, but we figured out how to solve them. 3. We tried to host our Flask app on Microsoft Azure cloud services but ran into many problems: the app works on our local computer but not in the cloud. :( We will figure it out in the future.

Accomplishments that we're proud of

We are proud that we developed a concept that can be used to build fully functional applications and websites for deaf people. As a result, deaf people would be able to use their own language to communicate with hearing people.

What we learned

We learned the process of creating, deploying, and hosting a PyTorch model, and we are now familiar with sign language and SignWriting ;).

What's next for AVA - toward a Machine Translation Platform for Deaf

First of all, we will improve our data set: our plan is to create a large data set (more than 20,000 sentences) and use it to build a fully functional website and mobile app that translates any English sentence to SignWriting and vice versa. Next, we will create a mobile application to help deaf people understand what hearing people say. This in turn requires solving the following problems:

  1. Automatic speech to text conversion (speech Recognition).
  2. Automatic translation of English text into a suitable representation of sign language (using the AVA ML model).
  3. Display of this representation as a sequence of signs using computer graphics techniques (a signing avatar).

The system will get text or speech from the smartphone, convert it to sign language using machine translation, and represent the signs using a 3D character. We will also work on other sign languages, as most sign languages in the world can be written in SignWriting.
