Inspiration
According to statistics published in the Annals of the Academy of Medicine, Singapore, universal newborn hearing screening found that 3.8 per 1,000 newborns have hearing loss. Assuming a birth rate of 40,000 per year, nearly 3,000 young children aged 19 and below in Singapore were born with hearing loss. This figure excludes the many more with acquired hearing loss. Hearing impairment causes miscommunication with others and prevents participation in many activities in society. It also makes it hard for them to find work and make friends, and it can become a fault line in society. (ref: http://www.annals.edu.sg/cpdMay05.html)
Therefore, hearing-impaired children (HIC) need to receive cochlear implants at an early stage, after which doctors recommend proper training at a therapy centre to improve their hearing and speech. According to Olivia Wee, Senior Auditory-Verbal Therapist (AVT) at Singapore General Hospital, for the training to be effective as a whole, families must also be committed to constantly communicating with their hearing-impaired children so that they learn to listen and speak more; this is an essential part of their development.
However, according to Deaf Singapore, sign language is currently the most suitable communication method for the deaf because medical and educational resources remain limited. In addition, some parents are not conversant in English, have no time to accompany their child to practice, and cannot afford frequent visits to a therapist.
As a result, the deaf and the hearing end up with different languages, unable to understand and care for each other or to share ideas, common values, and experiences with one another.
What it does
As an effective measure for hearing-impaired children, we came up with a creative approach: a program called DeafTalk, which consists of an application providing tutorials and practice, an in-app game called "Jump-For-Fun", and a physical game called Cube-Hub. DeafTalk uses speech recognition to match the recorded voice against the original audio of the pronunciation and check its accuracy, with a waveform display that shows exactly where the pronunciation error is and how far it is from the correct pronunciation, and it uses mouth-shape video recognition to analyse the changes in the shape of the mouth recorded on video. The application also includes an engaging voice-interactivity game. Cube-Hub trains hearing-impaired children to learn phonetics by arranging cubes.
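One way to realize the waveform comparison described above is to align the learner's recording against a reference with dynamic time warping (DTW) and report where the aligned frames diverge. The sketch below is our own illustration under simplifying assumptions, not DeafTalk's actual algorithm: it uses made-up 1-D "energy envelope" sequences in place of real audio features, and the function names are hypothetical.

```python
# Hypothetical sketch: score a learner's pronunciation against a reference
# by aligning two feature sequences with dynamic time warping (DTW).
# A real system would use MFCC frames extracted from audio; here we use
# short 1-D energy envelopes so the example is self-contained.

def dtw_distance(ref, rec):
    """Return the total DTW cost and the alignment path between two sequences."""
    n, m = len(ref), len(rec)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(ref[i - 1] - rec[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    # Backtrack to recover which reference frame each recorded frame aligned to.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = min(cost[i - 1][j - 1], cost[i - 1][j], cost[i][j - 1])
        if step == cost[i - 1][j - 1]:
            i, j = i - 1, j - 1
        elif step == cost[i - 1][j]:
            i -= 1
        else:
            j -= 1
    path.reverse()
    return cost[n][m], path

def worst_frame(ref, rec):
    """Index of the reference frame with the largest aligned deviation,
    i.e. roughly where in the word the pronunciation went wrong."""
    _, path = dtw_distance(ref, rec)
    return max(path, key=lambda ij: abs(ref[ij[0]] - rec[ij[1]]))[0]

if __name__ == "__main__":
    reference = [0.1, 0.8, 0.9, 0.4, 0.1]   # teacher's envelope (made up)
    recording = [0.1, 0.8, 0.2, 0.4, 0.1]   # learner drops the middle peak
    total, _ = dtw_distance(reference, recording)
    print("total deviation:", round(total, 2))
    print("largest error near frame:", worst_frame(reference, recording))
```

The DTW alignment is what lets the comparison tolerate a learner speaking slower or faster than the reference, while the per-frame deviations along the path localize the error, mirroring the waveform feedback described above.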
DeafTalk can be used for adult-assisted (therapist or parent) or completely self-directed learning sessions at home. The lower cost can benefit most families, and through this system HIC can learn to speak effectively, which helps build an inclusive and harmonious society across the diverse communities of the hearing and the deaf.
How we built it
- The Web Portal, designed for AVTs to prescribe exercises and track the progress of HIC.
- The Mobile App, designed for HIC, will:
  - Remind them to perform the prescribed exercises.
  - Guide them through the exercises with e-tutorial videos, 3D mouth-pronunciation animations, and mouth-shape correction.
  - Use phonetic and speech recognition to match the recorded voice against the original audio of the pronunciation and check its accuracy, with a waveform display that locates pronunciation errors and shows the gap between the wrong and correct pronunciation.
  - Use mouth-shape video recognition to rate the user's pronunciation against a standard pronunciation model. The visual feedback draws outlines in different colors (green and red) to mark the correct and incorrect parts of the mouth shape while the user pronounces.
  - Offer a 3D voice-controlled game, "Jump for Fun", to encourage practice and reinforce the memory of their pronunciation.
- The Cube-Hub, designed for HIC to arrange digital, interactive phonetics cubes into various words, reinforcing their learning of vocabulary and phonetics.
- The system is hosted on AWS and backed by the TiDB database.
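The red/green mouth-shape feedback above can be sketched as a per-landmark comparison against a standard pronunciation model. This is an illustrative sketch under our own assumptions: the landmark coordinates and tolerance are made up, and a real system would obtain lip landmarks from a face-landmark detector rather than hard-coded points.

```python
# Hypothetical sketch of the red/green mouth-shape feedback: each tracked
# lip landmark from the user's video frame is compared with the matching
# landmark of a standard pronunciation model, and colored green when it is
# within tolerance and red otherwise.

import math

def grade_mouth_shape(reference, observed, tolerance=5.0):
    """Return a per-landmark color list ('green'/'red') and an overall score."""
    colors = []
    for (rx, ry), (ox, oy) in zip(reference, observed):
        dist = math.hypot(ox - rx, oy - ry)   # pixel distance to the model
        colors.append("green" if dist <= tolerance else "red")
    score = colors.count("green") / len(colors)  # fraction of correct points
    return colors, score

if __name__ == "__main__":
    # Four illustrative lip landmarks: left corner, top, right corner, bottom.
    model = [(30, 50), (50, 40), (70, 50), (50, 60)]
    user  = [(31, 51), (50, 55), (69, 50), (50, 61)]  # top lip too low
    colors, score = grade_mouth_shape(model, user)
    print(colors)   # the mispositioned top-lip landmark shows up as red
    print(f"{score:.0%} of landmarks within tolerance")
```

Drawing each landmark's outline segment in its assigned color would reproduce the visual feedback described in the list above.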
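Cube-Hub's word-forming check can be thought of as a lookup of the arranged phoneme sequence in a lexicon. The sketch below is hypothetical: the three-entry lexicon and the ARPAbet-style phoneme labels are illustrative only, not Cube-Hub's actual data.

```python
# Hypothetical sketch of Cube-Hub's word check: each physical cube carries
# one phoneme; the hub reads the arranged sequence and looks it up in a
# lexicon mapping phoneme sequences to words.

LEXICON = {
    ("K", "AE", "T"): "cat",
    ("D", "AO", "G"): "dog",
    ("S", "AH", "N"): "sun",
}

def check_cubes(cubes):
    """Return the word the cubes spell, or None if the arrangement is not a word."""
    return LEXICON.get(tuple(cubes))

if __name__ == "__main__":
    print(check_cubes(["K", "AE", "T"]))   # a valid arrangement
    print(check_cubes(["T", "AE", "K"]))   # same cubes, wrong order
```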
Challenges we ran into
- Development difficulty: we are the first to use phonetic speech recognition to help HIC improve the accuracy of learning pronunciation, and we adopted a lot of native AI technology.
- We ran many tests and contacted many professionals for advice and support.
- This was a very challenging development project, but it let us give the technical support we should to a group of people worthy of care.
Accomplishments that we're proud of
- We built a stable product system powered by TiDB, including a mobile application with phonetic speech recognition and many gamified features, plus a web portal dashboard to assist Auditory-Verbal Therapists.
- Over 500 testers tried our system and algorithm and helped us test them, and their feedback was very positive and helpful.
What we learned
The technologies we learn can genuinely change lives and serve society, helping to build an inclusive and harmonious society across the diverse communities of the hearing and the deaf.
What's next for DeafTalk
- Build healthcare-focused AI chat to help hearing-impaired children and others who face consultation barriers.
- Develop a solid software-as-a-service business plan, then publish the app so more people can try it.