Inspiration
When I (Christian) saw that I could compete at the intersection of AI and Accessible Tech, I was so pumped. My mother works at Nemours Children's Hospital, and she also volunteers her time working with special-needs children and young adults. So naturally I called her up and asked which groups of individuals are not having their daily needs met by technology. A long conversation ensued, but two big takeaways were:
- Adaptive Tech is TOO EXPENSIVE, which means it's not accessible to most people.
- Non-speaking individuals can often connect their thoughts to images before they connect them to words.
What it does
Self-efficacy is the ability to advocate for oneself. ChitChat allows Non-Verbal users to do this using recognizable word icons that output as full sentences. The magic happens when our app turns pieced-together English into eloquent sentences and phrases. Additionally, it outputs natural-sounding English, not robotic speech.
How we built it
Every other AAC (augmentative and alternative communication) device and SGD (speech-generating device) uses proprietary language-processing software that is extremely expensive and inaccessible. Starting from the ground up, we utilized OpenAI's GPT-3.5 API to do the computing and language processing faster and at a fraction of the cost. Finally, we used OpenAI's text-to-speech API, which allows for natural, human-sounding responses.
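The pipeline above can be sketched roughly like this. To be clear, this is an illustrative sketch rather than ChitChat's actual code: the two OpenAI REST endpoints are real, but `buildPrompt`, `iconsToSpeech`, and the prompt wording are our own hypothetical stand-ins.

```typescript
// Hypothetical helper (an assumption, not ChitChat's real code): turn the
// icon keywords a user taps into a prompt for GPT-3.5.
function buildPrompt(iconWords: string[]): string {
  return (
    "Rewrite these keywords as one short, natural first-person English sentence: " +
    iconWords.join(" ")
  );
}

// Sketch of the icons -> sentence -> speech pipeline against OpenAI's REST
// API. Nothing here runs without a valid API key being passed in.
async function iconsToSpeech(
  iconWords: string[],
  apiKey: string
): Promise<ArrayBuffer> {
  const headers = {
    Authorization: `Bearer ${apiKey}`,
    "Content-Type": "application/json",
  };
  // 1. Language processing: GPT-3.5 expands the keywords into a sentence.
  const chatRes = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers,
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: buildPrompt(iconWords) }],
    }),
  });
  const sentence = (await chatRes.json()).choices[0].message.content as string;
  // 2. Speech generation: the text-to-speech endpoint returns natural audio.
  const ttsRes = await fetch("https://api.openai.com/v1/audio/speech", {
    method: "POST",
    headers,
    body: JSON.stringify({ model: "tts-1", voice: "alloy", input: sentence }),
  });
  return ttsRes.arrayBuffer();
}
```

Doing the expansion server-side keeps the expensive language processing off the device, which is part of how the cost stays so far below dedicated hardware.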
Challenges we ran into
It is hard to build a comprehensive word/icon dictionary from scratch. However, with the select keywords and word categories we included, there are thousands of permutations for communicating daily needs.
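The "thousands of permutations from select keywords" idea can be illustrated with a toy data model. The categories and words below are hypothetical examples of ours, not ChitChat's actual dictionary:

```typescript
// Hypothetical icon dictionary: a few keywords per category.
const iconDictionary: Record<string, string[]> = {
  subject: ["I", "we"],
  need: ["want", "need", "feel"],
  object: ["water", "food", "rest", "help", "quiet"],
};

// Count how many keyword combinations the categories allow: the product of
// the category sizes.
function countPermutations(dict: Record<string, string[]>): number {
  return Object.values(dict).reduce((total, words) => total * words.length, 1);
}
// Just 10 icons here already yield 2 x 3 x 5 = 30 sentence combinations,
// so a dictionary with a few dozen words per category reaches thousands.
```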
Accomplishments that we're proud of
Honestly, we are surprised by how well GPT-3.5 worked in our backend. Millions of users interact with the OpenAI "chat bot" every day, but we are excited to see it harnessed as the "engine" behind our language processing.
Additionally, we're proud to beat the competition at their own game. The industry-standard devices cost well over $1,000, and the iPad app versions will run you about $250.
What we learned
Well, for one, we became subject-matter experts on Autism Spectrum Disorder, aphasia (loss of speech from brain damage), and many other speech and language disabilities. We really focused on catering our UX to the actual end users, which required knowledge of how people with Non-Speaking Autism communicate, interact with technology, and process emotions.
What's next for ChitChat
We feel like we've just opened the door to a very exciting future for Accessible Technology. We would love to see more experiments like this, and more people like us who are passionate about using NLP to increase others' quality of life. If ChitChat gains traction in the Adaptive Software community, we would love to see how far it can go as an open-source application.
Built With
- gpt3.5
- nextjs
- node.js
- react
- shadcn
- tailwind
- typescript