Inspiration
Learning to communicate is one of the most critical life skills a child develops during the earliest stages of life. In addition, monitoring a child's mental health and providing appropriate emotional support in stressful situations are equally crucial needs that parents are sometimes unable to fulfill. Our socially intelligent toy addresses both of these problem spaces by acting as a conversational partner that supports the child's mental health through heart-rate detection and auditory cues.
What it does
The toy serves as a chatbot that can converse with a child on a daily basis. Its Google Cloud-supported artificial intelligence allows it not only to fulfill basic requests, but also to provide context-appropriate comments that help the child learn basic social and conversational skills. The toy also includes a heart-rate sensor that checks for abnormalities in the child's heart rate, alerts parents, and adjusts its conversational intentions accordingly.
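The abnormality check described above can be sketched roughly as follows. This is an illustrative sketch only: the BPM thresholds, the `alert_parent` hook, and the `adjust_tone` hook are assumptions for demonstration, not the values or interfaces from our actual build.

```python
# Hypothetical sketch of the heart-rate check. The thresholds and
# callback names are illustrative assumptions, not our real config.

NORMAL_BPM_RANGE = (70, 120)  # rough resting range for a young child (assumed)

def classify_heart_rate(bpm, normal_range=NORMAL_BPM_RANGE):
    """Return 'low', 'normal', or 'high' for a single BPM reading."""
    low, high = normal_range
    if bpm < low:
        return "low"
    if bpm > high:
        return "high"
    return "normal"

def check_reading(bpm, alert_parent, adjust_tone):
    """On an abnormal reading, notify the parent and soften the
    chatbot's conversational tone; otherwise do nothing."""
    status = classify_heart_rate(bpm)
    if status != "normal":
        alert_parent(f"Heart rate {status}: {bpm} BPM")
        adjust_tone("calming")  # switch to emotionally supportive responses
    return status
```

In a real deployment the thresholds would need to be age-adjusted and smoothed over a window of readings rather than triggered by a single sample.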
How we built it
We built this project using the Google AIY Voice Kit, along with a pulse sensor to measure heart rate. The Voice Kit processes and generates voice commands and data, and the chatbot generates emotionally supportive and engaging responses. Using TensorFlow and sequence-to-sequence machine learning, we attempted to improve upon existing chatbots by increasing the amount of training data they have access to.
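The data-preparation step for sequence-to-sequence training can be sketched like this: consecutive dialogue turns are paired into (input, target) examples, and targets are framed with start/end tokens as seq2seq decoders expect. The token strings and helper names here are illustrative assumptions, not our exact pipeline.

```python
# Illustrative sketch of turning dialogue transcripts into
# seq2seq training pairs. Token markers are assumed, not canonical.

START, END = "<s>", "</s>"

def build_pairs(turns):
    """Pair each utterance with the reply that follows it,
    yielding (input, target) training examples."""
    return [(turns[i], turns[i + 1]) for i in range(len(turns) - 1)]

def encode_pair(src, tgt):
    """Tokenize a pair and frame the target with start/end markers,
    which the decoder uses to begin and stop generation."""
    src_tokens = src.lower().split()
    tgt_tokens = [START] + tgt.lower().split() + [END]
    return src_tokens, tgt_tokens
```

Each encoded pair would then be mapped to integer IDs and fed to an encoder-decoder RNN in TensorFlow; increasing training data, as described above, simply means feeding more such pairs through this step.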
Challenges we ran into
We ran into various challenges along the way. None of us had much machine learning experience, so we struggled to settle on the best way to train our chatbot. We also struggled to find the most efficient and effective way to parse the databases we found.
Accomplishments that we're proud of
The Google Voice Kit works extremely well and responds to all of our commands. Also, we now have a really cute unicorn stuffed animal.
What we learned
We all got to work with a variety of unfamiliar technology during this hackathon. Although not all of our attempts were successful, trying to interface and connect each of these technologies was an extremely rewarding experience.
What's next for Emotional and Social Support Toy
Next, we want to build a speech-translation system to further help children develop new language skills. We also want to use Google's facial-expression recognition to generate responses based on the person's emotions. Another improvement would be to feed data back into the speech-recognition system daily to further improve the chatbot's ability to communicate with children. We could also expand its application to healthcare contexts: hospitals could keep a few of these chatbots on hand for children who don't have access to the emotional support of a direct family member or friend. The chatbot could converse with the child in language they understand, and act as a friend during that time of distress.
Built With
- arduinouno
- heartratemonitor
- machine-learning
- python
- raspberrypi
- rnn
- seq2seq
- tensorflow
- voicecloud
- voicekit