Inspiration
There are 7 billion people on earth, and all of them share one common language - the language of emotions. No matter where we go, even if people don't understand each other's words, they understand emotions. Emotions are the basic essence of human interaction. For most of us, understanding emotions comes naturally. We know when people around us are happy, sad, angry, or fearful. However, for some people, interpreting emotions is a constant struggle.
We are talking about our friends diagnosed with Autism Spectrum Disorder (ASD). According to the Centers for Disease Control and Prevention, 1 in every 54 children is diagnosed with autism, and roughly 50% of people on the spectrum have difficulty interpreting and reacting to emotions.
While there are many solutions out there to help people with disabilities, we couldn't find many for people with autism. More than 300,000 research papers on emotion analysis have been published in the last decade alone, yet many of them never saw the light of day. Why?
Mostly because the resulting solutions are expensive, and not everyone can afford them. With that in mind, we set out to build a cost-effective solution that we call Affectionate.
What it does
Affectionate is a mobile app (currently on Android) that works together with a fitness band to help people on the ASD spectrum interpret emotions. The user - an autistic person - wears a fitness band paired with a smartphone running our app.
Whenever the user is in a conversation, they can turn on voice recording from the band. The band records the audio and sends it to the phone.
First, we perform sentiment analysis on the audio using Natural Language Processing in the cloud. Second, an on-device algorithm identifies the emotion from the sentiment. Third, based on that emotion, the app sends an emoji to the band, so the wearer knows almost instantly what emotion the other person is expressing.
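The second step above, mapping a sentiment result to an emotion and its emoji, could be sketched roughly like this. The thresholds, labels, and class name here are hypothetical placeholders for illustration, not the app's actual algorithm:

```java
import java.util.Map;

public class EmotionMapper {

    // Hypothetical thresholds: map a sentiment score in [0, 1]
    // (e.g. as returned by a cloud sentiment API) to a coarse emotion.
    static String emotionFromSentiment(double score) {
        if (score >= 0.75) return "happy";
        if (score >= 0.40) return "neutral";
        if (score >= 0.15) return "sad";
        return "angry";
    }

    // Emoji the band would display for each emotion label.
    static final Map<String, String> EMOJI = Map.of(
            "happy", "😊",
            "neutral", "😐",
            "sad", "😢",
            "angry", "😠");

    public static void main(String[] args) {
        double score = 0.82; // example score from the sentiment step
        String emotion = emotionFromSentiment(score);
        System.out.println(emotion + " " + EMOJI.get(emotion));
    }
}
```

A real implementation would also combine the cloud text sentiment with the Vokaturi audio result before choosing an emoji.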
How we built it
We used the following tools to build it:
- Android - Java
- Azure Cognitive Services NLP API
- Vokaturi Audio API
Challenges we ran into
- Running sentiment analysis on voice rather than text
- Interpreting the results of the sentiment analysis
- Integrating the Vokaturi API with the Android app and using the sentiment to further classify emotions
- Sending the emoji to the fitness band - this was the toughest part
Accomplishments that we're proud of
First of all, we are happy that we got to learn about autism; in our usual roles at work, we wouldn't have had that chance. Secondly, in less than a week we developed a prototype that detects emotions reasonably well. Lastly, everything was done remotely, giving all of us a taste of how virtual teams function.
What we learned
- We learned how to use Azure Cognitive Services and Vokaturi API
- We understood how Bluetooth pairing works and how to send inputs to fitness bands
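Sending data to a band over Bluetooth Low Energy boils down to packing a small byte payload and writing it to a GATT characteristic (on Android, via `BluetoothGatt#writeCharacteristic`). The sketch below shows only the payload-encoding half; the opcode and frame layout are hypothetical placeholders, not the real Mi Band protocol:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class BandPayload {

    // Hypothetical opcode marking a "show emoji" command for the band.
    static final byte OPCODE_SHOW_EMOJI = 0x05;

    // Encode the command as [opcode][length][UTF-8 emoji bytes] so the
    // band firmware could parse it from a single characteristic write.
    static byte[] encodeEmoji(String emoji) {
        byte[] utf8 = emoji.getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(2 + utf8.length);
        buf.put(OPCODE_SHOW_EMOJI);
        buf.put((byte) utf8.length);
        buf.put(utf8);
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] frame = encodeEmoji("😊");
        System.out.println(frame.length + " bytes"); // 2-byte header + emoji
    }
}
```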
What's next for Affectionate
The next steps for Affectionate would be:
- FY'22 Q4 - UAT and beta testing; we have already shared the app with our autistic friends for initial feedback
- FY'23 Q1 - Roll out to Android store with support for Mi Band
- FY'23 Q2 - Expand support for other fitness bands
- FY'23 Q4 - Develop MVP for iOS and Apple Watch

