Inspiration
In just 10 years, autism diagnoses have increased by nearly 200%. Individuals with autism often find it challenging to manage everyday activities compared to neurotypical individuals. Moreover, around 83% of autistic adults are unemployed due to societal barriers and the difficulties they face in social interaction. For instance, an uncle of one of our team members has autism and has never been able to maintain employment or relationships, relying instead on family support for the past 25 years. Not everyone, however, has such support. Recognizing this critical issue, we identified a strong need for a device that helps individuals with autism better understand how others express emotions and behaviors during social interactions.
What it does
NeuroSync listens to conversations, analyzes the words being spoken, and identifies emotional cues in real time. It then offers real-time guidance by sending audio feedback to the user's headphones, helping them understand emotions and respond appropriately. Additionally, the app includes a gamified exercise feature that presents demo situations in which users practice interpreting emotional cues.
How we built it
We hosted the backend server on Azure and built it in Python, using Flask to handle requests and Ollama for language processing. Speech recognition, audio segmentation, and JSON libraries were integrated to process audio input and analyze speech data. For the front end, we built the app with React Native to ensure mobile compatibility.
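To make the pipeline concrete, here is a minimal sketch of the backend's analysis step. The function names, prompt wording, and JSON response format are our illustrations, not the exact code from the hackathon; the commented-out Flask route at the bottom shows how the helpers would be wired to Ollama.

```python
import json

def build_emotion_prompt(transcript: str) -> str:
    """Ask the language model for a strict-JSON emotion classification."""
    return (
        "Classify the dominant emotion in this utterance and reply only "
        'with JSON like {"emotion": "...", "confidence": 0.0}.\n'
        f"Utterance: {transcript}"
    )

def parse_emotion_reply(reply: str) -> dict:
    """Parse the model's JSON reply, falling back to 'unknown' on bad output."""
    try:
        data = json.loads(reply)
        return {"emotion": str(data["emotion"]),
                "confidence": float(data["confidence"])}
    except (ValueError, KeyError, TypeError):
        return {"emotion": "unknown", "confidence": 0.0}

# Hypothetical Flask wiring (route and model names are assumptions):
#
# from flask import Flask, request, jsonify
# import ollama
#
# app = Flask(__name__)
#
# @app.route("/analyze", methods=["POST"])
# def analyze():
#     transcript = request.get_json()["transcript"]
#     reply = ollama.chat(model="llama3", messages=[
#         {"role": "user", "content": build_emotion_prompt(transcript)},
#     ])["message"]["content"]
#     return jsonify(parse_emotion_reply(reply))
```

Keeping the prompt building and reply parsing as pure functions makes them easy to test without a running model server.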
Challenges we ran into
We ran multiple prototypes against mock situations to get the most confident reading of the emotions involved. Accuracy was crucial: the system required a confidence level above 90% before giving a definite response.
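The confidence gate described above can be sketched in a few lines; the function name and prediction shape here are assumptions for illustration, not the team's actual implementation.

```python
# Only forward an emotion cue to the user's headphones when the model's
# confidence clears the 90% threshold; otherwise stay silent rather than
# risk a wrong prompt.
CONFIDENCE_THRESHOLD = 0.90

def should_send_feedback(prediction: dict) -> bool:
    """Return True when a prediction is confident enough to act on."""
    return prediction.get("confidence", 0.0) > CONFIDENCE_THRESHOLD
```

The design choice matters for this audience: a missed cue is recoverable, but confidently announcing the wrong emotion could actively mislead the user mid-conversation.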
Accomplishments that we're proud of
We improved confidence levels by modifying the program to pick up on sentence-level cues. Tests across a wide range of scenarios all achieved better than 90% accuracy.
What we learned
We gained a deeper understanding of the challenges faced by individuals with autism and the importance of creating accessible solutions. Collaborating as a team taught us the value of communication, efficient task delegation, and leveraging each member’s strengths. We also improved our skills in managing time effectively under tight deadlines, ensuring each phase of development stayed on track.
What's next for 99 - NeuroSync
We plan to pitch the service to government sectors (such as education) and autism foundations to license the product at a subsidized cost to users. Additionally, we plan to conduct mock trials with diverse sample groups to refine the app's accuracy across different social contexts and ensure it delivers reliable, personalized support for end users.