People with autism often have difficulty identifying emotions and engaging in social conversation. Our product is designed to give such individuals feedback on their ability to maintain eye contact and identify emotions during real conversations.
Our product uses a Raspberry Pi that calls Google's Cloud Vision API to determine the emotions of the individuals in each capture, as well as their level of eye contact. This information is then pushed to a Firebase database, which is referenced by an iOS app that displays the information.
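The Pi-side pipeline described above can be sketched roughly as follows. This is a minimal illustration, not our exact code: the API key and image path are placeholders, and the `dominant_emotion` heuristic (picking the strongest of the four likelihoods Cloud Vision returns per face) is an assumption about how one might reduce the response to a single label.

```python
import base64
import json
from urllib import request

# Placeholder -- a real key would come from the Google Cloud console.
VISION_URL = "https://vision.googleapis.com/v1/images:annotate?key=YOUR_API_KEY"

# Cloud Vision reports emotion likelihoods as these enum strings,
# ordered here from weakest to strongest.
LIKELIHOODS = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE",
               "LIKELY", "VERY_LIKELY"]

def build_request(image_bytes):
    """Build the FACE_DETECTION request body for one camera capture."""
    return {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "FACE_DETECTION", "maxResults": 1}],
        }]
    }

def dominant_emotion(face):
    """Pick the strongest of the four emotion likelihoods in a face annotation."""
    emotions = {
        "joy": face.get("joyLikelihood", "UNKNOWN"),
        "sorrow": face.get("sorrowLikelihood", "UNKNOWN"),
        "anger": face.get("angerLikelihood", "UNKNOWN"),
        "surprise": face.get("surpriseLikelihood", "UNKNOWN"),
    }
    return max(emotions, key=lambda e: LIKELIHOODS.index(emotions[e]))

def annotate(image_bytes):
    """POST the capture to Cloud Vision and return the first face annotation."""
    body = json.dumps(build_request(image_bytes)).encode("utf-8")
    req = request.Request(VISION_URL, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        result = json.load(resp)
    return result["responses"][0]["faceAnnotations"][0]
```

The face annotation also carries head-pose angles (`panAngle`, `tiltAngle`), which is one plausible signal for estimating whether the speaker is being looked at.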
We ran into several challenges throughout the project, such as implementing Cloud Vision on the Raspberry Pi and designing the iOS app given our limited experience. However, we are proud that we were able to create a working IoT product that can potentially help people with autism.
We learned how to use REST APIs and integrate them into a product; furthermore, we were able to create a backend using a realtime database powered by Firebase.
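For example, the Firebase Realtime Database exposes a REST API, so the Pi can append a reading without a dedicated SDK. A rough sketch, assuming a hypothetical database URL, `/readings` path, and record shape:

```python
import json
from urllib import request

# Placeholder -- a real project would use its own firebaseio.com URL.
DB_URL = "https://your-project.firebaseio.com"

def reading_payload(emotion, eye_contact_score):
    """One reading as stored in the database (this shape is our assumption)."""
    return {"emotion": emotion, "eyeContact": eye_contact_score}

def push_reading(emotion, eye_contact_score):
    """Append a reading under /readings. Firebase's REST API treats POST as a
    push(): it stores the payload under an auto-generated child key and
    returns that key in the response body."""
    body = json.dumps(reading_payload(emotion, eye_contact_score)).encode("utf-8")
    req = request.Request(DB_URL + "/readings.json", data=body,
                          headers={"Content-Type": "application/json"},
                          method="POST")
    with request.urlopen(req) as resp:
        return json.load(resp)["name"]
```

On the iOS side, the app can observe the same `/readings` node through the Firebase SDK and update the display whenever a new child appears.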
We hope to create a more dynamic user interface where users can sign in and track their progress.