Inspiration
Many children in the US are on the autism/Asperger spectrum and have difficulty with things many of us take for granted, such as easily recognizing facial expressions. A web app that works on any device, paired with Microsoft's powerful Azure Cognitive Services, gave us an opportunity to create an easily accessible platform where they can practice this skill and get a little extra help in a fun way.
What it does
Our multi-platform app is built around the user pointing their phone camera at someone's face. The app then uses Azure Cognitive Services, Microsoft's machine-learning API, to detect and display the subject's emotion. The user can choose among several features: simply viewing the detected emotions, playing a practice mode where they guess the subject's emotion before it is revealed, and taking notes associated with various emotions.
How we built it
The client is written in React. It captures camera frames and sends them to the REST API back end, which routes each request to the Cognitive Services API. The back end also accepts notes on individual emotions and stores them in a MongoDB database.
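The flow above can be sketched in plain Node, with no framework dependencies. The function and field names here (`faceDetectRequest`, `topEmotion`, `noteDocument`) are illustrative assumptions, not the project's actual identifiers; only the Face API URL shape, header, and response structure follow Microsoft's documented detect endpoint.

```javascript
// Sketch of the back-end pipeline, assuming the Azure Face API "detect"
// endpoint with emotion attributes enabled.

// Build the HTTPS request the back end forwards a camera frame with.
// `endpoint` and `key` would come from environment config in the real app.
function faceDetectRequest(endpoint, key, imageBuffer) {
  return {
    url: `${endpoint}/face/v1.0/detect?returnFaceAttributes=emotion`,
    method: "POST",
    headers: {
      "Ocp-Apim-Subscription-Key": key,
      "Content-Type": "application/octet-stream",
    },
    body: imageBuffer,
  };
}

// Pick the highest-scoring emotion from a Face API detect response
// (an array of detected faces, each with faceAttributes.emotion scores).
function topEmotion(faceApiResponse) {
  const scores = faceApiResponse[0].faceAttributes.emotion;
  return Object.keys(scores).reduce((a, b) => (scores[a] >= scores[b] ? a : b));
}

// Shape a note record the way it might be stored in MongoDB.
function noteDocument(emotion, text) {
  return { emotion, text, createdAt: new Date() };
}
```

In the app itself, Express handlers wrap these steps: one route receives the frame and proxies it to Azure, another persists the note documents.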
Challenges we ran into
Getting all the APIs and technologies to play nicely together. Building a web app in 24 hours. Debugging!
Accomplishments that we're proud of
Building a cool, multi-featured web app in a short period of time.
What we learned
The real hackathon was the friends we made along the way. Also teamwork, Azure, MongoDB, and React.
What's next for Emote
More features! Recognizing specific faces and recognizing voices, plus general polish.
Built With
- azure
- cognitiveapi
- express.js
- faceapi
- heroku
- javascript
- mongodb
- node.js
- react
