Inspiration

Have you ever had trouble telling what another person was feeling while talking to them? Have you ever offended someone because you didn't know how to read the situation?

We wanted to build an app to help solve this problem.

What it does

Our app, Emotion Detector – A Handy EQ Helper, can help you in times of need! Just pull out your phone and take a photo of the person, and our app takes it from there. Emotion Detector uses Google Cloud's Vision API to detect the emotions of the person in the photo, then uses that information to suggest actions you could take in the situation. Emotion Detector also has a gallery feature: if you choose to save your photos, you can look back through all of them and remind yourself of the good times.
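
To illustrate the suggestion step, here is a minimal TypeScript sketch of how per-emotion likelihoods from the Vision API's face annotations could be turned into a suggested action. The likelihood values and field names (joyLikelihood, sorrowLikelihood, etc.) come from the Vision API; the pickSuggestion helper and the suggestion wording are hypothetical, not the app's actual logic.

```typescript
// Likelihood values as returned in the Vision API's face annotations.
type Likelihood =
  | "VERY_UNLIKELY" | "UNLIKELY" | "POSSIBLE" | "LIKELY" | "VERY_LIKELY" | "UNKNOWN";

interface FaceAnnotation {
  joyLikelihood: Likelihood;
  sorrowLikelihood: Likelihood;
  angerLikelihood: Likelihood;
  surpriseLikelihood: Likelihood;
}

// Hypothetical suggestion text keyed by emotion; the real app's advice may differ.
const SUGGESTIONS: Record<string, string> = {
  joy: "They seem happy! A good moment to share good news.",
  sorrow: "They may be feeling down. Consider asking if they're okay.",
  anger: "They may be upset. Give them space and keep your tone calm.",
  surprise: "They look surprised. Slow down and explain what's going on.",
};

// Pick the emotion the API considers most likely and return a suggestion.
function pickSuggestion(face: FaceAnnotation): string {
  const rank: Record<Likelihood, number> = {
    UNKNOWN: 0, VERY_UNLIKELY: 0, UNLIKELY: 1, POSSIBLE: 2, LIKELY: 3, VERY_LIKELY: 4,
  };
  const scores: [string, number][] = [
    ["joy", rank[face.joyLikelihood]],
    ["sorrow", rank[face.sorrowLikelihood]],
    ["anger", rank[face.angerLikelihood]],
    ["surprise", rank[face.surpriseLikelihood]],
  ];
  scores.sort((a, b) => b[1] - a[1]);
  const [topEmotion, topScore] = scores[0];
  return topScore >= rank.POSSIBLE
    ? SUGGESTIONS[topEmotion]
    : "No strong emotion detected; keep the conversation neutral.";
}
```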

How we built it

We built our app with a React Native front end. For back-end storage we use Cloud Firestore and Cloud Storage from Firebase. To analyze images, we call Google Cloud's Vision API, which returns per-face emotion likelihoods.
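
For reference, below is a minimal sketch of the kind of face-detection request the app sends. The endpoint, the FACE_DETECTION feature type, and the faceAnnotations response fields are from the public Vision REST API; the API-key placeholder and the base64Image parameter are assumptions for illustration.

```typescript
// Minimal sketch of a FACE_DETECTION request against the Vision REST API.
// GCP_API_KEY is a placeholder; in practice the key should not ship in the app bundle.
const GCP_API_KEY = "YOUR_API_KEY";

async function detectEmotions(base64Image: string) {
  const response = await fetch(
    `https://vision.googleapis.com/v1/images:annotate?key=${GCP_API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        requests: [
          {
            image: { content: base64Image },
            features: [{ type: "FACE_DETECTION", maxResults: 1 }],
          },
        ],
      }),
    }
  );
  const result = await response.json();
  // Each face annotation carries per-emotion likelihoods such as
  // joyLikelihood and angerLikelihood ("VERY_UNLIKELY" ... "VERY_LIKELY").
  return result.responses?.[0]?.faceAnnotations?.[0] ?? null;
}
```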

Challenges we ran into

We ran into many debugging issues with asynchronous functions in React Native, and we had to figure out how to use the Google Cloud Vision API alongside Firebase's APIs. We also hit issues with cross-platform support and package management.
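
A representative example of the async bugs we chased, in simplified form (reusing the detectEmotions sketch above; the function names are illustrative, not our actual code):

```typescript
// Buggy version: detectEmotions returns a Promise, so `face` here is a
// pending Promise, and reading face.joyLikelihood yields undefined.
// const face = detectEmotions(base64Image);

// Fixed version: await the result inside an async function before using it.
async function analyzePhoto(base64Image: string) {
  const face = await detectEmotions(base64Image);
  if (face) {
    console.log("Joy likelihood:", face.joyLikelihood);
  }
}
```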

Accomplishments that we're proud of

We're proud that we were able to put a machine learning solution inside a mobile application in just one weekend! We're also proud that we learned how to use new technologies like React Native, Firebase, and the Google Cloud Vision API.

What we learned

We learned how to use React Native, Firebase, and the Google Cloud Vision API.

What's next for Emotion Detector – A Handy EQ Helper

  • Add the ability to run emotion detection on previously saved images.
  • Enhance the gallery with more tools such as sharing, editing, and cropping.

Built With

  • React Native
  • Firebase (Cloud Firestore, Cloud Storage)
  • Google Cloud Vision API
