Inspiration

  • We were inspired by the recent surge of AI-based filters and apps. We were also driven by a passion for helping those who may struggle with identifying emotions, such as neurodivergent individuals.

What it does

  • The app connects to the user's camera, captures an image, and then uses the AI model to tell the user which emotion it believes was being expressed in the image.
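
The classification step can be sketched as follows. This is a minimal illustration, not the app's actual code: the seven FER-style label names and the `label_from_probs` helper are assumptions, and in the app the probability vector would come from the trained CNN's prediction on the captured image.

```python
# Illustrative emotion labels (a common seven-class set; the app's own
# label set may differ).
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def label_from_probs(probs, labels=EMOTIONS):
    """Return the emotion label with the highest predicted probability."""
    best = max(range(len(probs)), key=lambda i: probs[i])
    return labels[best]

# In the app, `probs` would come from the CNN's output on the camera image.
print(label_from_probs([0.05, 0.01, 0.04, 0.70, 0.10, 0.05, 0.05]))
```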

How we built it

  • We built this by saving the trained convolutional neural network as a model file that the app can load
  • The app uses the Kivy framework to create its user interface
  • We divided the work into frontend, backend, and neural network creation
  • Matplotlib was important in creating the mood log pie chart
  • Access to the device's camera and storage was important for saving the images shown on the gallery screen
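
The mood log pie chart step above can be sketched like this. It is a hedged illustration rather than the app's implementation: the list-of-label-strings log format, the `mood_counts` helper, and the example entries are all assumptions, and the Agg backend is used only so the chart renders without a display.

```python
from collections import Counter

import matplotlib
matplotlib.use("Agg")  # headless backend; no display needed
import matplotlib.pyplot as plt

def mood_counts(log):
    """Count how often each emotion appears in the mood log."""
    return Counter(log)

log = ["happy", "sad", "happy", "neutral", "happy"]  # hypothetical entries
counts = mood_counts(log)

# Render the counts as a pie chart, as on the app's mood log screen.
plt.pie(list(counts.values()), labels=list(counts.keys()), autopct="%1.0f%%")
plt.savefig("mood_log.png")
```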

Challenges we ran into

  • We ran into a lot of difficulty with packaging the app
  • We suspect that because we split up the work, it was difficult to merge everything into one cohesive project.

Accomplishments that we're proud of

  • The way the app looks and functions! It has a very simple, easy-to-use design for a very complicated process
  • We are proud of how much we learned about neural networks to create an AI with around 80% accuracy
  • We are proud of how we worked together so well!

What we learned

  • We learned a lot about the different ways Python can be used!
  • How important effective team communication is
  • How apps are made at a base level
  • How AI is created and trained

What's next for MoodSense

  • Getting it packaged as an app!
  • Allowing users to create profiles with Firebase authentication
  • Connecting to video and image libraries to identify emotions in those as well

Built With

  • Python
  • Kivy
  • Matplotlib
