Inspiration

I built a simple face tracker that could tell whether the user was happy by using MediaPipe to place landmark dots on the individual's face. My idea for a more thorough version is to use multiple modalities, such as speech: not just speech-to-text, but the acoustic qualities of the sound as well.
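The "dots on the face" approach can be sketched as a simple geometric heuristic: MediaPipe's Face Mesh yields normalized (x, y) landmark coordinates, and a wide mouth relative to its vertical opening is a crude smile cue. The function below is a minimal sketch under that assumption; the landmark values and the `threshold` are hypothetical, not real MediaPipe output.

```python
def is_smiling(left_corner, right_corner, top_lip, bottom_lip, threshold=1.8):
    """Crude smile heuristic: mouth width divided by mouth opening.
    Each argument is an (x, y) tuple in normalized image coordinates,
    the format MediaPipe Face Mesh landmarks use. The threshold is an
    illustrative guess, not a tuned value."""
    width = abs(right_corner[0] - left_corner[0])
    opening = abs(bottom_lip[1] - top_lip[1]) or 1e-6  # avoid divide-by-zero
    return width / opening > threshold

# Hypothetical landmark positions (not captured from a real face):
print(is_smiling((0.40, 0.62), (0.60, 0.62), (0.50, 0.60), (0.50, 0.64)))
# wide, nearly closed mouth -> True
```

A real pipeline would pull these points from specific Face Mesh landmark indices each frame and smooth the result over time.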

What it does

This tool is meant to help practitioners better identify the emotions of the people they are helping. At times, just staying present in the conversation can be difficult, and it may be hard to recognize emotions. This tool can be used to track emotions over the time series of the interaction or meeting.
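Tracking emotions over the time series of a session can be sketched as bucketing timestamped emotion labels into fixed windows and reporting the dominant label per window. This is a minimal illustration, not the project's actual implementation; the session data and window size are made up.

```python
from collections import Counter

def dominant_emotions(samples, window=5.0):
    """Bucket (timestamp, label) pairs into fixed-size windows and
    return the most common label in each window, giving a coarse
    timeline of how the mood of a session evolves."""
    buckets = {}
    for t, label in samples:
        buckets.setdefault(int(t // window), []).append(label)
    return {w * window: Counter(labels).most_common(1)[0][0]
            for w, labels in sorted(buckets.items())}

# Hypothetical per-frame predictions (seconds, label):
session = [(0.5, "neutral"), (2.0, "neutral"), (6.0, "happy"),
           (8.5, "happy"), (11.0, "sad")]
print(dominant_emotions(session))
# {0.0: 'neutral', 5.0: 'happy', 10.0: 'sad'}
```

Plotting the resulting windows against time would give the practitioner the kind of emotion-over-time view described above.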

How we built it

Challenges we ran into

Accomplishments that we're proud of

What we learned

What's next for ERS (Emotion Recognition System)

Built With
