Inspiration
We wanted to create a fun project that lets users interact with a facial emotion recognition model. We combined our team's interests in ML, hardware, and origami to bring the project to life.
What it does
Mood Flowers detects your mood from your face using image recognition. The flowers then light up with a color corresponding to that mood:
- neutral = white
- happy = yellow
- sad = blue
- angry = red
- disgust = green
- fear = purple
- surprise = pink
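The mapping above can be expressed as a small lookup table. A minimal sketch in Python (the exact RGB values are illustrative assumptions, not necessarily the ones we used):

```python
# Map each detected emotion to an RGB triple for the flower LEDs.
# These RGB values are illustrative assumptions matching the color names.
MOOD_COLORS = {
    "neutral": (255, 255, 255),   # white
    "happy": (255, 255, 0),       # yellow
    "sad": (0, 0, 255),           # blue
    "angry": (255, 0, 0),         # red
    "disgust": (0, 255, 0),       # green
    "fear": (128, 0, 128),        # purple
    "surprise": (255, 105, 180),  # pink
}

def mood_to_rgb(mood):
    """Return the LED color for a detected mood, defaulting to neutral/white."""
    return MOOD_COLORS.get(mood, MOOD_COLORS["neutral"])
```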
How we built it
We built it using Python (for the facial emotion recognition) and C++ (for the Arduino). Using Facebook's DeepFace pretrained facial recognition model and OpenCV's Haar cascade frontal-face classifier, we detect the emotion of the primary face in the frame. The Arduino is connected to an RGB LED and uses pulse width modulation to change the color of the light. The Arduino reads a mood from Serial input (COM12) and updates the lights when the mood changes.
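Since the Arduino reads the mood as text over Serial, the Python side only needs to frame the label and write it to the port. A hedged sketch of that framing (the `detect_mood` wiring and port setup are assumptions, shown as comments so the logic stays self-contained):

```python
def encode_mood(mood):
    """Frame a mood label as a newline-terminated ASCII message for Serial.

    The Arduino side can then read it with Serial.readStringUntil('\n').
    """
    return mood.strip().lower().encode("ascii") + b"\n"

# Hypothetical main loop tying DeepFace and pyserial together
# (left as comments so this sketch runs without hardware attached):
#
#   import serial                      # pyserial
#   from deepface import DeepFace
#   port = serial.Serial("COM12", 9600)
#   result = DeepFace.analyze(frame, actions=["emotion"])
#   mood = result[0]["dominant_emotion"]
#   port.write(encode_mood(mood))
```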
Challenges we ran into
- Running the Arduino connection script and the image recognition script at the same time
  - We addressed this by combining them into a single script
  - Further work would be to revive the webcam component, which currently causes interference with the Serial port needed for the Arduino connection.
- Data transfer between the Python script and the Arduino sketch
Accomplishments that we're proud of
- Getting the facial emotion detection model up and running
What we learned
We learned how to communicate between Python and Arduino.
What's next for Mood Flowers
- Fixing bugs
- Creating this on a larger scale to potentially be displayed in RISC for visitors to interact with.