Inspiration
Music and visual art touch people every day. When we listen to music, we feel emotion; when we look at visual art, we feel emotion. So is there a way to match music with visual art based on the emotions they share?
What it does
Synesthesia
- Extracts emotion features from music clips;
- Extracts emotion features from visual artworks;
- Matches them! When you play a music clip, it displays visual artworks that share similar emotions.
How we built it
- Music to Emotion: we train machine learning models on a music dataset to produce emotion vectors.
- Visual Art to Emotion: we derive emotion vectors from visual art analysis.
- Match: when you play a music clip, we analyze it and find visual artworks that share similar emotions.

Emotion analysis is based on Russell's circumplex model of affect.
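The matching step can be sketched as a nearest-neighbor search in Russell's valence-arousal space. This is a minimal illustration, assuming each clip and artwork has already been mapped to a (valence, arousal) vector; the artwork names and vector values below are made up:

```python
import math

# Hypothetical emotion vectors under Russell's circumplex model:
# each entry is (valence, arousal), both in [-1, 1].
artworks = {
    "starry_night": (-0.2, 0.6),
    "water_lilies": (0.7, -0.3),
    "the_scream": (-0.8, 0.8),
}

def match_artworks(music_vec, art_vecs, k=2):
    """Return the k artworks whose emotion vectors lie closest
    (by Euclidean distance) to the music clip's vector."""
    ranked = sorted(
        art_vecs.items(),
        key=lambda item: math.dist(music_vec, item[1]),
    )
    return [name for name, _ in ranked[:k]]

# A tense, high-arousal clip should match tense, high-arousal art.
print(match_artworks((-0.6, 0.7), artworks))
# → ['the_scream', 'starry_night']
```

In practice the emotion vectors would come from the trained models rather than a hand-written table, and a larger collection would use a spatial index instead of sorting every artwork per query.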
Challenges we ran into
- Music to Emotion: hard to find a good dataset; hard to train and test models.
- Match: hard to match the vectors.
Accomplishments that we're proud of
It partially works now!
What we learned
The importance of clear, clean code; improved frontend skills; basic machine learning skills.
What's next for A Guide to Synaesthesia
Complete the whole project, with better ML models and better matching.