Inspiration

The sad and busy life of graduate students.

What it does

MEGAN plays music matched to the user's detected emotion and lets them control playback with hand gestures.

How we built it

We used Python ML libraries (FER, Facial Recognition, and MediaPipe) and built it as a web app with a Flask backend.
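The flow described above can be sketched as a simple dispatch layer: a detected emotion selects a playlist, and a recognized gesture maps to a playback command. This is an illustrative sketch only; the real project uses FER for emotion detection and MediaPipe for gestures, and the playlist names, gesture labels, and function names below are assumptions, not the actual codebase.

```python
# Hypothetical sketch of MEGAN's dispatch logic. The detection stages
# (FER, MediaPipe) are stubbed out; only the mapping is shown.

# Emotion labels follow FER's output categories; filenames are placeholders.
EMOTION_PLAYLISTS = {
    "happy": ["upbeat_1.mp3", "upbeat_2.mp3"],
    "sad": ["mellow_1.mp3"],
    "angry": ["calm_1.mp3"],
    "neutral": ["focus_1.mp3"],
}

# Gesture names are illustrative; the real set is derived from MediaPipe landmarks.
GESTURE_ACTIONS = {
    "open_palm": "pause",
    "fist": "play",
    "swipe_right": "next",
    "swipe_left": "previous",
}

def pick_playlist(emotion):
    """Return the playlist for a detected emotion, defaulting to neutral."""
    return EMOTION_PLAYLISTS.get(emotion, EMOTION_PLAYLISTS["neutral"])

def gesture_to_action(gesture):
    """Translate a recognized gesture into a playback command, or None."""
    return GESTURE_ACTIONS.get(gesture)
```

In a Flask app, each of these would sit behind an endpoint: one route receives a webcam frame and returns a playlist, another receives gesture events and drives the player.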

Challenges we ran into

Lack of sleep, integration issues, and designing the hand gestures.

Accomplishments that we're proud of

This was our first hackathon, and we worked and collaborated for 24 hours straight.

What we learned

Implementing new frameworks and using new libraries is fun.

What's next for MEGAN - Music by Emotion, Gestures Augmented Navigation

  • Complete the integration of the two components.
  • Move from detecting mood in a single image to detecting it from video.
  • Redirect users to songs hosted on a public service such as YouTube or Spotify rather than storing them on the user's system.
  • Let users give feedback on song selections and incorporate it into future recommendations.
  • Expand the range of recognized hand gestures.
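One way to expand gesture coverage is to classify poses by counting extended fingers from MediaPipe-style hand landmarks (21 normalized (x, y) points, with y increasing downward). The landmark indices below follow MediaPipe's hand model, but the threshold rule and gesture names are illustrative assumptions, not the project's implementation.

```python
# Sketch: classify a hand pose by counting extended fingers from
# MediaPipe-style landmarks. Indices 8/12/16/20 are fingertip landmarks;
# 6/10/14/18 are the corresponding PIP joints in MediaPipe's hand model.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky tips
FINGER_PIPS = [6, 10, 14, 18]   # matching PIP joints

def count_extended_fingers(landmarks):
    """A finger counts as extended when its tip sits above its PIP joint
    (smaller y in image coordinates)."""
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def classify_gesture(landmarks):
    """Map a finger count to a coarse gesture label (names are placeholders)."""
    count = count_extended_fingers(landmarks)
    return {0: "fist", 4: "open_palm"}.get(count, f"{count}_fingers")
```

New gestures can then be added by extending the count-to-label mapping, or by checking additional landmark relationships (thumb position, motion between frames) without retraining any model.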

Built With

Python, Flask, FER, Facial Recognition, MediaPipe
