Moodify – Your Mood. Your Music.
Inspiration
The idea for Moodify came from the simple struggle of finding music that matches how we feel. We wanted to create an app that could understand emotions and play the perfect soundtrack — no endless scrolling, no guesswork.
What it does
Moodify lets users select or describe their mood, then recommends a playlist that aligns with that emotion. It bridges the gap between how we feel and what we hear, instantly curating music to match the vibe.
How we built it
- Frontend: React and Tailwind CSS for a sleek, responsive design
- Backend: Flask for routing and mood-matching logic
- APIs: Spotify Web API to fetch real-time playlists
- Mood Detection: Keyword-based sentiment analysis for text input
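The keyword-based mood detection above could be sketched roughly like this. Note this is a minimal illustration, not Moodify's actual code: the mood names, keyword sets, and the `detect_mood` helper are all assumptions made for the example.

```python
# Hypothetical keyword-based mood detection, as described above.
# Mood names and keyword lists are illustrative, not Moodify's real data.
MOOD_KEYWORDS = {
    "happy": {"happy", "joy", "excited", "great", "sunny"},
    "sad": {"sad", "down", "blue", "lonely", "tired"},
    "calm": {"calm", "relaxed", "chill", "peaceful"},
    "energetic": {"pumped", "energetic", "workout", "hype"},
}

def detect_mood(text: str, default: str = "calm") -> str:
    """Return the mood whose keyword set overlaps the input text the most."""
    words = set(text.lower().split())
    scores = {mood: len(words & kws) for mood, kws in MOOD_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    # Fall back to a default mood when no keyword matches at all.
    return best if scores[best] > 0 else default

print(detect_mood("feeling kind of sad and lonely today"))  # → sad
```

A real app would normalize punctuation and handle negation ("not happy"), which is exactly where the vague-input challenge below comes from.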
Challenges we ran into
- Mapping vague user inputs to specific moods accurately
- Handling Spotify’s authentication and API limits
- Designing a UI that’s both minimal and expressive
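For the Spotify authentication challenge above, the typical server-side approach is the client-credentials flow: exchange the app's ID and secret for a bearer token, then query the search endpoint for mood playlists. The endpoint URLs below come from Spotify's public documentation; the helper names and structure are my own illustrative assumptions, using only the standard library.

```python
# Sketch of Spotify's client-credentials flow (assumed approach, stdlib only).
import base64
import urllib.parse
import urllib.request

TOKEN_URL = "https://accounts.spotify.com/api/token"
SEARCH_URL = "https://api.spotify.com/v1/search"

def build_token_request(client_id: str, client_secret: str) -> urllib.request.Request:
    """Build the POST request that exchanges app credentials for a bearer token."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return urllib.request.Request(
        TOKEN_URL,
        data=urllib.parse.urlencode({"grant_type": "client_credentials"}).encode(),
        headers={"Authorization": f"Basic {creds}"},
        method="POST",
    )

def build_playlist_search_url(mood: str, limit: int = 5) -> str:
    """Build a playlist-search URL for a mood keyword (e.g. 'happy')."""
    query = urllib.parse.urlencode({"q": mood, "type": "playlist", "limit": limit})
    return f"{SEARCH_URL}?{query}"

# Actually sending these requests needs real credentials, so only the
# constructed search URL is shown here:
print(build_playlist_search_url("happy"))
# → https://api.spotify.com/v1/search?q=happy&type=playlist&limit=5
```

Caching the token until it expires is the usual way to stay within Spotify's rate limits.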
Accomplishments that we're proud of
- Built a fully functional, mood-based music recommendation app
- Created a smooth and engaging user experience
- Integrated real-time music playlists successfully
What we learned
- How to connect frontend and backend seamlessly
- The importance of user feedback in refining mood detection
- Better API handling, UI design, and state management
What's next for Moodify
- Implement voice-based mood input
- Add machine learning to improve mood classification
- Launch mobile support and user profile features
- Expand music sources beyond Spotify
Try Moodify – Feel it. Hear it.