Inspiration

Music has always been a powerful way to influence and reflect our emotions. I wanted to combine artificial intelligence and music to create an app that understands a person’s mood and instantly plays songs that fit their emotions. The idea came from the simple thought: "What if your phone could sense your mood and be your personal DJ?"

What it does

The Mood-Based Music Recommendation App uses your phone’s camera to capture your facial expression, runs the image through DeepFace, an AI-powered facial emotion recognition model, and identifies your mood — such as happy, sad, relaxed, neutral, or energetic. Once your mood is detected, the app instantly recommends 20 curated YouTube songs tailored to your emotion, each with a clickable link for immediate playback. The entire process happens in real time, giving you a personalized music experience in just a few seconds.

How I built it

  1. Frontend Development — Built the Android application using Kotlin in Android Studio, with XML layouts for UI design.

  2. Mood Detection — Integrated the DeepFace Python library for facial expression recognition. DeepFace uses pre-trained deep learning models to classify emotions (e.g., happy, sad, angry, surprise, neutral), which the app maps onto its five mood categories: Happy, Sad, Relaxed, Neutral, and Energetic. Mathematically, DeepFace performs f : I → E, where I is the captured image and E is the predicted emotion label.

  3. Computer Vision — Used OpenCV for real-time face detection and image preprocessing before sending frames to DeepFace.

  4. Backend Integration — Built a Flask REST API in Python to run the DeepFace model and send mood predictions to the Android app.

  5. Music Recommendation — Based on the predicted mood, the app fetches 20 curated YouTube song links, each clickable for immediate playback.
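
Step 2 above can be sketched in Python. DeepFace’s raw emotion labels (angry, disgust, fear, happy, sad, surprise, neutral) do not match the app’s five moods one-to-one, so a small mapping layer is needed; the particular grouping below is an assumption for illustration, and the `detect_mood` helper assumes the `deepface` package is installed.

```python
# Sketch: translate DeepFace's raw emotion labels into the app's five moods.
# The label names are DeepFace's standard outputs; the grouping is illustrative.

# DeepFace emits: angry, disgust, fear, happy, sad, surprise, neutral
RAW_TO_APP_MOOD = {
    "happy": "Happy",
    "sad": "Sad",
    "fear": "Sad",         # assumption: group fearful with low moods
    "disgust": "Sad",
    "neutral": "Neutral",
    "surprise": "Energetic",
    "angry": "Energetic",  # assumption: high-arousal maps to Energetic
}

def to_app_mood(raw_label: str) -> str:
    """Map a raw DeepFace label onto one of the app's mood categories."""
    return RAW_TO_APP_MOOD.get(raw_label.lower(), "Neutral")

def detect_mood(image_path: str) -> str:
    """Run DeepFace on a captured frame (requires the deepface package)."""
    from deepface import DeepFace  # lazy import; heavyweight dependency
    result = DeepFace.analyze(img_path=image_path, actions=["emotion"])
    # In recent versions, analyze() returns a list with one entry per face.
    return to_app_mood(result[0]["dominant_emotion"])
```

The fallback to "Neutral" keeps the app usable even if the model returns an unexpected label.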
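
For step 3, a hedged sketch of the OpenCV preprocessing: expand the detected face box by a margin before cropping, so DeepFace receives some context around the face. The 20% margin and the Haar-cascade settings are illustrative defaults, not the app’s tuned values.

```python
# Sketch: grow a detected face box by a margin and clamp it to the frame
# before cropping. The 20% margin is an assumption, not a tuned value.

def expand_and_clamp(box, frame_w, frame_h, margin=0.2):
    """box is (x, y, w, h), as returned by OpenCV's detectMultiScale."""
    x, y, w, h = box
    dx, dy = int(w * margin), int(h * margin)
    x0, y0 = max(0, x - dx), max(0, y - dy)
    x1, y1 = min(frame_w, x + w + dx), min(frame_h, y + h + dy)
    return x0, y0, x1 - x0, y1 - y0

def detect_faces(frame):
    """Haar-cascade face detection (requires opencv-python)."""
    import cv2  # lazy import so the crop math above runs without OpenCV
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```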
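
Step 4 could look roughly like the sketch below. The `/mood` route, the base64 payload shape, and the `classify` stub are assumptions for illustration; the handler logic is kept as a plain function so it can be exercised without a running server.

```python
# Sketch: a minimal Flask endpoint that accepts a base64-encoded frame,
# runs the emotion model, and returns the mood as JSON.
import base64

def classify(image_bytes: bytes) -> str:
    """Placeholder for the DeepFace call on the decoded frame."""
    return "Happy"  # hypothetical; the real backend returns the detected mood

def handle_payload(payload: dict, classifier=classify) -> dict:
    """Pure request handler, testable without Flask."""
    image_bytes = base64.b64decode(payload["image"])
    return {"mood": classifier(image_bytes)}

def create_app():
    """Wire the handler into Flask (requires the flask package)."""
    from flask import Flask, jsonify, request
    app = Flask(__name__)

    @app.post("/mood")
    def mood():
        return jsonify(handle_payload(request.get_json()))

    return app
```

On the Kotlin side, the app would POST the captured frame to this endpoint and read the `mood` field from the JSON response.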
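
Step 5 amounts to a mood-to-playlist lookup. In the real app the mapping holds 20 hand-picked video URLs per mood; the placeholder search links and query strings below are hypothetical and only illustrate the shape of the data.

```python
# Sketch: mood -> playlist lookup that always yields 20 clickable links.
# The queries are placeholders, not the app's actual curated playlist data.

SONGS_PER_MOOD = 20

PLAYLIST_QUERIES = {
    "Happy": "upbeat feel good songs",
    "Sad": "soothing sad songs",
    "Relaxed": "chill acoustic playlist",
    "Neutral": "easy listening hits",
    "Energetic": "high energy workout songs",
}

def recommend(mood: str, n: int = SONGS_PER_MOOD) -> list:
    """Return n clickable YouTube links for the detected mood."""
    query = PLAYLIST_QUERIES.get(mood, PLAYLIST_QUERIES["Neutral"])
    url = "https://www.youtube.com/results?search_query=" + query.replace(" ", "+")
    # The real app indexes into a fixed list of 20 hand-picked video URLs;
    # this sketch repeats one search link to keep the shape (20 items).
    return [url] * n
```

Unknown moods fall back to the Neutral playlist, so the recommendation step can never come back empty.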

Challenges I ran into

  - Model Accuracy — Ensuring the DeepFace model could reliably detect facial expressions under varying lighting conditions, camera angles, and facial positions.
  - Real-Time Processing — Ensuring mood detection happens quickly without slowing the app.
  - Cross-Platform Communication — Setting up and debugging the Python–Kotlin data exchange.
  - YouTube Integration — Managing curated song lists and making them easily clickable.

Accomplishments that I'm proud of

  - Successfully integrated DeepFace AI with an Android app to detect moods in real time.
  - Built a complete end-to-end system combining mobile development, computer vision, and backend APIs.
  - Achieved a smooth app flow from splash screen → login → mood detection → personalized song recommendations.
  - Created a playlist mapping system that recommends exactly 20 curated YouTube songs for each detected mood.
  - Overcame technical challenges with lighting, face-detection accuracy, and API integration.

What I learned

  - Integrating AI with Mobile Apps: How to connect Python-based AI models with Android frontends.
  - Computer Vision Basics: Using OpenCV for real-time face detection.
  - REST APIs: Designing and consuming APIs for cross-platform communication.
  - User Experience Design: Making a smooth app flow from splash screen to music recommendations.

What's next for VibeZone

My next goal is to evolve VibeZone into a multi-agent system that not only detects mood but also suggests different types of content beyond music — such as movies, podcasts, or articles — tailored to the user’s emotional state. By introducing multiple specialized AI agents, the system could:

  - Collaborate to improve mood detection accuracy.
  - Recommend a variety of personalized content formats.
  - Continuously learn from user feedback to enhance recommendations.

This will turn VibeZone into a versatile AI companion that understands your mood and keeps you engaged with exactly the content you need.

Built With

  - Kotlin (Android Studio, XML layouts)
  - Python
  - DeepFace
  - OpenCV
  - Flask
  - YouTube
