What Inspired Me

I've always wanted to make games that understand how players feel. While playing, I noticed that sometimes I would get frustrated because a game was too hard, or bored because it was too easy. I thought, "What if the game could see my face and adjust itself?"

That's how I got the idea for the Flappy Bird Emotion Game. I wanted to create a game that uses artificial intelligence to read emotions from your face and change the difficulty accordingly.

What I Learned

Building this project taught me so many things:

Deep Learning: I used a Convolutional Neural Network (CNN) architecture. The model takes 48x48 pixel grayscale images as input and outputs probabilities for each emotion. I learned about layers, activation functions, and how to prevent overfitting.

Docker: I learned to containerize the application so it can run anywhere. This was important for deployment.

Real-time Processing: I figured out how to capture video from the webcam, process frames in real-time, and send them to the ML model without making the game lag.
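One simple way to keep the game responsive is to gate emotion checks behind a minimum interval, so a webcam frame is only sent to the model every few seconds. A minimal sketch of that idea (not the project's actual code; the class name and interval are illustrative):

```python
import time

class EmotionCheckThrottle:
    """Allow an emotion check at most once every `interval` seconds."""

    def __init__(self, interval=3.0):
        self.interval = interval
        self._last = float("-inf")  # so the very first check is allowed

    def due(self, now=None):
        """Return True (and reset the timer) if enough time has passed."""
        now = time.monotonic() if now is None else now
        if now - self._last >= self.interval:
            self._last = now
            return True
        return False
```

The game loop calls `due()` every frame; only when it returns `True` is a frame captured and sent to the ML service, so inference cost never lands on consecutive frames.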

How I Built It

The project has three main components:

1. Custom ML Model

I trained a CNN model using TensorFlow/Keras. The model architecture:

  • Input: 48x48x1 grayscale images
  • Convolutional layers with ReLU activation
  • MaxPooling layers
  • Dropout for regularization
  • Dense layers with softmax output
  • 7 output classes (emotions)

The model was trained on the FER2013 dataset and achieved around 85% accuracy on the test set.
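A minimal Keras sketch of this kind of architecture (the layer counts, filter sizes, and dropout rates here are illustrative assumptions, not the exact trained configuration):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_emotion_cnn(num_classes=7):
    """Small CNN for 48x48 grayscale emotion classification."""
    model = models.Sequential([
        layers.Input(shape=(48, 48, 1)),                  # 48x48x1 grayscale input
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),                            # 48 -> 24
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),                            # 24 -> 12
        layers.Dropout(0.25),                             # regularization
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),  # per-emotion probabilities
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

The softmax output gives one probability per emotion class, which is what the game later argmaxes over.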

2. Game Engine

I built a Flappy Bird clone using HTML5 Canvas and JavaScript. The game includes:

  • Bird physics with gravity and flapping
  • Pipe generation and movement
  • Collision detection
  • Score tracking
  • Adaptive difficulty system
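The game itself is JavaScript, but the core physics step is small enough to sketch language-neutrally; here it is in Python, with illustrative constants (not the game's real values):

```python
class Bird:
    """Minimal Flappy-style vertical physics: gravity pulls down, a flap pushes up."""

    def __init__(self, y=200.0, gravity=0.5, flap_power=8.0):
        self.y = y                    # vertical position (screen coords: down = positive)
        self.velocity = 0.0
        self.gravity = gravity
        self.flap_power = flap_power

    def flap(self):
        # An instantaneous upward impulse (negative = up in screen coordinates).
        self.velocity = -self.flap_power

    def update(self):
        # Called once per frame: accelerate under gravity, then move.
        self.velocity += self.gravity
        self.y += self.velocity
```

Collision detection then reduces to checking whether the bird's bounding box overlaps a pipe's rectangles or the ground.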

3. Emotion Integration

The game captures frames from the webcam every few seconds, sends them to the ML service, and adjusts difficulty based on the detected emotion:

  • Happy: Slightly harder (player is engaged)
  • Sad: Easier (supportive gameplay)
  • Angry: Much easier (reduce frustration)
  • Fear: Easier (reduce stress)
  • Neutral: Normal difficulty
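The mapping above can be expressed as a lookup table of difficulty multipliers. The exact numbers below are illustrative assumptions (1.0 is normal, lower is easier), and unmapped labels such as FER2013's "disgust" and "surprise" fall back to normal:

```python
# Illustrative difficulty multipliers; 1.0 = normal, < 1.0 = easier, > 1.0 = harder.
DIFFICULTY_BY_EMOTION = {
    "happy": 1.1,    # player is engaged, push slightly harder
    "sad": 0.9,      # supportive gameplay
    "angry": 0.7,    # back off hard to reduce frustration
    "fear": 0.8,     # reduce stress
    "neutral": 1.0,  # baseline
}

def difficulty_for(emotion):
    """Fall back to normal difficulty for unrecognized labels."""
    return DIFFICULTY_BY_EMOTION.get(emotion.lower(), 1.0)
```

Keeping the fallback at 1.0 means a misdetected or unmapped emotion never makes the game unplayable.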

Challenges I Faced

Model Training: The biggest challenge was getting the custom model to work properly. At first, the model was overfitting and giving poor predictions. I had to experiment with different architectures, add dropout layers, and use data augmentation.

Real-time Performance: Making the emotion detection work in real-time without affecting game performance was tricky. I had to optimize the image processing and reduce the frequency of emotion checks.

Browser Compatibility: Getting the webcam to work across different browsers was challenging. Some browsers have different APIs for accessing the camera.

Deployment: Deploying a multi-service application with ML models was complex. I had to learn Docker and figure out how to package everything together.
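A multi-service app like this is typically wired together with Docker Compose. A hypothetical sketch of the layout (service names, paths, and ports are my assumptions, not the project's actual configuration):

```yaml
# Hypothetical layout: a static game frontend plus a Python ML service.
services:
  game:
    build: ./game          # HTML5 Canvas frontend
    ports:
      - "8080:80"
    depends_on:
      - emotion-api
  emotion-api:
    build: ./ml-service    # serves the trained Keras model over HTTP
    ports:
      - "5000:5000"
```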

Technical Details

The emotion detection works like this:

  1. Webcam captures video frames
  2. Frames are converted to grayscale and resized to 48x48
  3. Images are normalized (0-1 range)
  4. Custom CNN model predicts emotion probabilities
  5. Game adjusts difficulty based on the highest probability emotion
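Steps 3–5 reduce to scaling pixel values into the 0–1 range and taking the argmax of the model's probability vector. A dependency-free sketch (using FER2013's standard label order):

```python
def normalize(pixels):
    """Scale 8-bit grayscale values (0-255) into the 0-1 range the model expects."""
    return [p / 255.0 for p in pixels]

# FER2013 class order: 0=angry, 1=disgust, 2=fear, 3=happy, 4=sad, 5=surprise, 6=neutral
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def pick_emotion(probabilities):
    """Return the label with the highest predicted probability."""
    best = max(range(len(probabilities)), key=probabilities.__getitem__)
    return EMOTIONS[best]
```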

The difficulty adjustment affects:

  • Gravity strength
  • Flap power
  • Pipe gap size
  • Pipe movement speed
  • Pipe spacing
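All five parameters can be driven from a single difficulty factor. A sketch with illustrative baseline values (not the game's real constants):

```python
def game_params(difficulty):
    """Derive the core game parameters from one difficulty factor (1.0 = normal).

    Higher difficulty: stronger gravity, faster pipes, narrower gaps,
    tighter spacing. Baseline numbers are illustrative.
    """
    return {
        "gravity": 0.5 * difficulty,
        "flap_power": 8.0,                # kept constant so controls feel stable
        "pipe_gap": 150 / difficulty,     # harder = narrower gap
        "pipe_speed": 2.0 * difficulty,   # harder = faster pipes
        "pipe_spacing": 300 / difficulty, # harder = pipes closer together
    }
```

Deriving everything from one scalar keeps the emotion-to-difficulty mapping simple: the detected emotion picks the factor, and the factor tunes the whole game at once.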

Flappy Flap is live:

https://flappyface-prod-production.up.railway.app/
