Research suggests that a large share of human communication is nonverbal. People on the autism spectrum, including those with Asperger syndrome, can have difficulty producing and interpreting facial expressions. We set out to help them train these skills in a gamified way.

What it does

Emojoy lets players practice facial expressions through games of increasing difficulty, starting with simple transitions between a neutral and an emotional expression and working up to a game of Pong against a human opponent online, controlled entirely by facial expressions.
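The entry-level mechanic, rewarding a clean transition from a neutral face to a target expression, could be sketched as a tiny state machine. The names and types below (`TransitionTracker`, the `Expression` union) are illustrative assumptions, not Emojoy's actual code:

```typescript
// Sketch: detect a transition from "neutral" to a target expression.
// Names and logic are illustrative, not the actual Emojoy implementation.
type Expression = "neutral" | "happy" | "sad" | "angry" | "surprised";

class TransitionTracker {
  private sawNeutral = false;

  constructor(private readonly target: Expression) {}

  // Feed one detected expression per video frame; returns true once the
  // player has gone from a neutral face to the target expression.
  update(detected: Expression): boolean {
    if (detected === "neutral") {
      this.sawNeutral = true;
      return false;
    }
    if (this.sawNeutral && detected === this.target) {
      this.sawNeutral = false; // reset for the next round
      return true;
    }
    return false;
  }
}
```

Requiring a neutral face first means the player actually performs the transition rather than simply holding the target expression.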

How we built it

  1. Detect faces and their expressions using machine learning locally in the browser
  2. Implement the UI and different games with TypeScript & React
  3. Use canvas based animations for playing pong
  4. Implement a server that pairs two players for live PvP Pong action, each enjoying a video stream of their opponent
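Step 1 and step 3 meet where per-frame expression probabilities (such as those produced by an in-browser model like face-api.js) are turned into a paddle command. A minimal sketch of that mapping follows; the specific expression-to-command assignments and the threshold are assumptions for illustration:

```typescript
// Sketch: map per-frame expression probabilities to a Pong paddle command.
// The mapping and threshold are illustrative assumptions.
type Scores = Record<string, number>;
type Command = "up" | "down" | "none";

function expressionToCommand(scores: Scores, threshold = 0.6): Command {
  // Pick the expression with the highest probability.
  let best = "neutral";
  let bestScore = -Infinity;
  for (const [name, score] of Object.entries(scores)) {
    if (score > bestScore) {
      best = name;
      bestScore = score;
    }
  }
  // Below the confidence threshold, keep the paddle still.
  if (bestScore < threshold) return "none";
  if (best === "happy") return "up";
  if (best === "surprised") return "down";
  return "none";
}
```

The threshold keeps the paddle from jittering on ambiguous frames, which matters a lot when the model runs at full video frame rate.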

Challenges we ran into

Setting up facial expression detection was difficult, as was connecting it to the game logic. Learning WebRTC was also not trivial, and some of us had only just started learning React.

Accomplishments that we're proud of

We are very proud of our working prototype. The app is also exciting for non-disabled people, so we were able to play and enjoy our own game a lot.

What we learned

We learned a lot about face detection with machine learning and about WebRTC, and we polished our React skills.

What's next for Emojoy

User guidance can be improved further, for example by letting users assign facial expressions to the game controls individually so they can focus on their personal weaknesses. A more sophisticated detection model could also recognize additional emotions and increase the value of our app.

Multiplayer demo
