This project is for the Paw Patrol Track (Social Justice)

Inspiration

Understanding emotions can be a challenging task for some individuals. Certain people, especially those on the autism spectrum or those with a social communication disorder, may face unique obstacles in understanding the emotions of the people around them. EmoQuest uses photos of the real people in its users’ everyday lives and guides users to recognize even the most complex emotions, creating a supportive and engaging learning experience.

What it does

The first thing players see when they open EmoQuest is a prompt to upload photos of the friends and family that they see every day. It then uses a computer vision library called DeepFace to create a gamified experience that challenges the player’s recognition of their loved ones’ facial expressions. As they improve their recognition ability, players level up and are guided to identify increasingly subtle emotions, building their emotional intelligence skills along the way.

EmoQuest is built with Python and uses DeepFace, a computer vision library that can recognize the emotions displayed in pictures. Using that data, we built a multiple-choice game similar to Quizlet’s “Learn” feature.
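As a rough illustration of the recognition step, DeepFace’s `analyze` call returns per-emotion confidence scores for a photo. The sketch below is hedged: the real call (commented out) needs the `deepface` package and a photo on disk, so we process a sample result shaped like DeepFace’s output instead; the filename and scores are made up for illustration.

```python
# The real DeepFace call would look like:
#   from deepface import DeepFace
#   result = DeepFace.analyze(img_path="grandma.jpg", actions=["emotion"])
# Below, a sample result in the same shape stands in for that call.

sample_result = [{
    "dominant_emotion": "happy",
    "emotion": {
        "angry": 0.1, "disgust": 0.0, "fear": 0.2, "happy": 92.4,
        "sad": 1.3, "surprise": 4.0, "neutral": 2.0,
    },
}]

def dominant_emotion(result):
    """Return the highest-scoring emotion from a DeepFace-style result."""
    scores = result[0]["emotion"]
    return max(scores, key=scores.get)

print(dominant_emotion(sample_result))  # -> happy
```

The dominant emotion becomes the correct answer for that photo’s quiz question.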

How we built it

Backend: To identify the emotions a person is expressing, we used DeepFace, a computer vision library. All of our game logic was implemented in Python.
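A minimal sketch of what that game logic might look like: building a Quizlet-style multiple-choice round from a photo DeepFace has already labeled. The function name, question format, and number of choices are our assumptions for illustration, not the project’s exact code.

```python
import random

# The seven emotion classes DeepFace's emotion model reports.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def build_round(photo, detected_emotion, n_choices=4, rng=random):
    """Build one quiz question: the detected emotion plus random distractors."""
    distractors = [e for e in EMOTIONS if e != detected_emotion]
    choices = rng.sample(distractors, n_choices - 1) + [detected_emotion]
    rng.shuffle(choices)
    return {"photo": photo, "answer": detected_emotion, "choices": choices}

question = build_round("mom.jpg", "surprise")
```

The player is shown `question["photo"]` and the shuffled `choices`, and scores a point when their pick matches `answer`.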

Frontend: We drafted an intuitive user interface in Figma, then implemented it with Tkinter. We created a homepage where users can upload their pictures; these images are displayed in the ‘game mode’ and sent to DeepFace for analysis. The homepage also routes to the ‘game mode,’ which then populates the emotion database.

Challenges we ran into

During the development process, we faced several challenges. Firstly, we had to learn a new library, Tkinter, which was unfamiliar to all of the team members, and it was difficult to create a modern UI with such a simple toolkit. Secondly, we had to deepen our knowledge of incorporating various packages and modules into our project while keeping track of dependencies. Additionally, we had to overcome the difficulty of combining our front-end design with our back-end logic to ensure a seamless user experience. Lastly, learning how to create a game, and the many components required to develop a fully functional platform, was a significant hurdle. Despite these challenges, we were able to work together as a team, learn from each other, and ultimately deliver a successful project.

Accomplishments that we're proud of

We are extremely proud of the progress we made during the development of our project. Despite facing numerous challenges, we learned to use Tkinter, successfully implemented a basic version of the frontend design and user experience, and got the game’s basic flow working. Working together, we maintained a well-organized and efficient team dynamic, with everyone contributing their unique strengths and skills to a productive finish. We are confident in the potential of our game to make a meaningful impact on the lives of those on the autism spectrum, and we can’t wait to see it come to fruition.

What we learned

Through this project, we gained valuable experience in communication, problem-solving, and teamwork. We learned how to work with new people, manage our time efficiently, and divide tasks based on individual strengths. We also gained technical knowledge on using DeepFace and Tkinter, importing libraries, and developing an application that works toward a good purpose. Overall, we learned a lot through this project, not just about coding and development, but also about working collaboratively toward a common goal.

What's next for EmoQuest

Our next steps would be to incorporate a progress bar that lets players ‘level up’ after a minimum number of questions and a certain accuracy percentage. Higher levels would challenge users to identify increasingly complex emotions using a ‘multiple answer’ format and would incorporate new images with more subtle facial expressions. We would also like to create an additional game mode where users can develop and hone their own nonverbal communication skills with a built-in live camera.
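The level-up rule we have in mind could be sketched as below; the thresholds and names are illustrative placeholders, not final design decisions.

```python
# Hypothetical level-up rule: advance once the player has answered enough
# questions in the current level AND their accuracy meets the threshold.
MIN_QUESTIONS = 10       # minimum questions before a level-up is possible
ACCURACY_THRESHOLD = 0.8 # required fraction of correct answers

def should_level_up(answered, correct):
    """True once the player has both enough attempts and enough accuracy."""
    if answered < MIN_QUESTIONS:
        return False
    return correct / answered >= ACCURACY_THRESHOLD

print(should_level_up(12, 10))  # -> True (10/12 is about 0.83)
```

A progress bar would simply display `correct / answered` against the threshold for the current level.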

An even bigger upgrade would be to create opportunities to build emotional intelligence in real-world scenarios, expanding upon the picture-based model that we have now. One idea is to create an Omegle clone or Zoom widget that can immediately identify and display the emotions of the person the user is talking to, allowing users to connect their friends’ facial expressions to emotions during natural conversation. This ‘learn by doing’ approach would tremendously improve our ability to guide players in building their emotional intelligence skills, in an environment where they are empowered to identify the emotions of the people directly interacting with them.

Built With

Python, Tkinter, DeepFace, Figma
