Inspiration

There is a recent trend of Instagram filters that use facial recognition to assign users randomly chosen attributes. We decided to do something similar by bringing that functionality to the web.

What it does

It takes a photo of the user's face and tries to extract meaningful data from the image, such as age, gender, and facial expression.

How I built it

We built it with the Gatsby framework. For facial recognition, we run face-api.js locally in the browser: it is near-instantaneous, so we can confirm a face is actually present in the frame before taking the capture. The screenshot is then sent to Azure Cognitive Services to get higher-quality facial data, and the app analyses the returned attributes to compute your personality.
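In rough terms, the local detection step looks like the sketch below. It is illustrative rather than our exact code, and it assumes face-api.js's tiny face detector with its model files served from /models:

```typescript
import * as faceapi from 'face-api.js';

// Load the lightweight detector once on startup (serving the model files
// from /models is an assumption, not necessarily our actual setup).
export async function loadModels(): Promise<void> {
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
}

// Check the webcam <video> element for a face and, only if one is found,
// capture the current frame as a JPEG blob ready to send to Azure.
export async function captureIfFaceVisible(video: HTMLVideoElement): Promise<Blob | null> {
  const detection = await faceapi.detectSingleFace(
    video,
    new faceapi.TinyFaceDetectorOptions()
  );
  if (!detection) return null; // no face in frame yet; the caller can retry

  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext('2d')!.drawImage(video, 0, 0);
  return new Promise((resolve) => canvas.toBlob(resolve, 'image/jpeg'));
}
```

The caller can poll this until it gets a blob back, so the screenshot is only taken once a face is genuinely in frame.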

Challenges I ran into

We had a hard time using the face-api.js library to detect the presence of a face before taking the screenshot. The API was somewhat confusing, and it was also our team's first time working with a facial recognition library.

We also faced difficulties sending image data to Azure Cognitive Services. The API documentation was lacklustre.
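For reference, sending a captured frame to the Face API boils down to a single POST of the raw bytes. The sketch below is an assumed shape of that call rather than our exact code; the endpoint placeholder and the GATSBY_AZURE_FACE_KEY environment variable are illustrative:

```typescript
// Hypothetical configuration: the region/resource come from the Azure portal,
// and GATSBY_-prefixed env vars are how Gatsby exposes values to client code.
const FACE_ENDPOINT = 'https://<your-region>.api.cognitive.microsoft.com/face/v1.0/detect';
const FACE_KEY = process.env.GATSBY_AZURE_FACE_KEY ?? '';

interface FaceAttributes {
  age: number;
  gender: string;
  emotion: Record<string, number>;
}

// Send the captured frame as raw bytes and ask for age, gender and emotion scores.
export async function analyseFace(image: Blob): Promise<FaceAttributes | null> {
  const response = await fetch(`${FACE_ENDPOINT}?returnFaceAttributes=age,gender,emotion`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/octet-stream',
      'Ocp-Apim-Subscription-Key': FACE_KEY,
    },
    body: image,
  });
  if (!response.ok) throw new Error(`Face API error: ${response.status}`);

  // The API returns an array of detected faces; we only care about the first one.
  const faces: Array<{ faceAttributes: FaceAttributes }> = await response.json();
  return faces.length > 0 ? faces[0].faceAttributes : null;
}
```

One detail worth noting: raw image bytes go up with a Content-Type of application/octet-stream, whereas a JSON body with a url field is used when the image is hosted elsewhere.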

Accomplishments that I'm proud of

Thanks to our designer's efforts, our web application has a clean user interface and is simple to use: a single click is all a user needs to take the test and find out their personality.

What's next for Byers-Miggs Personality Test

We intend to create more personality spectrums and possibly include audio analysis for increased accuracy.

It was a fun experience building a parody of a famous personality test.

Built With

gatsby, face-api.js, azure-cognitive-services