Inspiration

The beauty industry is constantly evolving with the emergence of new technology, and Augmented Reality is the next step in that direction. However, specific applications within the industry need an overhaul to accommodate people of all skin complexions. Current ways of matching foundations are archaic, relying on matching apps or physically applying products yourself through trial and error. These methods waste consumers' money and make the industry less inclusive. Machine learning is a viable way to help solve this problem through automation, and AR is a space where this technology is already being used.

What it does

Pigment Match's goal is to let users discover foundations they can apply without mixing multiple products together, reducing both the time and overall cost for anyone looking to use facial makeup. Essentially, it is a recommendation agent for foundations that matches skin tones and provides additional resources such as videos of influencers using the product.
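At its core, the matching step is a nearest-shade lookup: take the user's measured skin colour and find the closest foundation in a catalogue. The sketch below illustrates that idea only; the shade names and RGB values are invented placeholders, not a real product database.

```python
# Illustrative sketch of the matching step: pick the foundation shade whose
# colour is closest to the user's measured skin tone. The catalogue entries
# and RGB values here are invented placeholders, not real products.
from math import dist

FOUNDATION_SHADES = {
    "Shade 110 Porcelain": (241, 214, 196),
    "Shade 230 Natural Buff": (215, 170, 138),
    "Shade 360 Deep Bronze": (150, 99, 66),
    "Shade 480 Espresso": (92, 58, 38),
}

def match_foundation(skin_rgb: tuple[int, int, int]) -> str:
    """Return the catalogue shade whose colour is closest to the given skin tone."""
    return min(FOUNDATION_SHADES, key=lambda name: dist(FOUNDATION_SHADES[name], skin_rgb))

print(match_foundation((210, 165, 130)))  # -> "Shade 230 Natural Buff"
```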

How we built it

Our product uses deep learning to train a PyTorch model capable of categorizing skin tones. Based on that classification, the model recommends foundation matches that can be visualized with Spark AR. The Android application ties the two pieces together: it takes a picture, runs it through the PyTorch model, and surfaces the recommended foundation for the Spark AR overlay.
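For a rough sense of what the training step looks like, here is a minimal sketch of fine-tuning a small pretrained backbone to classify skin tones. The dataset path, number of classes, and hyperparameters are placeholders, not our exact setup.

```python
# Minimal sketch of a skin-tone classifier in PyTorch (illustrative only).
# The dataset layout ("data/skin_tones/<class>/*.jpg"), the number of
# classes, and the hyperparameters are assumptions, not our exact setup.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 6          # e.g. bins along a skin-tone scale (placeholder)
DEVICE = "cuda" if torch.cuda.is_available() else "cpu"

# Standard ImageNet-style preprocessing for the pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

train_data = datasets.ImageFolder("data/skin_tones", transform=preprocess)
train_loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Fine-tune a small pretrained backbone with a new classification head.
model = models.mobilenet_v2(weights="IMAGENET1K_V1")
model.classifier[1] = nn.Linear(model.last_channel, NUM_CLASSES)
model = model.to(DEVICE)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in train_loader:
        images, labels = images.to(DEVICE), labels.to(DEVICE)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```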

Challenges we ran into

Training an effective model in such a short time frame was the largest obstacle to clear, and cleaning the dataset was necessary to produce meaningful output for the cosmetic visualization. On the Android side, getting the camera to capture pictures and hand them off to the model was the biggest hurdle.
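That camera-to-model hand-off ultimately boils down to converting the captured photo into the tensor the classifier expects. A hedged sketch of that step, where the file path and preprocessing constants are assumptions rather than our exact pipeline:

```python
# Sketch of handing a captured photo to the trained model (illustrative).
# "captured.jpg" and the preprocessing constants are assumptions; the real
# pipeline feeds the Android camera output through the same transform used
# during training.
import torch
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def predict_skin_tone(model: torch.nn.Module, photo_path: str) -> int:
    """Run one captured photo through the classifier and return the class index."""
    image = Image.open(photo_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # add a batch dimension
    with torch.no_grad():
        logits = model(batch)
    return int(logits.argmax(dim=1).item())
```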

Accomplishments that we're proud of

From the inception of the project idea to where we are now, the time limit was a severe constraint on our goals. Training an accurate model while simultaneously learning Spark AR was a herculean task from the get-go, but with perseverance and a tireless work ethic we managed to complete our MVP with time to spare.

What we learned

The Spark AR development environment was entirely new to our group, and we wanted to incorporate visualization technology to show the tangible effects of our model training. The environment turned out to be straightforward and easy to pick up, which let us apply the AR cosmetic overlay seamlessly.

What's next for Pigment Match

For product improvement, we will work on integrating the Android side with Spark AR directly through an in-app interface, which was part of our original design but couldn't be completed for the MVP. Deploying the PyTorch model on the Android app is our immediate next step. Looking ahead, we're aiming to partner with beauty and cosmetic companies that want to apply our model to streamline the search for foundations and makeup across their vast product catalogues.
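One likely route for that on-device deployment is exporting the trained classifier through TorchScript for PyTorch Mobile. The sketch below shows what such an export could look like; the model architecture, weight file, and output filename are placeholders, not a finished deployment.

```python
# Sketch of exporting the trained classifier for on-device inference with
# PyTorch Mobile. The architecture, weight path, and output name are placeholders.
import torch
import torch.nn as nn
from torchvision import models
from torch.utils.mobile_optimizer import optimize_for_mobile

NUM_CLASSES = 6  # must match the trained classifier (placeholder)

model = models.mobilenet_v2(weights=None)
model.classifier[1] = nn.Linear(model.last_channel, NUM_CLASSES)
# model.load_state_dict(torch.load("skin_tone_classifier.pt"))  # trained weights (assumed path)
model.eval()

example = torch.rand(1, 3, 224, 224)           # dummy input matching the training size
scripted = torch.jit.trace(model, example)     # freeze the graph via tracing
mobile_model = optimize_for_mobile(scripted)   # apply mobile-specific optimizations
mobile_model._save_for_lite_interpreter("pigment_match.ptl")  # loaded by the Android app
```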

Built With

android, pytorch, spark-ar