Inspiration

The idea aims primarily to help people with skin conditions, and to do so intuitively during onboarding, while remaining open and useful to users who have multiple disabilities to declare.

What it does

A tool that scans the user's skin on the spot and compares the affected area's unique pattern, tone, and shade against the surrounding non-affected skin, producing a tailored recipe of products and advice.
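A minimal sketch of that comparison step, assuming the affected and non-affected regions have already been segmented into boolean masks; the CIELAB tone comparison and the helper names are illustrative assumptions, not the shipped implementation.

```python
# Compare an affected skin region against surrounding non-affected skin by
# tone/shade. Masks are assumed to come from an upstream segmentation step.
import numpy as np
from PIL import Image
from skimage import color

def region_tone(image_rgb: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Mean CIELAB colour of the pixels selected by a boolean mask."""
    lab = color.rgb2lab(image_rgb / 255.0)
    return lab[mask].mean(axis=0)

def tone_difference(image_path: str, affected_mask: np.ndarray,
                    healthy_mask: np.ndarray) -> float:
    """Euclidean distance in CIELAB between affected and non-affected skin."""
    rgb = np.asarray(Image.open(image_path).convert("RGB"))
    return float(np.linalg.norm(region_tone(rgb, affected_mask) -
                                region_tone(rgb, healthy_mask)))
```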

How we built it

The UX prototype was designed in Sketch, and the UX flow was sequenced in MarvelApp. The lo-fi mobile application prototype would be backed by a machine-learning model trained on a library of known conditions, each mapped to recommendations based on products and their ingredients.
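A sketch of the recommendation layer described above: a library of known conditions keyed to ingredient guidance, filtered against a product catalogue. The condition names, ingredients, and the Product type are invented examples for illustration, not ELC data; the classifier that predicts the condition would sit upstream.

```python
# Map a predicted skin condition to products whose ingredients avoid its
# known triggers. All entries below are placeholder examples.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    ingredients: set[str]

# Example library entries: each condition lists ingredients to avoid.
CONDITION_LIBRARY = {
    "eczema": {"avoid": {"fragrance", "alcohol denat"}},
    "rosacea": {"avoid": {"menthol", "witch hazel"}},
}

def recommend(condition: str, catalogue: list[Product]) -> list[Product]:
    """Return catalogue products whose ingredients avoid known triggers."""
    avoid = CONDITION_LIBRARY.get(condition, {}).get("avoid", set())
    return [p for p in catalogue if not (p.ingredients & avoid)]
```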

Challenges we ran into

Gaining access to medically proven guidance for the known sensitivities linked to the ingredients used in ELC's product catalogue. Expanding the lo-fi prototype to integrate a basic ML algorithm and demonstrate the technical feasibility of the proof of concept.

Accomplishments that we're proud of

A low-fidelity prototype that is deployable and can be expanded to integrate some of the ideated features.

What we learned

The current landscape is largely reactive, favouring solutions that have already been benchmarked. A solution like this puts ELC at the forefront of learning from its users and catering properly to their needs.

What's next for See Beyond Smart Skincare

To develop an AI-powered skin analysis tool that can detect and diagnose a range of skin conditions, including those related to accessibility needs such as skin lesions or other visible skin disabilities. The tool could be integrated into Estee Lauder's mobile app or website, allowing users to take photos of their skin and receive personalized recommendations for specialized skincare products.
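Purely as a sketch of that integration: a hypothetical endpoint that accepts a photo upload and returns personalized recommendations. The route, helper functions, and example values are all assumptions for illustration, not an existing Estee Lauder API.

```python
# Hypothetical photo-to-recommendation flow for a mobile app or website client.
from flask import Flask, request, jsonify

app = Flask(__name__)

def analyse_image(photo_bytes: bytes) -> str:
    """Placeholder for the trained skin-condition classifier."""
    return "eczema"  # illustrative label only

def recommend_products(condition: str) -> list[str]:
    """Placeholder lookup into the condition/ingredient library."""
    return ["fragrance-free moisturiser"]  # illustrative suggestion only

@app.post("/skin-analysis")
def skin_analysis():
    photo = request.files["photo"].read()  # photo uploaded by the user
    condition = analyse_image(photo)
    return jsonify({"condition": condition,
                    "recommendations": recommend_products(condition)})
```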

Built With

Sketch, MarvelApp