Preface: "Know your products before you use them on your face"


We all know the frustration of opening up a brand new skincare product, only to use it for the first and last time. Have you ever struggled to know which products to use, to pronounce certain ingredients on the label, or to find reliable products that won't trigger an allergic reaction? With all the claims and obscure packaging, we hardly pay attention to the ingredients list. What exactly is Cocamidopropyl Betaine, Coco-Glucoside, or Sodium Lauroyl Methyl Isethionate? More often than not, we end up wasting money on products that don't work for our skin needs or skin types. Worse, we may end up actively harming our skin. If we want to achieve that flawless, glass skin, we need to be more aware of the ingredients and chemicals in our products before putting them on our skin; hence the name, Preface.

What it does

Preface is a centralized platform that gives users a custom skincare routine, with an emphasis on transparent ingredient lists. Our platform analyzes top products and displays their claims and ingredients in a way that is easy for the average person to understand. The custom-curated routine takes into account the four main skin types (dry, oily, combination, and normal) as well as skin needs, such as acne, redness, and aging concerns. Using our proprietary technology, customers receive recommendations for the optimal five-step routine: cleanser, exfoliator, toner, serum, and moisturizer. By consolidating this information in one place, consumers can make wise and efficient decisions about their skincare products.
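The matching idea above can be sketched as a filter-and-rank pass over a product catalog. This is a minimal illustration, not our actual schema: the product fields (`category`, `skinTypes`, `claims`) and the scoring rule are hypothetical.

```javascript
// Hypothetical sketch: for each of the five routine steps, pick the top
// catalog product that fits the user's skin type, ranked by how many of
// the user's skin needs the product claims to address.
const ROUTINE_STEPS = ["cleanser", "exfoliator", "toner", "serum", "moisturizer"];

function matchCount(product, needs) {
  // Number of the user's needs this product claims to address.
  return needs.filter(n => product.claims.includes(n)).length;
}

function buildRoutine(catalog, skinType, skinNeeds) {
  const routine = {};
  for (const step of ROUTINE_STEPS) {
    const candidates = catalog
      .filter(p => p.category === step && p.skinTypes.includes(skinType))
      .sort((a, b) => matchCount(b, skinNeeds) - matchCount(a, skinNeeds));
    routine[step] = candidates[0] ?? null; // top match, or null if none fits
  }
  return routine;
}
```

In the real platform the ranking also reflects how the top products were ordered in the source catalog; the sketch just shows the filter-then-pick-top shape of the step.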

How we built it

Using the EWG Skin Deep database combined with Sephora's cosmetic catalog, we web-scraped the necessary information about each product's description, claims, and ingredients. Knowing the user's skin type and needs, we parse the Sephora catalog for products matching those preferences, and the top product that matches the user's needs is picked for the routine. We then scraped the product's ingredients from the EWG database and pulled the description, functions, and concerns for each ingredient. Our Chrome plugin scrapes the EWG database and exports a JSON file covering all products in the five categories: cleansers, exfoliators, toners, serums, and moisturizers. With the chosen product, we parse the JSON file to find the product and its ingredients. We built our website using HTML, CSS, JavaScript, and the Bootstrap framework, and we use simple language and icons to convey all the necessary information and make it more understandable for the average person.

Challenges we ran into

Initially, we tried to extend Bootstrap's framework for the navigation, but it was difficult to create a nested dropdown inside a parent dropdown (a multi-level dropdown). We searched endless blogs and tutorials, but ended up coding the entire dropdown navigation feature without a framework, using jQuery and HTML. Additionally, we had to figure out conceptually how to implement our ideas. We discussed and drew MANY abstract diagrams about scraping databases and websites (and about which ones would best serve our purpose) to create a useful centralized skincare platform. We tackled the challenge of connecting the data we were collecting, parsing out the useful pieces of information, and displaying them to the user in an aesthetic, easy-to-understand manner. We struggled to figure out how to incorporate Google Cloud services to store the data we are scraping, but plan to incorporate this in a future update.
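The hand-rolled multi-level dropdown boils down to rendering a nested menu description as nested `<ul>` lists that jQuery can then show and hide. This is a sketch under assumptions: the menu data shape and the `dropdown`/`submenu` class names are made up for illustration, not our actual markup.

```javascript
// Illustrative sketch: recursively render a nested menu object as nested
// <ul> lists. The top level gets class "dropdown", inner levels "submenu"
// (hypothetical class names), so CSS/jQuery can target each depth.
function renderMenu(items, depth = 0) {
  const cls = depth === 0 ? "dropdown" : "submenu";
  const lis = items.map(item => {
    const children = item.children ? renderMenu(item.children, depth + 1) : "";
    return `<li><a href="${item.href}">${item.label}</a>${children}</li>`;
  }).join("");
  return `<ul class="${cls}">${lis}</ul>`;
}
```

With markup structured like this, the jQuery side reduces to toggling a class on the hovered `li` (e.g. via `.hover()`), which is far simpler than fighting Bootstrap's single-level dropdown behavior.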

Accomplishments that we're proud of

We were able to web-scrape information from two different websites: an e-commerce site and an ingredient database. As beginner front-end developers, we refined our skills by creating a fully functional multi-level page with navigation. We also connected our local website to a server (this was our first time hosting a site on a domain purchased from a third party!).

What we learned

We learned how to create a sitemap of a website's pages and how to build an automated web scraper that could walk through a page's links and buttons in our preferred order. We also learned how to read these pages' HTML elements and layout in order to extract the specific data we were looking for.
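The link-walking idea can be sketched as a breadth-first traversal. To keep the sketch self-contained, the site is represented as an in-memory map of page → outgoing links; in the real scraper each page would be fetched and its links extracted from the HTML instead.

```javascript
// Hypothetical sketch of the crawl order: walk the site breadth-first
// from a start page, visiting each page's links in the order they are
// listed and skipping pages we've already seen.
function crawlOrder(linkGraph, startPage) {
  const visited = new Set([startPage]);
  const queue = [startPage];
  const order = [];
  while (queue.length > 0) {
    const page = queue.shift();
    order.push(page);
    for (const link of linkGraph[page] ?? []) {
      if (!visited.has(link)) {
        visited.add(link);
        queue.push(link);
      }
    }
  }
  return order;
}
```

Swapping the queue for a stack would give a depth-first walk instead; the breadth-first version matches the "category pages first, then product pages" order we wanted.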

What's next for Preface

As of right now, the custom skincare routine only takes into account one user preference: skin type. We want to expand the possibilities for customization by letting the user take a quiz to find the most specialized products for their needs. Those customizations would include concerns such as dark spots, sensitivity, or ethical ingredients. We would also like to further personalize routines by offering a 3-step or even a 12-step routine, for beginner through advanced skincare enthusiasts.

We also plan to build a proper backend using Google Cloud services (Google Firebase) for better access to our data. Our web scraper returns data in JSON format, so we would like to automate or schedule the scraping and push the results to Firebase so we don't have to store the JSON file locally. With each object being a single product, whenever the user submits their preferences, we would fetch the corresponding product and ingredient data from the cloud to display.
