Inspiration
We’ve all been there: standing in the skincare aisle, staring at a bottle with 30 unpronounceable ingredients, wondering, "Is this actually going to help me, or is it going to break me out?"
Brands make big promises on the front of the bottle, but the truth is always hidden in the tiny text on the back. Unless you have a degree in dermatology, it's almost impossible to know what you're really putting on your face.
We built SkinGraph because we wanted a "dermatologist in our pocket." We wanted a tool where you could just snap a picture of an ingredient list and instantly know if it's safe for your specific skin type, if the brand's marketing is actually backed by science, and if there are cheaper products that do the exact same thing.
What it does
SkinGraph takes the guesswork out of skincare using Amazon Nova. You either type in a product name or upload a photo of an ingredient label. If you upload a photo, we use Amazon Textract to rip the text from the image, and Amazon Rekognition to detect what kind of product it is.
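The scanning step above can be sketched roughly like this: send the photo bytes to Textract's synchronous OCR and keep only the detected lines. This is a minimal sketch, not our production code; the function names and region are our own assumptions, and only the `detect_document_text` call is the real Textract API.

```python
def ocr_label(image_bytes: bytes) -> list[str]:
    """Run Textract's synchronous OCR on a label photo (requires AWS creds)."""
    import boto3  # lazy import so the module loads without AWS installed

    client = boto3.client("textract", region_name="us-east-1")
    response = client.detect_document_text(Document={"Bytes": image_bytes})
    return extract_lines(response)


def extract_lines(response: dict) -> list[str]:
    """Keep only LINE blocks from a Textract response, in reading order."""
    return [
        block["Text"]
        for block in response.get("Blocks", [])
        if block["BlockType"] == "LINE"
    ]
```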
Then, Amazon Nova goes to work. It breaks down every single ingredient, scores the product from 0-100 based on your personal skin type (Oily, Dry, Sensitive, etc.), and surfaces hidden red flags like pore-cloggers or irritants. It even does a "Brand vs. Reality" check to see if the marketing claims actually match the science of the ingredients. Finally, it uses semantic search over Bedrock embeddings to suggest cheaper products with similar ingredient profiles.
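At its core, the "cheaper alternatives" step is a nearest-neighbor search over embeddings. Here's a toy sketch with plain Python lists (in production the vectors come from Bedrock embeddings and live in pgvector; every name and threshold below is illustrative, not our actual schema):

```python
import math


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def cheaper_alternatives(target: dict, candidates: list[dict], min_sim: float = 0.9) -> list[dict]:
    """Products with near-identical ingredient profiles that cost less, cheapest first."""
    return sorted(
        (c for c in candidates
         if c["price"] < target["price"]
         and cosine(c["embedding"], target["embedding"]) >= min_sim),
        key=lambda c: c["price"],
    )
```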
All of this is wrapped in a clean, mobile-friendly React frontend and a Capacitor-based Android app, with a dashboard that saves your profile and analysis history securely using Supabase.
How we built it
SkinGraph is built on a modern, decoupled stack:
- Frontend: React 19, TypeScript, and Vite, deployed on Cloudflare Pages.
- Mobile: We wrapped the frontend using Capacitor to create a native Android app that loads the production site, meaning updates push instantly to the phone.
- Backend: A fast Python (FastAPI) server hosted on Render.
- Database & Auth: Supabase handles our PostgreSQL database, pgvector for semantic caching, and full Google/Email authentication.
- The Brains (AWS):
- Amazon Bedrock (Nova Lite & Nova Act): Powers the core ingredient analysis and brand claim verification.
- Amazon Bedrock (Titan Embeddings): Creates vector embeddings of product profiles so we don't have to re-analyze identical products, saving massive API overhead.
- Amazon Textract & Rekognition: Handles all the OCR and computer vision for label scanning.
- Amazon S3: Stores the uploaded label photos temporarily and archives the generated analysis reports.
Challenges we ran into
Getting AI to be accurate with medical/skincare advice is tough. Early on, the model would hallucinate or give generalized advice. We had to heavily prompt-engineer Amazon Nova to act strictly as a cosmetic chemist, forcing it to evaluate ingredients against specific skin types rather than giving generic "this is good/bad" responses.
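A minimal sketch of what such a constrained request might look like through the Bedrock Converse API. The system prompt wording and model ID are illustrative stand-ins, not our exact production values:

```python
def build_analysis_request(skin_type: str, ingredients: list[str]) -> dict:
    """Assemble a Converse request that pins Nova to the cosmetic-chemist role."""
    system = (
        "You are a cosmetic chemist. Evaluate each ingredient strictly for "
        f"{skin_type} skin. Return JSON with a 0-100 score and any flagged "
        "comedogenic or irritating ingredients. Never give generic advice."
    )
    return {
        "modelId": "amazon.nova-lite-v1:0",
        "system": [{"text": system}],
        "messages": [
            {"role": "user", "content": [{"text": ", ".join(ingredients)}]}
        ],
    }


def analyze(request: dict) -> str:
    import boto3  # lazy import so the module loads without AWS installed

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.converse(**request)
    return response["output"]["message"]["content"][0]["text"]
```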
Another huge challenge was the OCR pipeline. Skincare bottles are cylindrical, reflective, and covered in tiny text. Getting Amazon Textract to cleanly pull a comma-separated ingredient list from a warped, glossy photo on a phone camera required a lot of trial-and-error with image preprocessing and fallback logic.
Accomplishments that we're proud of
- The speed and depth of the analysis. Going from a raw photo to a beautifully rendered, fully formatted scientific breakdown in seconds feels like magic every time we use it.
- The "Brand vs Reality" feature. Having Nova actively call out brands for misleading marketing based on the actual ingredients is incredibly empowering for the consumer.
- The Android app integration. Getting the Capacitor wrapper working seamlessly with the Supabase Auth environment variables right down to the wire was a huge win.
What we learned
We learned that prompting a foundation model for medical or scientific advice is entirely different from prompting it for creative tasks. We had to learn how to explicitly constrain Amazon Nova so it wouldn't hallucinate or give generalized, overly cautious advice, but instead act as an objective cosmetic chemist analyzing raw data.
We also learned a lot about how complex optical character recognition (OCR) can be in the real world. Skincare labels are notoriously awful for OCR: they're wrapped around tiny bottles, the text is minuscule, the plastic is glossy, and the contrast is usually terrible. We learned how to write robust fallback logic to handle the messy, fragmented text dumps that Textract sometimes returns, ensuring the app still finds the ingredients even when the photo isn't perfect.
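One piece of that fallback logic, sketched here with illustrative names and heuristics: stitch the fragmented OCR lines back together, find the ingredient marker, rejoin words that were hyphenated across line breaks, and split on commas.

```python
import re


def recover_ingredients(lines: list[str]) -> list[str]:
    """Best-effort extraction of a comma-separated ingredient list
    from fragmented OCR output."""
    text = " ".join(lines)
    match = re.search(r"ingredients?\s*[:\-]?\s*(.+)", text, re.IGNORECASE)
    if not match:
        return []
    raw = match.group(1)
    raw = re.sub(r"-\s+", "", raw)  # rejoin words split across line breaks
    parts = re.split(r"[,;]", raw)
    return [p.strip() for p in parts if p.strip()]
```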
What's next for SkinGraph
We want to take the "Routine Compatibility" feature to the next level. Right now, it checks whether two products clash. In the future, we want users to take a photo of their entire bathroom shelf and have Amazon Nova build them an optimized morning and night routine, telling them exactly what order to apply things in and what to throw away.
Built With
- amazon-bedrock
- amazon-nova-act
- amazon-nova-lite
- amazon-rekognition
- amazon-textract
- amazon-titan-embeddings
- amazon-web-services
- capacitor
- cloudflare
- fastapi
- pgvector
- postgresql
- python
- react
- render
- supabase
- typescript
- vite