Glam: Your Personal Beauty Architect
Inspiration
We’ve all been there: you see a breathtaking makeup look on social media and save it, but that’s where the journey ends. Between that "saved" image and your own reflection, there is a massive Execution Gap. We identified three core frustrations:
The "Will it suit me?" Anxiety: Most users hesitate because they can't visualize how a look designed for a specific model's face shape and skin tone will translate to their own unique features.
The "How-to" Black Box: A static photo doesn't reveal the technique. Without knowing the layering, blending, or specific steps, trying to replicate a look often ends in frustration.
The Shopping Guesswork: It’s nearly impossible to guess the exact product textures or shades from a filtered photo. Users end up buying products that look right in the packaging but feel "off" once applied to their actual skin undertone.
We built Glam to turn "I wish I could" into "I just did," using AI to bridge the gap between digital inspiration and personal reality.
What it does
Glam is an intelligent makeup studio that acts as a Personal Beauty Architect:
1. Zero-Risk Visualization: Powered by Gemini 3 Pro, Glam renders the inspiration look onto your selfie with pixel-perfect accuracy.
2. The "Deconstruction" Engine: It breaks down complex looks into a Personalized Roadmap tailored to your specific facial anatomy.
3. Precision Product Matching: Matches the color science of the reference image with real-world products that suit your specific skin undertone.
4. Step-by-Step Progress Tracking: An interactive UI shows you what you should look like at every milestone (e.g., "Base done," "Eyes defined").
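The milestone tracking above could be modeled with a small typed structure. This is a hypothetical sketch; the `Milestone` type and `nextMilestone` helper are our illustrative names, not Glam's actual schema:

```typescript
// Hypothetical data model for a personalized, step-by-step makeup roadmap.
// Field names are illustrative only.
interface Milestone {
  id: number;
  label: string;   // e.g. "Base done", "Eyes defined"
  steps: string[]; // concrete actions to reach this milestone
  done: boolean;
}

// Returns the next milestone the user should work toward, or null if finished.
function nextMilestone(roadmap: Milestone[]): Milestone | null {
  return roadmap.find((m) => !m.done) ?? null;
}

const roadmap: Milestone[] = [
  { id: 1, label: "Base done", steps: ["Apply primer", "Blend foundation"], done: true },
  { id: 2, label: "Eyes defined", steps: ["Cut crease", "Apply shimmer on lid"], done: false },
];
```

A UI can then render `nextMilestone(roadmap)` as the current target and the remaining entries as upcoming steps.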
How we built it
We pushed the boundaries of Gemini 3 Pro's multimodal capabilities and Google AI Studio's tooling to create a sophisticated, three-pillar system:
1. Visual Transfer (Gemini 3 Pro Image): We developed a custom prompting framework to treat the user's face as a protected canvas. The AI "layers" the makeup style from the reference image while strictly preserving the user's underlying identity, bone structure, and lighting environment. This ensures the result is a "makeup application" rather than a "face swap."
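A "protected canvas" framework like this can be sketched as a prompt builder that always prepends hard identity constraints. The constraint wording below is our assumption, not Glam's actual tuned prompt:

```typescript
// Illustrative sketch of a "protected canvas" prompt builder.
// The constraint text is hypothetical; the real framework was tuned iteratively.
const IDENTITY_CONSTRAINTS = [
  "Preserve the subject's bone structure, facial proportions, and identity exactly.",
  "Keep the original lighting direction, color temperature, and background unchanged.",
  "Apply ONLY the makeup style from the reference image; change nothing else.",
];

function buildTransferPrompt(lookDescription: string): string {
  return [
    "Task: apply the following makeup look to the user's photo.",
    `Look: ${lookDescription}`,
    "Hard constraints:",
    ...IDENTITY_CONSTRAINTS.map((c, i) => `${i + 1}. ${c}`),
  ].join("\n");
}
```

The key design choice is that the constraints are non-negotiable boilerplate: the per-look description varies, but the identity rules are injected on every request.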
2. Multimodal Image Analysis: Unlike simple filters, Glam uses Gemini to "deconstruct" the inspiration photo. The model performs a deep-layer analysis of:
   - Color Science: identifying precise HEX codes for eyeshadows, lipsticks, and blushes.
   - Texture Recognition: distinguishing between matte, shimmer, dewy, and satin finishes.
   - Technique Extraction: reverse-engineering the professional application order (e.g., identifying that a "cut crease" was used).
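A deconstruction step like this typically returns structured output that should be validated before use. Here is a minimal sketch under assumed field names (the `LookAnalysis` shape is ours, not the actual model schema): we check that every returned color is a well-formed HEX code rather than free text.

```typescript
// Hypothetical typed result for the "deconstruction" step.
// Field names are illustrative; the real output schema may differ.
type Finish = "matte" | "shimmer" | "dewy" | "satin";

interface LookAnalysis {
  eyeshadowHex: string;
  lipHex: string;
  blushHex: string;
  finishes: Finish[];
  techniques: string[]; // e.g. ["cut crease"]
}

const HEX_RE = /^#[0-9a-fA-F]{6}$/;

// Guards against the model answering with prose like "warm terracotta"
// instead of a machine-usable HEX code.
function isValidAnalysis(a: LookAnalysis): boolean {
  return [a.eyeshadowHex, a.lipHex, a.blushHex].every((h) => HEX_RE.test(h));
}
```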
3. Real-World Grounding (Google Search Integration): To solve the "Shopping Guesswork," we integrated Google AI Studio's Search capability.
   - Live Product Sourcing: instead of hallucinating products, Glam uses the analyzed color profiles to search the live web for actual beauty products available on the market.
   - Contextual Matching: the AI filters search results based on the user's skin undertone, ensuring the recommended foundation or palette is a perfect match in the real world.
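The contextual-matching step can be sketched as a post-search filter: score each candidate product by Euclidean distance to the analyzed shade in RGB space, then keep only products tagged with the user's undertone. This is an assumed implementation for illustration; the product data and threshold are invented:

```typescript
// Sketch of post-search filtering: undertone gate + color-distance ranking.
// Products and the maxDistance threshold are illustrative, not real data.
type Undertone = "warm" | "cool" | "neutral";

interface Product {
  name: string;
  hex: string;
  undertones: Undertone[];
}

function hexToRgb(hex: string): [number, number, number] {
  const n = parseInt(hex.slice(1), 16);
  return [(n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff];
}

// Euclidean distance between two colors in RGB space.
function colorDistance(a: string, b: string): number {
  const [r1, g1, b1] = hexToRgb(a);
  const [r2, g2, b2] = hexToRgb(b);
  return Math.hypot(r1 - r2, g1 - g2, b1 - b2);
}

function matchProducts(
  targetHex: string,
  undertone: Undertone,
  candidates: Product[],
  maxDistance = 60
): Product[] {
  return candidates
    .filter((p) => p.undertones.includes(undertone))
    .filter((p) => colorDistance(p.hex, targetHex) <= maxDistance)
    .sort((a, b) => colorDistance(a.hex, targetHex) - colorDistance(b.hex, targetHex));
}
```

A perceptual color space (e.g. CIELAB) would rank shades more faithfully than raw RGB, but plain RGB distance keeps the sketch simple.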
Challenges we ran into
Our biggest technical challenge was "Identity Integrity." Standard image-to-image models often "hallucinate" new facial features. We spent dozens of hours in Google AI Studio perfecting our constraints to ensure Gemini 3 Pro understands that the makeup is the only variable—the human face is the constant.
Additionally, creating the Interactive Before/After Slider required seamless state management between the AI-generated "After" image and the original "Before" shot, ensuring zero jitter during the transition.
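One way to achieve that jitter-free transition is to derive the divider position from the pointer as a single clamped percentage and reveal the "after" layer via a CSS `clip-path`, so there is no intermediate state to fall out of sync. This is a minimal sketch of that approach, not Glam's actual component:

```typescript
// Minimal sketch of the before/after slider math (assumed implementation).
// The divider position is one clamped number; both layers stay mounted and
// only the clip region changes, avoiding re-render jitter.
function sliderPercent(pointerX: number, containerLeft: number, containerWidth: number): number {
  const raw = ((pointerX - containerLeft) / containerWidth) * 100;
  return Math.min(100, Math.max(0, raw)); // clamp to [0, 100]
}

// The "after" image is clipped so only the area left of the divider shows.
function clipPath(percent: number): string {
  return `inset(0 ${100 - percent}% 0 0)`;
}
```

In a React component, `sliderPercent` would feed a single piece of state that drives both the divider handle and the `clip-path` style of the "after" image.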
What's next for Glam
AR Real-Time Guidance: Integrate Augmented Reality for real-time "AI Blueprint" overlays.
Community Looks: A social feature to share and follow community-generated "Roadmaps."
Direct-to-Cart Integration: One-click buy for your entire personalized shopping list.
Built With
- googlesearchapi
- nanobanana
- react
- typescript
- vite

