Inspiration
We don’t buy a jacket because a studio photo looks "correct" - we buy it because we saw it on someone we admire. Fashion moves through an imitation funnel: a celebrity's street style post or an influencer’s reel normalizes a silhouette long before we see it on a rack. Despite this, e-commerce sites still behave like static paper catalogs, stripped of the "real-world" energy that actually drives us to purchase.
We wanted to bring that imitation energy directly to the product page. Mirror was born from the idea that the "Where did they get that?" moment shouldn't require five open tabs and a leap of faith. We set out to combine the aspiration of the influencer feed with the reality of your own body, making sure the bridge between "that looks great on them" and "this looks great on me" is seamless.
What Mirror does with that idea
Mirror is a Chrome extension that turns every product page into a social discovery feed. After a one-time setup on our web app for reference photos and consent, the extension lives where you shop, replacing the static catalog experience with active social proof.
On any product page, Mirror injects a "Worn by" strip - a curated gallery of influencers and community members wearing that exact piece in real-world settings. This captures the "imitation funnel" directly at the point of purchase.
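Before the extension can fetch worn photos, the content script needs a stable key for the product on the current page. Here is a minimal sketch of that step in TypeScript; the retailer URL patterns and helper names are hypothetical illustrations, not Mirror's actual code:

```typescript
// Hypothetical sketch: derive a stable product key from a product-page URL
// so the extension can look up "Worn by" photos for that exact item.
// Real retailer URL schemes vary; these two patterns are illustrative only.
const RETAILER_PATTERNS: Array<{ host: RegExp; id: RegExp }> = [
  { host: /(^|\.)zara\.com$/, id: /-p(\d+)\.html/ },
  { host: /(^|\.)asos\.com$/, id: /\/prd\/(\d+)/ },
];

function productKey(pageUrl: string): string | null {
  const url = new URL(pageUrl);
  for (const { host, id } of RETAILER_PATTERNS) {
    if (!host.test(url.hostname)) continue;
    const match = url.pathname.match(id);
    // Key is "retailer:productId", e.g. "zara.com:12345".
    if (match) return `${url.hostname.replace(/^www\./, "")}:${match[1]}`;
  }
  return null; // Fail soft: show no strip rather than the wrong one.
}
```

Returning `null` on unknown pages keeps the strip honest: the extension simply stays quiet instead of guessing.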
To bridge the final gap between their feed and your body, Mirror generates high-fidelity Virtual Try-ons. You can go beyond a flat preview by creating Editorial Stills - cinematic, magazine-style images of yourself in the fit and sharing them with your Circle for an instant verdict. By combining influencer inspiration with your own digital twin, Mirror lets you shop with the confidence of a stylist-backed decision.
How I built it
Supabase handles accounts, data, storage, and realtime. A FastAPI service on Railway serves the API, with job queues and background workers for anything that takes more than a couple of seconds. The extension and web app are TypeScript; the backend is Python. It's the same stack end to end, so one session can drive try-on, search, and sharing.
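The queue-and-poll pattern described above can be sketched in a few lines of TypeScript. This is an in-memory illustration of the idea (enqueue a slow job such as a try-on render, then poll its status instead of blocking a request); all names are hypothetical, and the real system runs workers behind FastAPI rather than in-process:

```typescript
// Minimal sketch of queue-and-poll for slow jobs. Illustrative only:
// a real deployment would use a shared queue and separate worker processes.
type JobStatus = "queued" | "running" | "done" | "failed";

interface Job<T> {
  id: string;
  status: JobStatus;
  result?: T;
}

class JobQueue {
  private jobs = new Map<string, Job<unknown>>();
  private nextId = 0;

  // Start the work asynchronously and return a job id immediately.
  enqueue<T>(work: () => Promise<T>): string {
    const id = `job-${this.nextId++}`;
    const job: Job<T> = { id, status: "running" };
    this.jobs.set(id, job);
    work()
      .then((result) => { job.status = "done"; job.result = result; })
      .catch(() => { job.status = "failed"; });
    return id;
  }

  // Clients poll by id until the status is "done" or "failed".
  poll(id: string): Job<unknown> | undefined {
    return this.jobs.get(id);
  }
}
```

The point is the shape of the API: the caller gets an id back instantly and the UI stays responsive while the render finishes.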
What I learned
The real driver of fashion is imitation, not just information. I realized that a high-fidelity try-on is technically impressive, but it’s often secondary to seeing the item styled in the wild. If you only show the user a 3D model of themselves, they see the fit, but they don't see the vibe.
The product only clicked when I combined the "Worn by" social proof with the personal try-on. "She wore it" provides the aspiration and proof of style, while "it looks like this on me" provides the final permission to buy. True confidence comes from seeing that an outfit works for someone you admire and seeing that it doesn't lose its magic when it's rendered on your own body. Integrating both onto the product page bridges that gap between a celebrity's feed and your own closet.
Challenges
Getting real worn images at scale is messy: rights, duplicates, junk images, and slow third-party search. We filter, cache, and fail soft so the strip stays honest when it has little to show. Keeping reference photos and consent tight is non-negotiable for a feature that touches bodies and faces.
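The filter-and-fail-soft step might look something like the TypeScript sketch below. The field names and thresholds are hypothetical, not Mirror's actual schema; the point is that unlicensed, low-quality, low-confidence, and duplicate candidates are dropped, and an empty strip is an acceptable result:

```typescript
// Hypothetical candidate shape for a raw search result.
interface WornCandidate {
  url: string;
  width: number;
  height: number;
  hasLicense: boolean; // rights cleared for display
  matchScore: number;  // 0..1 confidence it shows this exact product
}

function buildWornStrip(
  candidates: WornCandidate[],
  minScore = 0.8,
  maxItems = 8,
): WornCandidate[] {
  const seen = new Set<string>();
  const kept: WornCandidate[] = [];
  for (const c of candidates) {
    if (!c.hasLicense) continue;                   // rights: non-negotiable
    if (c.width < 300 || c.height < 300) continue; // junk / thumbnails
    if (c.matchScore < minScore) continue;         // likely wrong product
    if (seen.has(c.url)) continue;                 // duplicates
    seen.add(c.url);
    kept.push(c);
    if (kept.length === maxItems) break;
  }
  return kept; // possibly empty: the strip stays honest
}
```

Failing soft here just means returning fewer (or zero) items rather than padding the gallery with low-confidence images.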
Built With
- apify
- fashn
- fastapi
- gemini
- railway
- serpapi
- typescript