Inspiration
Back in January, I had a long conversation with Sophia about how Phia could pull user content into product detail pages and handle SKU (stock keeping unit) unification while also becoming a better personalized shopping tool. We were talking about TikTok and TikTok Shop, which aren't readily available to integrate, but I still wanted to build something in the same realm. Pinterest is where taste lives, but it's a dead end: you save outfits, and nothing happens. I wanted to build the bridge from inspiration to checkout.
What it does
PhiaPinet connects your Pinterest account, runs AI vision analysis on every pin to extract garments, colors, and aesthetic tags, then builds a structured style profile from content you already curated. It uses that profile to find shoppable products that match your taste, size, and budget. Each product page shows matching pins from your boards, connecting your inspiration directly to things you can buy.
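To make the "structured style profile" idea concrete, here is a minimal sketch of how per-pin aesthetic tags could be rolled up into a percentage breakdown. The function name, tag names, and input shape are illustrative assumptions, not PhiaPinet's actual code.

```python
from collections import Counter

def build_style_profile(pin_tags: list[list[str]]) -> dict[str, float]:
    """Aggregate per-pin aesthetic tags into a percentage breakdown.

    pin_tags: one list of tags per analyzed pin, e.g. ["minimalist", "monochrome"].
    Returns a tag -> percentage mapping, sorted by share (descending).
    """
    counts = Counter(tag for tags in pin_tags for tag in tags)
    total = sum(counts.values())
    if total == 0:
        return {}
    return {tag: round(100 * n / total, 1) for tag, n in counts.most_common()}

# Example: three analyzed pins
pins = [
    ["avant-garde", "monochrome"],
    ["minimalist"],
    ["avant-garde", "streetwear"],
]
print(build_style_profile(pins))
# → {'avant-garde': 40.0, 'monochrome': 20.0, 'minimalist': 20.0, 'streetwear': 20.0}
```

A breakdown like this is what powers statements such as "you're 29% avant-garde and 16% minimalist."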
How we built it
React Native with Expo for the mobile app, taking advantage of the native Liquid Glass UI. Convex for the real-time backend and database. Pinterest API v5 handles board and pin ingestion. For the vision pipeline, I built a two-stage system: first, a GCP Cloud Run function running OpenCV processes every pin image to extract colors, patterns, and style features, backed by a publicly hosted Kaggle dataset that anyone can rebuild. Once OpenCV returns structured JSON, I pass it to GPT-4o mini's vision model, which keeps things cost-efficient and fast while still delivering high-quality garment classification. Product recommendations come from an AI agent that searches the web based on your style profile. The whole pipeline runs automatically once you connect Pinterest.
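The first stage's job is to turn raw pixels into structured JSON that the GPT-4o mini stage can consume. Here is a stdlib-only sketch of the color-extraction part of that idea (the real system runs OpenCV on Cloud Run; the bucket size and output schema here are assumptions for illustration):

```python
import json
from collections import Counter

def dominant_colors(pixels: list[tuple[int, int, int]], k: int = 3) -> str:
    """Return the top-k dominant colors of an image as structured JSON.

    pixels: flat list of (r, g, b) tuples (in the real pipeline these come
    from the decoded pin image). Channels are quantized into 32-wide buckets
    so near-identical shades count together.
    """
    def bucket(c):  # snap each channel to the center of its 32-wide bucket
        return tuple((v // 32) * 32 + 16 for v in c)

    counts = Counter(bucket(p) for p in pixels)
    total = len(pixels)
    top = [
        {"rgb": list(color), "share": round(n / total, 2)}
        for color, n in counts.most_common(k)
    ]
    return json.dumps({"dominant_colors": top})

# A tiny synthetic "image": mostly near-black pixels plus some red
pixels = [(5, 5, 5)] * 6 + [(250, 10, 10)] * 2
print(dominant_colors(pixels))
```

Emitting a fixed schema like this is what lets the second stage treat vision output as data rather than free text.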
Challenges we ran into
Pinterest's Trial API has a 1,000-call/day limit, so I had to be strategic about batching pin fetches and capping boards per user. The pipeline was originally very slow, and the OpenCV embedding stage needed significant optimization. Getting the OpenCV stage to return consistent structured JSON for garment extraction was particularly difficult and took a lot of iteration.
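The batching-and-capping strategy can be sketched as a simple call budgeter. The caps and page size below are hypothetical values, not the app's real configuration; only the 1,000-call/day quota comes from the write-up above.

```python
import math

DAILY_CALL_LIMIT = 1000   # Pinterest Trial API quota (from the limit above)
MAX_BOARDS_PER_USER = 5   # hypothetical per-user cap
PINS_PER_PAGE = 100       # assumed max page size when listing pins

def plan_ingestion(users: list[dict]) -> list[dict]:
    """Estimate API calls per user and trim the queue to fit today's budget.

    Each user dict: {"id": ..., "boards": [pin_count, ...]}.
    One call lists a user's boards, then one call per page of pins per board.
    """
    plan, budget = [], DAILY_CALL_LIMIT
    for user in users:
        boards = user["boards"][:MAX_BOARDS_PER_USER]
        calls = 1 + sum(max(1, math.ceil(n / PINS_PER_PAGE)) for n in boards)
        if calls > budget:
            break  # defer remaining users to tomorrow's quota window
        budget -= calls
        plan.append({"id": user["id"], "boards": len(boards), "calls": calls})
    return plan
```

Planning calls up front like this keeps a burst of new signups from burning the whole day's quota on one user's giant boards.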
Accomplishments that we're proud of
The style DNA breakdown is genuinely accurate: when it says you're 29% avant-garde and 16% minimalist, you can see it reflected in the actual pins. The onboarding flow got the strongest feedback at the hackathon. People consistently responded to the idea that you connect your Pinterest and it tells you which brands you'd actually enjoy. Multiple people told me this solves a real problem they have with fashion apps, including Phia's own onboarding, where a wall of brand names appears as plain text with no visual examples and it's overwhelming. PhiaPinet skips that entirely by learning your taste from what you've already saved.
What we learned
Pinterest pins carry way more signal than expected. The source link field alone gives you brand and often exact product info. Combined with vision analysis, you can build a surprisingly rich taste profile without asking the user a single question about their preferences. The biggest insight was that the best onboarding is no onboarding. Your Pinterest boards are your style quiz, already filled out.
What's next for PhiaPinet
Community-powered product pages where everyone's pins show up as social proof on PDPs. Live style matching so users can discover people with similar taste. TikTok integration for video-based style analysis. And deeper SKU unification against Phia's actual 350M product catalog. I also think there's a real opportunity to rethink Phia's app onboarding using this approach, even though the extension remains the core product.
Built With
- convex
- expo.io
- gcp
- opencv
- react-native