Inspiration

We previously built a smart mirror project at TreeHacks focused more broadly on clothing, but through that process we realized something important: people do not just struggle with choosing outfits — they struggle even more with understanding their own style identity.

A lot of users do not know whether gold or silver looks better on them, which clothing shades complement their skin tone, or why certain accessories feel “right” while others do not. Yet this is something people genuinely care about and are willing to spend a lot of money to learn — for example, many people pay hundreds of dollars for personal color analysis consultations, including specialized services in places like Korea.

That insight led us to PHIRA. We wanted to build a system that could make this kind of fashion intelligence accessible, personalized, and effortless. Our vision is to gamify shopping and turn styling into something interactive: an AI agent that understands your physical features, what you already own, and what would actually flatter you. We chose the mirror as the medium because it is already the most natural interface for this behavior. People use mirrors every day to evaluate how they look, so we wanted to bring intelligence directly into that familiar ritual.

Why it’s unique

What makes PHIRA unique is that it sits at the intersection of fashion, biology, memory, and interface.

First, it uses actual visual signals from the body — skin tone, wrist vein hue, contrast, and facial features — to infer what flatters a user. Most people do not have access to this kind of insight unless they pay for a professional consultation. We wanted to make that intelligence instant and interactive.

Second, it does not treat style as a one-time quiz. PHIRA also looks at what the user already owns through Gmail purchase history, which gives it memory and context. That means the system can move beyond generic recommendations and start acting more like a personal stylist that knows your existing wardrobe and taste.

Third, we deliberately chose the mirror as the interface. Shopping and personal styling are visual, embodied behaviors. The mirror is already where users naturally evaluate themselves, so instead of forcing users into a traditional shopping app flow, we built an agent that meets them in a behavior they already do every day.

Ultimately, our goal is to make shopping feel less like search and more like play: personalized, visual, conversational, and effortless.

What it does

PHIRA is an AI-powered smart mirror and personal styling agent focused on accessories. It helps users discover what suits them based on both biological features and personal taste.

When a user stands in front of PHIRA, it performs a color analysis using their face and inner wrist to understand undertones, contrast, and flattering palettes. From that, it identifies which metals, tones, and accessory colors are most complementary. At the same time, PHIRA can connect to the user’s Gmail purchase history to understand what they already own, what brands they gravitate toward, and what style patterns they show over time.
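One common heuristic behind this kind of analysis is that greenish inner-wrist veins suggest warm undertones (which gold tends to flatter) while bluish veins suggest cool undertones (which silver tends to flatter). The sketch below illustrates that idea only; the function names, thresholds, and mapping are hypothetical and not PHIRA's actual model.

```typescript
// Hypothetical sketch: classify undertone from the average color of
// sampled inner-wrist vein pixels. Greenish veins → warm, bluish → cool.
type RGB = { r: number; g: number; b: number };

function classifyUndertone(veinColor: RGB): "warm" | "cool" | "neutral" {
  const diff = veinColor.g - veinColor.b; // green vs. blue dominance
  if (diff > 10) return "warm";
  if (diff < -10) return "cool";
  return "neutral";
}

function recommendedMetal(undertone: string): string {
  if (undertone === "warm") return "gold";
  if (undertone === "cool") return "silver";
  return "either"; // neutral undertones suit both metals
}

// Example: a slightly green-tinted vein sample reads as warm
console.log(recommendedMetal(classifyUndertone({ r: 120, g: 110, b: 90 })));
```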

This makes PHIRA different from a generic recommender. Rather than just suggesting “popular” items, it combines biological signals, visual analysis, and behavioral data to create recommendations that feel highly personal. It then presents these recommendations through real-time virtual try-on, voice narration, and a mirror-based interface that feels natural and low-friction.

How we built it

We built PHIRA as an AI-powered smart mirror focused on accessory styling and personalized shopping. On the frontend, we used Next.js, React, and Tailwind CSS to create the mirror interface and overall user experience. For real-time body and face tracking, we used MediaPipe to anchor accessories directly onto users during virtual try-on. We used Supabase for product storage, metadata, and retrieval, and Vercel for deployment.

Our recommendation system combines two key signals. First, we run color analysis on the user using face and inner wrist images to infer undertones, contrast, and flattering color families. Second, we connect to the user’s Gmail through OAuth and analyze purchase history to understand what they already own, what brands they like, and their broader style patterns. By combining biological signals with purchase behavior, PHIRA acts less like a generic shopping tool and more like a personal stylist.
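The purchase-history signal can be sketched as a simple brand-frequency tally over order-confirmation subject lines retrieved via the Gmail API. The brand list, subjects, and function below are illustrative assumptions, not PHIRA's real parsing logic.

```typescript
// Hypothetical sketch: count brand mentions across order-confirmation
// subject lines to estimate which brands a user gravitates toward.
const KNOWN_BRANDS = ["Mejuri", "Pandora", "Swarovski"];

function brandAffinity(subjects: string[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const subject of subjects) {
    for (const brand of KNOWN_BRANDS) {
      if (subject.toLowerCase().includes(brand.toLowerCase())) {
        counts.set(brand, (counts.get(brand) ?? 0) + 1);
      }
    }
  }
  return counts;
}

// Example subjects (placeholders): Mejuri appears most often
const subjects = [
  "Your Mejuri order has shipped",
  "Order confirmation - Mejuri",
  "Thanks for your Pandora purchase",
];
console.log(brandAffinity(subjects));
```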

To make recommendations fast enough for a live demo, we preprocessed and tagged accessory images ahead of time, standardized them for cleaner overlays, and cached ranked results so the system could quickly serve the best matching items without recomputing every recommendation from scratch.
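The caching idea above can be sketched as a map from a user-profile key to a ranked item list, so the expensive ranking pass runs at most once per profile. The key format and types here are hypothetical.

```typescript
// Illustrative result cache: ranked recommendations are computed once per
// profile key and reused, so a live demo never re-ranks from scratch.
type Item = { id: string; score: number };

const rankCache = new Map<string, Item[]>();

function rankedItems(
  profileKey: string,
  rank: () => Item[], // expensive ranking pass, run at most once per key
): Item[] {
  const cached = rankCache.get(profileKey);
  if (cached) return cached;
  const ranked = rank().sort((a, b) => b.score - a.score);
  rankCache.set(profileKey, ranked);
  return ranked;
}

// First call for a profile ranks; later calls are cache hits.
const items = rankedItems("warm:minimalist", () => [
  { id: "gold-hoops", score: 0.9 },
  { id: "silver-studs", score: 0.4 },
]);
```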

Challenges we ran into

One challenge was Apple Wallet integration. At first, we did not have an Apple Developer account, and enrollment requires a paid account and takes time to approve. Because of that, we initially built a Gmail-based delivery flow so users could still receive their personalized results. Once we gained access, we were able to implement Apple Wallet as well.

Another challenge was realizing how constrained Apple Wallet UI actually is. We originally imagined a more visually rich experience, but Apple Wallet only supports a few structured text regions and very limited layout flexibility. We had to simplify the pass design significantly and decide which information mattered most to preserve a clean user experience.
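A generic Wallet pass only renders a few structured field arrays (primary, secondary, auxiliary, and back fields), each a flat key/label/value triple. The simplified payload below, written as a TypeScript object for illustration, shows roughly what that constraint looks like; all identifiers and values are placeholders, not our actual pass.

```typescript
// Illustrative pass.json payload for a generic Apple Wallet pass.
// PassKit exposes only a handful of structured text regions, which is
// why the design had to be simplified. Identifiers are placeholders.
const pass = {
  formatVersion: 1,
  passTypeIdentifier: "pass.com.example.phira", // placeholder
  serialNumber: "demo-001",
  teamIdentifier: "XXXXXXXXXX", // placeholder Apple team ID
  organizationName: "PHIRA",
  description: "Personal color analysis results",
  generic: {
    primaryFields: [{ key: "season", label: "PALETTE", value: "Warm Autumn" }],
    secondaryFields: [
      { key: "metal", label: "METAL", value: "Gold" },
      { key: "contrast", label: "CONTRAST", value: "Low" },
    ],
    backFields: [
      { key: "notes", label: "Details", value: "Full palette and styling tips" },
    ],
  },
};
console.log(JSON.stringify(pass, null, 2));
```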

We also ran into issues with Gmail OAuth. Initially, the integration only worked on our own accounts because the app was configured around our developer credentials. Since we wanted anyone at the demo to scan a QR code and try the product themselves, we had to properly deploy the app, configure the OAuth flow for external users, and understand how Google test users, consent screens, and publishing worked across different accounts.

A separate challenge was standardizing fashion product images. Publicly available accessory images vary a lot in angle, crop, and background, which makes virtual try-on difficult. We solved this by cleaning and standardizing the images through background removal and regeneration workflows so they would work much better with our MediaPipe overlay system.
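One concrete step in that kind of standardization can be sketched as cropping each background-removed image to the tight bounding box of its non-transparent pixels, so every accessory is framed consistently for the overlay. The function below is an illustrative assumption, operating on a toy alpha mask rather than our real pipeline.

```typescript
// Hypothetical standardization step: find the tight bounding box of the
// non-transparent pixels in an alpha mask (rows of 0–255 alpha values).
type Box = { top: number; left: number; bottom: number; right: number };

function contentBox(alpha: number[][], threshold = 8): Box | null {
  let top = Infinity, left = Infinity, bottom = -1, right = -1;
  for (let y = 0; y < alpha.length; y++) {
    for (let x = 0; x < alpha[y].length; x++) {
      if (alpha[y][x] > threshold) {
        top = Math.min(top, y);
        left = Math.min(left, x);
        bottom = Math.max(bottom, y);
        right = Math.max(right, x);
      }
    }
  }
  return bottom < 0 ? null : { top, left, bottom, right };
}

// A 4x4 mask with content in the center 2x2 region
const mask = [
  [0, 0, 0, 0],
  [0, 255, 255, 0],
  [0, 255, 255, 0],
  [0, 0, 0, 0],
];
console.log(contentBox(mask));
```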

Finally, accessories are much harder to map than clothing. Jewelry and smaller items require much more precise landmark placement because facial and body anchor regions are smaller and more sensitive to variation. We improved this by grouping items into clusters, refining our mapping logic, and tuning how the system handled different shapes, sizes, and placements.
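The precision problem can be sketched with a toy anchoring function: place an earring just below an earlobe landmark, scaling its offset and size by inter-eye distance so the overlay tracks how close the user stands to the mirror. The landmark names, reference distance, and offsets below are hypothetical, not our tuned values.

```typescript
// Hypothetical anchoring sketch: position and scale an earring overlay
// from normalized face landmarks (coordinates in [0, 1]).
type Pt = { x: number; y: number };

function earringAnchor(earlobe: Pt, leftEye: Pt, rightEye: Pt) {
  const eyeDist = Math.hypot(rightEye.x - leftEye.x, rightEye.y - leftEye.y);
  const scale = eyeDist / 0.12; // 0.12 ≈ assumed eye distance at reference depth
  return {
    x: earlobe.x,
    y: earlobe.y + 0.02 * scale, // hang slightly below the lobe
    scale,                       // multiply the sprite size by this factor
  };
}
```

Because the lobe region is tiny, even small landmark jitter moves the anchor visibly, which is why accessories needed much more tuning than clothing overlays.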

Accomplishments that we're proud of

We are proud that PHIRA goes beyond generic fashion recommendations and introduces a genuinely new interaction model for shopping. Instead of asking users to search through products manually, PHIRA uses a mirror, which is already a natural part of how people get ready, and adds intelligence directly into that experience.

We are also proud that we combined biological analysis and personal purchase history into one recommendation pipeline. That makes the output feel much more personalized and useful than either color analysis alone or shopping history alone.

Finally, we are proud that we got the system working end to end in a live demo setting: real-time color analysis, Gmail-based personalization, virtual try-on, recommendation retrieval, voice output, and post-session delivery.

What we learned

We learned that personalization becomes much stronger when multiple forms of context are combined. Biological signals tell us what flatters a user, while Gmail purchase history tells us what they actually like and already own. Together, those two layers create recommendations that feel much more believable and personal.

We also learned that platform constraints matter a lot. Apple Wallet, OAuth flows, and real-time computer vision all introduced practical limitations that forced us to simplify, prioritize, and design around real deployment constraints rather than ideal ones.

Most importantly, we learned that shopping can feel much more engaging when it is interactive and embodied. Using a mirror as the interface made the experience feel much more intuitive than a standard e-commerce flow.

What's next for Phira

Next, we want to make PHIRA feel even more like a true personal styling agent. That means improving virtual try-on accuracy across more accessory categories, expanding the product database, and making recommendations even more adaptive over time.

We also want to deepen the personalization layer by better understanding what users already own, what they repeatedly gravitate toward, and how their preferences evolve. Longer term, we see PHIRA as a way to gamify shopping and turn it into a personalized, everyday ritual rather than a static browsing experience.

Built With

  • Apple PassKit
  • Claude API (Anthropic)
  • ElevenLabs
  • Gmail
  • GPT-4o Vision
  • Hono
  • MCP (Model Context Protocol)
  • MediaPipe (FaceMesh + Pose)
  • Next.js
  • Node.js
  • OpenAI API
  • Playwright
  • React
  • Resend
  • Supabase (PostgreSQL + Storage)
  • Tailwind CSS
  • TypeScript