Inspiration

Online shopping is often a game of "Guess the Fit." We see clothing we love on professional models, but we have no idea how it will drape on our unique bodies. This leads to massive waste, high return rates, and a disconnected shopping experience. I wanted to build a platform that doesn't just show you clothes, but lets you feel the transformation through an immersive, high-fidelity lens.

What it does

Silhou. is a virtual dressing room. It lets users synthesize a digital twin (avatar), curate a wardrobe, and use neural diffusion to see realistic outfit try-ons instantly. Users can save avatars and wardrobes to mix and match, bookmark outfits they love to revisit later, and see price tags on each item so they can weigh cost while finding the outfit that works best for them.

How I built it

Frontend: React 19 + Vite
Backend: FastAPI, Fashn Neural Try-On API
Infrastructure: Supabase
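For a sense of how the pieces connect, here is a minimal sketch of the request-building step on the FastAPI side. The endpoint URL, field names (`model_image`, `garment_image`, `category`), and category values are illustrative assumptions, not the real Fashn schema:

```python
import json

# Assumed URL shape for illustration only -- check the Fashn docs.
FASHN_ENDPOINT = "https://api.fashn.ai/..."

def build_tryon_payload(avatar_url: str, garment_url: str,
                        category: str = "tops") -> dict:
    """Validate inputs and build the JSON body for one try-on request.

    Field names and category values here are assumptions for the sketch.
    """
    if category not in {"tops", "bottoms", "one-pieces"}:
        raise ValueError(f"unsupported garment category: {category}")
    return {
        "model_image": avatar_url,     # the user's saved avatar image
        "garment_image": garment_url,  # the wardrobe item to try on
        "category": category,
    }

# In the actual route, this payload would be POSTed with an HTTP client
# (e.g. httpx) using an API key, and the job polled until the rendered
# try-on image is ready to return to the React frontend.
payload = build_tryon_payload("https://example.com/avatar.png",
                              "https://example.com/shirt.png")
print(json.dumps(payload))
```

Keeping this call on the backend also keeps the API key out of the browser; the frontend only ever talks to the FastAPI routes.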

Challenges I ran into

My main challenge was inconsistent results: finding a pipeline that was both efficient and prevented hallucination. I also had to test many different diffusion models, and most did not handle multiple garments or varied angles well.

What I learned

I gained a better understanding of diffusion models and how they are applied, especially Virtual Try-On (VTON) models that specialize in fitting clothing. This is also the second full-stack application I have built alone, and I learned a lot across the stack.

What's next for silhou.

The future of Silhou. is a full 3D avatar: one that users can interact with in a virtual space to get a true grasp of how an outfit is supposed to look on them. The goal is to move beyond 2D avatars, and eventually to add a physics engine as well.

Built With

fastapi, react, supabase, vite
