Retrieval, Try-On, and Chat demo videos are available here: https://youtu.be/h3KNm2pQpDM, https://youtu.be/StoaA273B28.

🧠 Inspiration

The idea for IndiSearch came from a common frustration: seeing a great outfit on the street or on social media and having no idea where to buy it. We wanted to bridge the gap between fashion inspiration and action, making it easy for anyone to turn a photo into a shopping experience: finding similar clothes and trying them on online. By focusing on Inditex brands, we ensured users would have access to stylish, affordable options they could trust.

🛠️ How We Built It

We built IndiSearch using a combination of:

  • Frontend: Streamlit
  • Backend: Python and InditexTech APIs
  • AI Model: a deep learning model, built on transformer models from Hugging Face, that detects and crops the individual garments in the image. Retrieval of similar items is then handled by the InditexTech Visual Search API. (put link to model)
  • Virtual Try-On: an integrated model that combines pose estimation, clothing segmentation, and person segmentation to warp garments onto a person.
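As a rough illustration of the detect-and-crop step, here is a minimal sketch. In the app the bounding boxes come from a Hugging Face object-detection pipeline; the helper below and its hand-written detection dicts are hypothetical, simply mirroring the typical pipeline output format (`score`, `label`, and a `box` with pixel coordinates):

```python
from PIL import Image

def crop_detected_garments(image, detections, min_score=0.8):
    """Crop each detected garment from the source image.

    `detections` mirrors the typical Hugging Face object-detection
    pipeline output: a list of dicts with "score", "label", and a
    "box" dict holding xmin/ymin/xmax/ymax pixel coordinates.
    """
    crops = []
    for det in detections:
        if det["score"] < min_score:
            continue  # skip low-confidence detections
        box = det["box"]
        crop = image.crop((box["xmin"], box["ymin"], box["xmax"], box["ymax"]))
        crops.append((det["label"], crop))
    return crops

# Example with a dummy image and hand-written detections (for illustration;
# in the app these come from the transformer model):
image = Image.new("RGB", (400, 600), "white")
detections = [
    {"score": 0.95, "label": "shirt", "box": {"xmin": 50, "ymin": 40, "xmax": 350, "ymax": 300}},
    {"score": 0.40, "label": "bag", "box": {"xmin": 0, "ymin": 0, "xmax": 30, "ymax": 30}},
]
crops = crop_detected_garments(image, detections)
print([(label, crop.size) for label, crop in crops])  # [('shirt', (300, 260))]
```

Each crop can then be sent to the Visual Search API on its own, so one photo yields results for every garment it contains.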

📚 What We Learned

  • Integrating different types of generative models into a single application.
  • Building and shipping a full-stack project.
  • Working with external APIs.

⚠️ Challenges We Faced

  • The retrieval model is a black box, which made unexpected results hard to debug.
  • Web scraping difficulties: accessing the Zara website and retrieving product images without being flagged as a bot.
  • The sheer number of possible approaches to make it all work!
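On the bot-flagging point above, one common mitigation is to send browser-like headers with each image request. A minimal standard-library sketch (the header values and the example URL are illustrative assumptions, not the exact ones we used):

```python
import urllib.request

# Browser-like headers (illustrative values) so image requests are less
# likely to be rejected as automated traffic.
BROWSER_HEADERS = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) "
        "Chrome/120.0.0.0 Safari/537.36"
    ),
    "Accept": "image/avif,image/webp,image/*,*/*;q=0.8",
    "Accept-Language": "en-US,en;q=0.9",
    "Referer": "https://www.zara.com/",  # mimic in-site navigation
}

def build_image_request(url: str) -> urllib.request.Request:
    """Wrap an image URL in a Request carrying browser-like headers."""
    return urllib.request.Request(url, headers=BROWSER_HEADERS)

# Example (no network call is made until urlopen is invoked):
req = build_image_request("https://static.zara.net/example-image.jpg")
print(req.get_header("User-agent"))
```

Rate limiting between requests and respecting the site's terms of service matter just as much as the headers themselves.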

🚀 What's Next for IndiSearch

  • Integrate a user space where each search updates the user's preferences, enabling more personalized suggestions.
  • Video try-on.

Built With

  • ai
  • api
  • generative
  • python
  • streamlit