Inspiration

I see a lot of items I like on Google Images or Google Shopping, but I don't know how to take that search further based only on the items I liked. Language is limited: you can't fully describe every feature of an image in words and then compare. With the power of CNNs, we can compare on another level.

What it does

The user queries a term, such as "blue jacket", and gets an initial set of results based only on language. The user then selects however many images look close to what they want and searches again. Then the magic happens: new results populate based on image similarity.
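The refine step above could be sketched as follows: average the embeddings of the user's selected images into a single visual query, then re-rank the catalog by cosine similarity. This is a minimal illustration with random vectors standing in for CNN embeddings; the array shapes and the averaging strategy are assumptions, not the project's exact code.

```python
import numpy as np

# Random stand-ins for CNN feature vectors: one 2048-dim embedding per
# scraped listing. In the real pipeline these come from the network.
rng = np.random.default_rng(0)
catalog = rng.normal(size=(2000, 2048))

# Hypothetical: indices of the images the user clicked as "close enough".
selected = catalog[[3, 17, 42]]

# Combine the selections into one query embedding and rank by cosine similarity.
query = selected.mean(axis=0)
q = query / np.linalg.norm(query)
c = catalog / np.linalg.norm(catalog, axis=1, keepdims=True)
ranking = np.argsort(-(c @ q))  # most similar listings first

print(ranking[:5])
```

Averaging is just one way to fuse multiple selections; a max or weighted combination would slot into the same re-ranking step.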

How we built it

We used TensorFlow and Keras with a ResNet architecture, extracting features from the layer just before the softmax. We transfer-learned the model on a subset of the DeepFashion dataset. We scraped Google Shopping for 2,000 listings (titles, links, and images), using permutations of colors and articles of clothing as search terms. After the images have been passed through the network, we compute cosine similarity between their feature vectors.
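The pipeline above can be sketched with Keras: chop ResNet50 off before the classification head (global-average pooling gives the pre-softmax features), embed images, and rank by cosine similarity. This is a toy sketch, not the project's code: `weights=None` keeps it self-contained, whereas the project fine-tuned on DeepFashion, and the input sizes here are assumptions.

```python
import numpy as np
import tensorflow as tf

# ResNet50 without the softmax head; pooling="avg" yields a 2048-dim
# feature vector per image (the activations one layer before softmax).
# weights=None avoids a download; in practice you would load fine-tuned weights.
extractor = tf.keras.applications.ResNet50(
    weights=None, include_top=False, pooling="avg", input_shape=(224, 224, 3)
)

def embed(images):
    """images: float array (n, 224, 224, 3) -> feature array (n, 2048)."""
    x = tf.keras.applications.resnet50.preprocess_input(images)
    return extractor.predict(x, verbose=0)

def cosine_rank(query_vec, catalog_vecs):
    """Return catalog indices sorted by cosine similarity to query_vec."""
    q = query_vec / np.linalg.norm(query_vec)
    c = catalog_vecs / np.linalg.norm(catalog_vecs, axis=1, keepdims=True)
    return np.argsort(-(c @ q))

# Toy usage: random pixel data standing in for scraped listing images.
imgs = (np.random.rand(3, 224, 224, 3) * 255).astype("float32")
feats = embed(imgs)
order = cosine_rank(feats[0], feats)
print(order[0])  # the query image itself ranks first
```

Comparing in this 2048-dimensional feature space, rather than pixel space, is what lets "visually similar" mean similar in texture, shape, and style.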

Challenges we ran into

Neither of us knows front-end development. We also used Google Cloud for the first time, and we ran into a lot of obscure errors we had never seen before.

Accomplishments that we're proud of

We built our own deep learning tool instead of relying on off-the-shelf machine learning APIs. Our tool was based on intuition, such as using the CNN's fully-connected layer for image features. We experimented with tools we hadn't used before, like Google Cloud, and we are proud to have a working solution with a team of just 2 + Matt, who wrote our frontend in 30 minutes.

What we learned

Learn front-end for the next hackathon

What's next for Clothes Similarity

Update the similarity function to incorporate NLP
Make a prettier front-end
