Inspiration

Fashion is such a personal part of everyone’s identity. It reflects culture and personality, and tells YOUR story. But sometimes we are uncertain about our own sense of fashion, or we want a little inspiration to change things up. FitCheck helps in these times of uncertainty, giving users both fashion inspiration and shopping advice!

What it does

To start, users are given a set of outfits to swipe through, Tinder-style. Based on the outfits the user initially liked, the app recommends outfits and products to them. They can even browse their saved outfits on pin boards! With every iteration of swipes, users receive a new set of outfits tailored to their tastes, encouraging continued engagement.

How we built it

We trained a model on the Pinterest Fashion Compatibility Dataset, found on Kaggle and accessed through an API. The model takes an outfit image as input and outputs a recommended product that suits it, or an item that could be added to the fit. To evaluate the effectiveness of our recommendations, we planned to implement the Amplitude API to measure user interaction, feeding that data back into the loop to further improve the recommendations. Our front-end was designed in Figma, mimicking the Tinder app's swiping UI and Pinterest's pinned-boards UI.
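At a high level, the recommendation step can be thought of as retrieval: embed the outfit image, then return the catalogue item whose embedding is most similar. The sketch below illustrates that idea with cosine similarity over toy feature vectors; the names (`recommend`, the catalogue entries) and the vectors are illustrative assumptions, not our actual model, whose embeddings come from the trained network.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(outfit_vec, catalogue):
    """Return the catalogue item whose embedding best matches the outfit.

    `catalogue` maps item names to feature vectors. In the real app these
    vectors would come from the trained model's image encoder.
    """
    return max(catalogue, key=lambda name: cosine_similarity(outfit_vec, catalogue[name]))

# Toy feature vectors standing in for learned embeddings.
catalogue = {
    "white sneakers": np.array([0.9, 0.1, 0.0]),
    "black boots":    np.array([0.1, 0.9, 0.2]),
}
outfit = np.array([0.8, 0.2, 0.1])  # e.g. a mostly-white outfit
print(recommend(outfit, catalogue))  # → white sneakers
```

Retrieval by embedding similarity is one common way to frame compatibility recommendation; the actual model may instead score outfit-item pairs directly.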

Challenges we ran into

Due to unforeseen circumstances, some members had to leave the hackathon early or drop out entirely, which drastically changed our team dynamics in the limited time we had to work on the project. Connecting the front and back ends proved to be our greatest challenge, as our team members each had expertise in either front-end or back-end development, but not much experience connecting the two.

Accomplishments that we're proud of

As beginners to hackathons and to building software projects in general, we successfully trained the outfit recommendation model on 100 training images over 10 epochs (the small dataset and epoch count were due to memory, GPU, and time constraints). Although our model is still fairly inaccurate, in our test case it was able to recommend white items for a white-coloured outfit, which we consider a personal success.
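To give a feel for what training at this scale looks like, here is a toy stand-in mirroring the setup above: 100 samples, 10 epochs. Each "image" is reduced to a hypothetical mean-centred 3-value colour feature, and a tiny logistic regression learns whether the outfit is overall light-coloured. This is only an illustration of the training loop, not our actual model or data.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.uniform(-0.5, 0.5, size=(100, 3))  # 100 samples: mean-centred RGB features
y = (X.sum(axis=1) > 0).astype(float)      # label: 1 if overall lighter than average

w = np.zeros(3)  # weights for the three colour channels
b = 0.0
lr = 0.5

for epoch in range(10):                    # 10 epochs, as in our run
    p = 1 / (1 + np.exp(-(X @ w + b)))     # sigmoid predictions
    grad_w = X.T @ (p - y) / len(y)        # gradient of the log-loss w.r.t. weights
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

p = 1 / (1 + np.exp(-(X @ w + b)))
accuracy = np.mean((p > 0.5) == (y == 1))
print(f"training accuracy after 10 epochs: {accuracy:.2f}")
```

Even on this toy task, 10 epochs on 100 samples is enough to beat chance but leaves plenty of room for improvement, much like our real model.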

What we learned

We learned how to properly train a model, and that training for more epochs does not guarantee better accuracy. Some members were also exposed to new Python libraries such as matplotlib, exploring how its methods can be used in code to further extend what Python can do.
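The lesson that more training does not guarantee better accuracy can be illustrated with a classic toy example (not our FitCheck model): fitting polynomials of increasing capacity to a few noisy points. Training error keeps shrinking, but error on held-out points can grow, the same overfitting effect that makes "more epochs" unreliable as a recipe for accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 8)
# Noisy samples of a sine wave: our tiny "training set".
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.1, x_train.size)
# Held-out points from the true (noise-free) function.
x_test = np.linspace(0.05, 0.95, 8)
y_test = np.sin(2 * np.pi * x_test)

train_errs = []
for degree in (1, 3, 7):
    coeffs = np.polyfit(x_train, y_train, degree)   # fit polynomial of this degree
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    train_errs.append(train_mse)
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
```

Training error drops monotonically as capacity grows (the degree-7 polynomial can pass through all 8 points), while held-out error tells a different story. Plotting these two curves with matplotlib is a natural next exercise.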

What's next for FitCheck

We would like to simplify and rewrite some of the AI-generated code for maximum clarity and better processing time and memory usage, and train the model on a larger set of images (closer to 5,000 instead of 100) for better accuracy. We also plan to fully integrate the Amplitude agent to measure the success of our model, and to learn how to take a front-end template and replace its core components. Finally, finding a higher-quality open-source dataset of outfit inspiration photos would let users continuously swipe through a more diverse range of outfits.

Built With
