We were disappointed, and somewhat frustrated, that when we saw a cool style out on the street, we had no way to find that outfit online without hours of searching. To remedy this inconvenience, we decided to make a fashion assistant that does just that: redirect people to the site where the outfit can be found with ease.

What it does

It uses the device's camera to detect the outfit a person is wearing. The app then redirects, within the app, to the webpage where the outfit can be purchased.

How I built it

We built the iOS app in Swift using Xcode. Alongside the Swift code, we used CoreML and Vision to detect the outfit and to distinguish between certain brands of clothing, and Firebase to store the outfit data in an online database where it can be accessed at any time.
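The detection step can be sketched roughly as follows. This is a minimal illustration, not our exact code: `OutfitClassifier` is a placeholder name for an Xcode-generated CoreML model class, and the real project's model and labels may differ.

```swift
import UIKit
import Vision
import CoreML

// Sketch: classify an outfit photo with Vision + CoreML.
// `OutfitClassifier` is a hypothetical class generated by Xcode
// from a .mlmodel file; substitute the actual model name.
func classifyOutfit(in image: CGImage, completion: @escaping (String?) -> Void) {
    guard let model = try? VNCoreMLModel(for: OutfitClassifier().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // The top classification result is the most likely outfit/brand label.
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The returned label can then be used to look up the matching product page stored in Firebase.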

Challenges I ran into

Using WebKit in our app, as it is fairly difficult to implement, along with debugging the app overall.
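For context, loading a product page inside the app with WebKit looks roughly like this. A minimal sketch under simplified assumptions; the real app's navigation and layout code is more involved:

```swift
import UIKit
import WebKit

// Sketch: a view controller that shows a product page in-app
// via WKWebView instead of sending the user out to Safari.
class ProductPageViewController: UIViewController {
    private let webView = WKWebView()
    private let productURL: URL

    init(productURL: URL) {
        self.productURL = productURL
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) not supported") }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Fill the screen with the web view and load the product page.
        webView.frame = view.bounds
        webView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(webView)
        webView.load(URLRequest(url: productURL))
    }
}
```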

Accomplishments that I'm proud of

Our app finally worked in the end, and we were very excited that, as far as we know, our idea had never been built by anyone before. We are proud of being pioneers in the fashion assistant space.

What I learned

Implementing WebKit in our app and learning its ins and outs, as this was the first time we had used it in any of our apps.

What's next for QuickShopr

We hope to release our app on the App Store, in the hope of creating a fashion assistant that can help many others find the style or dress they desire at first glance of a person in real life. We also plan to integrate our fashion assistant into Siri or Google Assistant.

Built With

Swift, Xcode, CoreML, Vision, Firebase, WebKit
