What is UI Fit?

Have you ever bought clothing online that simply did not fit you well? Introducing UI Fit, a virtual fitting room that is accessible and personalized wherever you go. Simply snap a selfie, watch our program put you in the clothing you wish to buy, and then see for yourself how well it fits!

Inspiration and Relevance

Online retail has revolutionized the world of shopping and boomed into a multi-billion-dollar industry. Despite this success, many shoppers remain wary of products they cannot see in person. Between unclear sizing charts and inconsistent measurements, shoppers are often frustrated when their orders arrive and differ vastly from their expectations. Our goal is to improve online shopping by bridging the gap between consumers and their products.

By 2023, apparel retail within the US e-commerce industry is projected to generate more than $145.8 billion. Yet, as lucrative as e-commerce may seem, one factor actively hinders buying clothes online: the inability to "try on and test out" what one purchases. In fact, roughly one in every three articles of clothing bought online is returned. What does this entail? Research indicates that by 2020, return deliveries alone will drive $550 billion in losses in the US (and that figure excludes restocking and inventory losses!). It is this conundrum of "online fitting" from which UI Fit has emerged.

What it does

Our web application takes in images of the user in three poses, their height, and various clothing parameters, then returns the fit of that item of clothing on the user. It produces an image of the clothing superimposed onto the user, an overall predicted fit (e.g. "Well-fitted", "Loose"), and region-specific feedback (e.g. "Shoulders are not well-fitted").
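The overall and region-specific labels above could come from a comparison like the following. This is only an illustrative sketch, not our actual code: the region names, tolerance value, and function names are all assumptions made for the example.

```python
# Hypothetical sketch: classify overall and per-region fit by comparing
# body measurements (cm) against a garment's measurements. The tolerance
# and region names are illustrative assumptions, not UI Fit's real values.

TOLERANCE_CM = 2.0  # within this margin a region counts as well-fitted

def classify_region(body_cm: float, garment_cm: float) -> str:
    """Label one body region's fit against the garment measurement."""
    diff = garment_cm - body_cm
    if diff > TOLERANCE_CM:
        return "Loose"
    if diff < -TOLERANCE_CM:
        return "Tight"
    return "Well-fitted"

def fit_report(body: dict, garment: dict) -> dict:
    """Per-region labels plus an overall verdict."""
    regions = {r: classify_region(body[r], garment[r]) for r in garment}
    if all(v == "Well-fitted" for v in regions.values()):
        overall = "Well-fitted"
    elif "Loose" in regions.values():
        overall = "Loose"
    else:
        overall = "Tight"
    return {"overall": overall, "regions": regions}

report = fit_report(
    {"shoulders": 46.0, "chest": 98.0, "waist": 84.0},   # user (cm)
    {"shoulders": 44.0, "chest": 104.0, "waist": 86.0},  # garment (cm)
)
# report["regions"]["chest"] is "Loose"; overall is "Loose"
```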

How we built it

With a front-end built in React.JS and a back-end in Python, our project began with detecting and recognizing the key parameters of the body. We use the machine-learning model OpenPose (https://github.com/CMU-Perceptual-Computing-Lab/openpose) to take in a full-body image of a user and plot markers on those body landmarks. From these markers, our program calculates measurements tailored to each body. These feed into the analysis stage, where the fit is determined: the user's measurements are compared against the garment's measurements, stored in JSON, and a fit verdict with specific details is computed. What you are left with is your chosen article of clothing superimposed onto your body, so you can get a virtual sense of how well it fits!
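The scaling step in this pipeline can be sketched as below. OpenPose returns body keypoints in pixel coordinates, and the user's stated height gives the conversion factor from pixels to centimetres. The keypoint names and the particular measurements here are assumptions for illustration, not our production code.

```python
# Illustrative sketch of converting pixel-space keypoints into real-world
# measurements using the user's known height as the scale reference.
import math

def pixel_distance(p1, p2):
    """Euclidean distance between two (x, y) pixel coordinates."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def measure_body(keypoints: dict, height_cm: float) -> dict:
    """Scale pixel keypoints to centimetres via the stated height."""
    # Height in pixels: top of head to the midpoint between the ankles.
    ankle_mid = tuple(
        (a + b) / 2 for a, b in zip(keypoints["l_ankle"], keypoints["r_ankle"])
    )
    height_px = pixel_distance(keypoints["head_top"], ankle_mid)
    cm_per_px = height_cm / height_px
    return {
        "shoulders": pixel_distance(
            keypoints["l_shoulder"], keypoints["r_shoulder"]) * cm_per_px,
        "hips": pixel_distance(
            keypoints["l_hip"], keypoints["r_hip"]) * cm_per_px,
    }

measurements = measure_body(
    {
        "head_top": (200, 50), "l_ankle": (180, 850), "r_ankle": (220, 850),
        "l_shoulder": (140, 220), "r_shoulder": (260, 220),
        "l_hip": (160, 450), "r_hip": (240, 450),
    },
    height_cm=170.0,
)
```

With these made-up coordinates, the person spans 800 px for 170 cm, so each pixel is 0.2125 cm and the 120 px shoulder span comes out to 25.5 cm.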

Challenges we ran into

The largest challenge we faced in this project was connecting our front-end and back-end. Passing images between the two programs was difficult and created a disconnect between user interactions and the outputs. We also aspired to leverage Firestore's database management and retrieval systems (rather than doing everything locally), but pressing time constraints meant that aspiration would have to wait until after the hackathon :)
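One common pattern for moving images between a JavaScript front-end and a Python back-end is to base64-encode the image bytes into a JSON payload, so they survive the text-based HTTP round trip intact. The sketch below shows that general pattern in plain Python (standard library only); it is a hedged illustration of the approach, not our actual wiring, and the function names and payload fields are assumptions.

```python
# Sketch: shuttle an image plus metadata through a JSON payload by
# base64-encoding the raw bytes (the general front-end/back-end pattern,
# not UI Fit's actual code).
import base64
import json

def encode_upload(image_bytes: bytes, pose: str) -> str:
    """What a front-end would POST: image plus metadata as one JSON string."""
    return json.dumps({
        "pose": pose,
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
    })

def decode_upload(payload: str):
    """What the back-end recovers before handing the image to the model."""
    data = json.loads(payload)
    return base64.b64decode(data["image_b64"]), data["pose"]

raw = b"\x89PNG\r\n\x1a\n..."  # stand-in for real image bytes
restored, pose = decode_upload(encode_upload(raw, "front"))
# restored is byte-for-byte identical to raw
```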

Accomplishments that we're proud of

We're really glad to have worked with a machine-learning model and applied it to produce a practical result. We were eager to learn the basics of ML, and this project gave us a good grounding in them.

What we learned

Collectively, we all had different backgrounds coming into this project and learned more about languages and interfaces that we weren't familiar with.

What's next for UIFit - The Virtual Fitting Room

1) 3D Modelling

2) Live, Interactive Virtual Mirror Systems

3) VR Integrations

Built With

React.JS, Python, OpenPose