Inspiration

The fashion industry produces around 10% of the world’s carbon emissions and consumes huge amounts of drinking water and energy. Extending the life of a garment by just nine months can cut its environmental footprint by more than 30%, simply because you aren’t buying new clothes to replace it. Our team wanted to find a way to reduce our everyday impact on the environment and help people become more environmentally mindful. Over a lifetime, people spend months deciding what to wear, juggling the many patterns and colours in their closets across different temperatures and seasons, while magazines and fashion websites constantly push new purchases. We want people to “shop their own closets” and try new combinations of the clothes they already own. We created StyleEyes to encourage sustainable fashion choices, cut down the time it takes to get ready, and help people sharpen their style.

What it does

The user takes a picture of what they’re currently wearing or plan to wear, and StyleEyes uses a custom machine learning model to recommend accessories, patterns, and colour combinations drawn from the user’s existing closet. The recommendations are based on colour theory, current trends, and general fashion styles. StyleEyes also shows the user their clothes’ environmental impact, encouraging reuse by suggesting many different ways to style each item.

How I built it

First, we trained a custom vision model on Microsoft Azure to detect different textures, colours, and patterns in pictures of people wearing clothes, by selecting a variety of pictures and manually tagging the colours. We then built a mobile application in Android Studio that calls our best model iteration through asynchronous HTTP requests. We developed the app’s user interface and functionality, including a custom method that gives advice based on the tags returned and recommends an accessory from the user’s current “closet”. The “closet” also includes a calculator that estimates the approximate environmental footprint of the user’s accessories from existing data. Finally, we installed and tested the app on our own Android phones.
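For anyone curious, the asynchronous call is roughly the sketch below. It is a minimal illustration, not our exact code: the endpoint URL, the Prediction-Key header, and the key value are placeholders in the style of Azure’s Custom Vision prediction API and would be copied from the Azure portal for your own project.

    import android.os.AsyncTask;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Scanner;

    public class PredictionTask extends AsyncTask<byte[], Void, String> {

        // Placeholder values; the real ones come from the Azure portal.
        private static final String ENDPOINT =
                "https://<region>.api.cognitive.microsoft.com/customvision/v3.0/"
                + "Prediction/<projectId>/classify/iterations/<iteration>/image";
        private static final String PREDICTION_KEY = "<your-prediction-key>";

        @Override
        protected String doInBackground(byte[]... images) {
            try {
                HttpURLConnection conn =
                        (HttpURLConnection) new URL(ENDPOINT).openConnection();
                conn.setRequestMethod("POST");
                conn.setRequestProperty("Prediction-Key", PREDICTION_KEY);
                conn.setRequestProperty("Content-Type", "application/octet-stream");
                conn.setDoOutput(true);

                // Send the raw bytes of the outfit photo.
                try (OutputStream out = conn.getOutputStream()) {
                    out.write(images[0]);
                }

                // Read back the JSON response containing the predicted tags.
                try (Scanner s = new Scanner(conn.getInputStream()).useDelimiter("\\A")) {
                    return s.hasNext() ? s.next() : "";
                }
            } catch (Exception e) {
                return null; // network or parsing failure
            }
        }

        @Override
        protected void onPostExecute(String tagsJson) {
            // Parse the tags out of the JSON and hand them to the advice method.
        }
    }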

Challenges I ran into

We had difficulty building a custom machine learning model with our own tags, since it required a lot of iteration and training. We originally drew our bounding boxes incorrectly, which skewed our early test results. We also used machine learning to detect colour, which in hindsight would have been better handled by simpler image scanning. Connecting the API to the Android Studio project was another challenge: the Microsoft Azure SDK did not work properly with Android Studio, so we made the HTTP calls ourselves from an asynchronous base class. We were also inexperienced with Android Studio, so switching contexts and activities was difficult, especially triggering the switch when the HTTP call finished, since that callback ran in a static, asynchronous context, whereas starting a new activity requires a non-static context.
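For anyone who hits the same wall, the workaround amounts to handing the calling Activity into the asynchronous task, so the activity switch happens from instance (non-static) code once the call completes. A rough sketch, where ResultsActivity is a hypothetical screen that displays the recommendations:

    import android.app.Activity;
    import android.content.Intent;
    import android.os.AsyncTask;
    import java.lang.ref.WeakReference;

    public class StyleTask extends AsyncTask<byte[], Void, String> {

        // Hold the Activity weakly so the background task cannot leak it.
        private final WeakReference<Activity> activityRef;

        public StyleTask(Activity activity) {
            this.activityRef = new WeakReference<>(activity);
        }

        @Override
        protected String doInBackground(byte[]... images) {
            // ... the manual HTTP call sketched under "How I built it" ...
            return null;
        }

        @Override
        protected void onPostExecute(String tagsJson) {
            // Runs on the UI thread with a real Activity reference, so the
            // context switch can be triggered from non-static code here.
            Activity activity = activityRef.get();
            if (activity != null && tagsJson != null) {
                Intent intent = new Intent(activity, ResultsActivity.class); // hypothetical screen
                intent.putExtra("tags", tagsJson);
                activity.startActivity(intent);
            }
        }
    }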

Accomplishments that I'm proud of

The UI looks sleek and works smoothly on a physical device. After many iterations, our machine learning model handles pattern recognition very well. We also included carbon footprint calculations to show the environmental impact of the user’s closet.
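The footprint calculation itself is deliberately simple: each accessory category gets an approximate figure and the closet is summed. A sketch of the idea follows; the class name and per-item numbers are illustrative placeholders, not the estimates the app actually uses.

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public final class FootprintCalculator {

        // Placeholder kg CO2e figures per accessory category; the app takes
        // its approximations from existing published data.
        private static final Map<String, Double> KG_CO2E_PER_ITEM = new HashMap<>();
        static {
            KG_CO2E_PER_ITEM.put("scarf", 2.0);
            KG_CO2E_PER_ITEM.put("hat", 3.0);
            KG_CO2E_PER_ITEM.put("belt", 4.0);
            KG_CO2E_PER_ITEM.put("bag", 10.0);
        }

        // Sums the approximate footprint of every accessory in the closet.
        public static double closetFootprint(List<String> closet) {
            double total = 0.0;
            for (String category : closet) {
                Double perItem = KG_CO2E_PER_ITEM.get(category);
                total += (perItem != null) ? perItem : 5.0; // rough default for unknown items
            }
            return total;
        }
    }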

What I learned

Conda is a far less painful way to install Python libraries. We learned how to train machine learning models both properly and improperly, and discovered that bounding boxes need to be larger and more varied in shape for accurate results.
Some of us also learned how to use Git properly: committing, pushing and pulling without disrupting the team’s workflow.

What's next for StyleEyes

Due to limited time, our machine learning solution was very primitive. We found an online clothing-classification database, but access required emailing alumni from the University of Hong Kong, and we didn’t know whether they would respond in time. With our own server-based classification, we could pull example clothing from that database and get more accurate tagging. Another possible feature is uploading an inventory of your whole closet so that the app can recommend clothes you already own rather than just matching accessories. In the future, StyleEyes could recommend an entire outfit based on factors such as weather, temperature, and style. The recommendation algorithm itself also needs work; a redesign of the tagging system would help, using photo processing rather than machine learning to detect colours and saving machine learning for patterns and types of clothing instead.
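As a sense of what “photo processing rather than machine learning” could look like for colour, one simple option is to scan the photo’s pixels, bucket them by hue, and report the dominant bucket. A sketch using Android’s Bitmap and Color classes; the hue buckets and their names are illustrative and would need tuning.

    import android.graphics.Bitmap;
    import android.graphics.Color;

    public final class DominantColour {

        // Eight illustrative hue buckets of 45 degrees each; a real version
        // would tune the boundaries and treat greys, whites and blacks separately.
        private static final String[] NAMES =
                {"red", "yellow", "green", "teal", "cyan", "blue", "purple", "pink"};

        public static String of(Bitmap photo) {
            int[] counts = new int[NAMES.length];
            float[] hsv = new float[3];

            // Sample every 8th pixel to keep the scan fast on a phone.
            for (int y = 0; y < photo.getHeight(); y += 8) {
                for (int x = 0; x < photo.getWidth(); x += 8) {
                    Color.colorToHSV(photo.getPixel(x, y), hsv);
                    if (hsv[1] < 0.2f) continue;                // skip near-grey pixels
                    int bucket = (int) (hsv[0] / 45f) % NAMES.length;
                    counts[bucket]++;
                }
            }

            // Return the name of the most common hue bucket.
            int best = 0;
            for (int i = 1; i < counts.length; i++) {
                if (counts[i] > counts[best]) best = i;
            }
            return NAMES[best];
        }
    }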

How to Use StyleEyes

  1. Open the app. The home screen shows the accessories in your closet so you can keep track of what you have. The button in the upper left-hand corner takes you to a page showing the environmental impact of your accessories.
  2. From the home screen, tap the “style me” button to pull up the camera.
  3. Take a clear photo of your outfit and make sure you like it. The app analyzes the outfit and returns its accessory recommendations.
  4. Read the recommendations, which are based on your clothes’ colours, textures, and patterns.
  5. Save time, money, and the environment!