Inspiration

Have you ever wondered if your outfit looks good on you? Have you ever wished you did not have to spend so much time trying on your whole closet, taking a photo of yourself and sending it to your friends for some advice? Have you ever wished you had worn a jacket because it was much windier than you thought? Then MIR will be your new best friend - all problems solved!

What it does

Stand in front of your mirror, then ask Alexa for fashion advice. A photo of your outfit is taken and analyzed to detect your clothing articles, including their types, colors, and logos (bonus points if you are wearing a YHack t-shirt!). MIR will let you know if your outfit looks great, or if there is something even better in your closet. MIR takes into account the types and colors of the outfit, the current weather, detected logos, and more.
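
The exact ranking logic isn't spelled out here, but a toy illustration of the kind of rule MIR applies might look like this (every name and threshold below is made up for the example):

```python
# Purely illustrative; the real MIR logic and thresholds are not shown here.
def recommend(outfit_labels, temperature_f, wind_mph):
    """Toy rule: flag missing outerwear when it's cold or windy."""
    labels = {label.lower() for label in outfit_labels}
    has_outerwear = bool(labels & {"jacket", "coat", "hoodie"})
    if (temperature_f < 50 or wind_mph > 15) and not has_outerwear:
        return "Nice outfit, but grab a jacket - it's rough out there."
    if "yhack" in labels:
        return "A YHack shirt? Bonus points."
    return "You look great. Go get 'em."
```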

How I built it

Frontend

A React Native app drives the smart mirror display. An AWS Lambda function backs the Alexa skill, so an Amazon Echo can handle the voice commands.
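
As a minimal sketch of the Lambda side, here is a bare Alexa handler in Python; the intent name, backend URL, and spoken phrases are placeholders, not the project's actual values:

```python
import json
import urllib.request

# Hypothetical backend endpoint on the EC2 instance; not the project's real URL.
ADVICE_URL = "http://your-ec2-host/advice"


def lambda_handler(event, context):
    """Bare-bones Alexa skill handler: on the outfit-advice intent (name
    assumed), fetch advice from the mirror backend and speak it back."""
    req = event.get("request", {})
    if req.get("type") == "LaunchRequest":
        return _speak("Ask me how your outfit looks.", end_session=False)
    if (req.get("type") == "IntentRequest"
            and req["intent"]["name"] == "OutfitAdviceIntent"):
        with urllib.request.urlopen(ADVICE_URL, timeout=10) as resp:
            advice = json.load(resp).get("advice", "I couldn't get a good look.")
        return _speak(advice)
    return _speak("Sorry, I didn't catch that.")


def _speak(text, end_session=True):
    """Wrap plain text in the standard Alexa response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }
```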

Backend

Google Cloud Vision identifies features and colors in the photo. Microsoft Cognitive Services detects faces, which we use to estimate where the clothing should be. SciPy handles template matching for logos. Forecast.io provides the weather information. Everything runs on Flask on Amazon EC2.
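
A condensed sketch of what such a backend endpoint could look like, assuming current client libraries: the route name, keys, coordinates, and response shape are placeholders, and only the Cloud Vision, Face API, and Forecast.io calls reflect their documented interfaces.

```python
import requests
from flask import Flask, jsonify, request
from google.cloud import vision

app = Flask(__name__)
vision_client = vision.ImageAnnotatorClient()

# Placeholders: substitute real keys, region, and the mirror's coordinates.
FACE_URL = "https://eastus.api.cognitive.microsoft.com/face/v1.0/detect"
FACE_KEY = "YOUR_COGNITIVE_SERVICES_KEY"
FORECAST_KEY = "YOUR_FORECAST_IO_KEY"
LAT, LNG = 41.3163, -72.9223


@app.route("/analyze", methods=["POST"])
def analyze():
    """Label the outfit photo and its dominant colors, find the face so the
    clothing region can be estimated below it, and pull the current weather."""
    content = request.files["photo"].read()
    image = vision.Image(content=content)

    labels = vision_client.label_detection(image=image).label_annotations
    colors = (vision_client.image_properties(image=image)
              .image_properties_annotation.dominant_colors.colors)

    faces = requests.post(
        FACE_URL,
        headers={"Ocp-Apim-Subscription-Key": FACE_KEY,
                 "Content-Type": "application/octet-stream"},
        data=content,
    ).json()

    weather = requests.get(
        f"https://api.forecast.io/forecast/{FORECAST_KEY}/{LAT},{LNG}"
    ).json().get("currently", {})

    return jsonify({
        "labels": [l.description for l in labels],
        "dominant_colors": [
            {"red": c.color.red, "green": c.color.green, "blue": c.color.blue,
             "score": c.score}
            for c in colors[:3]
        ],
        # The clothing region is estimated as a band below each face rectangle.
        "face_rectangles": [f["faceRectangle"] for f in faces],
        "weather": {"temperature": weather.get("temperature"),
                    "windSpeed": weather.get("windSpeed")},
    })


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```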

Challenges I ran into

  • Determining a good way to isolate the clothing in an image - vision models are easily distracted by everything else in the frame.
  • React Native is amazing when it does work, but is just a pain when it doesn't.
  • Our original approach of matching logos with Google's Reverse Image Search did not work consistently, so we switched to template matching with SciPy (see the sketch below).
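
A rough sketch of that template-matching step, assuming grayscale float images and using only SciPy and NumPy (the function names and threshold are ours, not the project's):

```python
import numpy as np
from scipy import signal


def normxcorr2(template, image):
    """Normalized cross-correlation of a grayscale template over an image.
    Returns a correlation map ('same' size as image) with values in [-1, 1]."""
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    window = np.ones(template.shape)

    # Numerator: image correlated with the zero-mean template.
    num = signal.fftconvolve(image, t[::-1, ::-1], mode="same")

    # Denominator: local energy of the image under the template window.
    local_sum = signal.fftconvolve(image, window, mode="same")
    local_sqsum = signal.fftconvolve(image ** 2, window, mode="same")
    local_ssd = np.clip(local_sqsum - local_sum ** 2 / template.size, 0, None)

    denom = np.sqrt(local_ssd) * t_norm
    return np.divide(num, denom, out=np.zeros_like(num), where=denom > 1e-8)


def find_logo(image, template, threshold=0.6):
    """Return the best-matching (x, y, score) if the score clears a threshold."""
    ncc = normxcorr2(template, image)
    y, x = np.unravel_index(np.argmax(ncc), ncc.shape)
    score = float(ncc[y, x])
    return (int(x), int(y), score) if score >= threshold else None
```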

Accomplishments that I'm proud of

It works!

What I learned

It can be done!

What's next for MIR

MIR can be further developed and used in many different ways!

Another video demo:

https://youtu.be/CwQPjmIiaMQ

Built With

  • React Native
  • Amazon Alexa / AWS Lambda
  • Google Cloud Vision
  • Microsoft Cognitive Services
  • SciPy
  • Forecast.io
  • Flask
  • Amazon EC2
