We were inspired by an app that lets blind people video call sighted volunteers for help with everyday tasks. One common request is help choosing a matching outfit. We decided we could use computer vision to help people who are blind or color-blind handle this everyday task more independently.

What it does

Using object recognition, our app looks at articles of clothing and decides whether they match. When items don't match, it suggests different color options that would work better.

How we built it

Using Azure's Custom Vision, we trained on a set of clothing images so the app can identify an article of clothing along with its color. Then we check whether the color scheme is a popularly accepted "match." If it is, the app announces this through text and voice; if not, it announces better color choices.
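The match-checking step can be sketched roughly as below. This is a minimal illustration, not our actual implementation: the pair list, function names, and color strings are assumptions, standing in for whatever labels the vision model returns.

```typescript
// Illustrative table of "popularly accepted" color pairings.
// Keys are order-insensitive, lowercase, joined with "|".
const acceptedPairs: ReadonlySet<string> = new Set([
  "black|white",
  "navy|white",
  "denim|white",
  "gray|navy",
]);

function pairKey(a: string, b: string): string {
  // Sort so ("White", "black") and ("black", "White") map to the same key.
  return [a.toLowerCase(), b.toLowerCase()].sort().join("|");
}

function isMatch(colorA: string, colorB: string): boolean {
  return acceptedPairs.has(pairKey(colorA, colorB));
}

function suggestions(color: string): string[] {
  // Collect colors that pair with the given one, from the same table.
  const c = color.toLowerCase();
  const out: string[] = [];
  for (const key of acceptedPairs) {
    const [a, b] = key.split("|");
    if (a === c) out.push(b);
    else if (b === c) out.push(a);
  }
  return out;
}
```

A lookup table keeps the rule set easy to audit and extend; a fancier version could score pairs on a color wheel instead of enumerating them.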

Challenges we ran into

One hard part was thinking from the perspective of a blind or color-blind person: what should the interface look like, and could they act on the color suggestions we gave? For now, our app is geared toward color-blind users and assumes a person will try it on multiple articles of clothing until they find something that matches.

Accomplishments that we're proud of

We are very proud that we used Azure's Custom Vision in a way that can improve someone's everyday life.

What we learned

We learned how to use React and Azure.

What's next for Matchi

We hope to add a full audio interface to make the app more accessible to blind users.
