Inspiration

For most of the population, picking out clothes to purchase is a relatively hassle-free task, one we tend to take for granted. All it takes is a stroll through a mall or a scroll down a site to fill a shopping cart, ready for checkout. In contrast, for the 20 million people in the US who are visually impaired, picking clothes to buy can be incredibly daunting and stressful.

We were inspired by the struggles of a blind woman, as described on her blog. She wrote about how much of her shopping experience was marred by a lack of accommodations for people who are visually impaired. Even online shopping, an experience created and celebrated for its convenience, is made difficult: some of the most popular retail sites lack basic accessibility options, like alt text. Without an easy way to remotely evaluate the color, texture, size, fit, and style of a garment, blind shoppers are forced to rely on other people, limit their creativity, or simply take a stab in the dark when buying clothes.

But shopping should be an enjoyable, carefree pastime for everyone, not one blocked by accessibility barriers.

That is where moody fits comes in.

What it does

Our workflow starts with Hume AI's incredible Empathic Voice Interface (EVI) listening to a user speak and extracting the strongest emotions present in their voice. These emotions are then fed to an OpenAI model, which searches our product database and generates recommendations. The recommendations are shown to the user one at a time, and the user can either 'swipe left' (reject) or 'swipe right' (add to cart) on each item. Finally, the user can view all of their favorite products in their shopping cart, with easy access to product vendors and descriptions.
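
Conceptually, the hand-off from detected emotions to recommendations looks something like the sketch below. This is not our exact code: the product list, function names, and emotion scores are illustrative, and we assume the top emotions have already been extracted from Hume's EVI output.

```python
# Minimal sketch: matching detected emotion scores to products via the
# OpenAI chat API. `top_emotions` stands in for the scores Hume's EVI
# returns; PRODUCTS stands in for our scraped product database.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PRODUCTS = [
    {"id": 1, "name": "Sunflower midi dress", "style": "bright, playful"},
    {"id": 2, "name": "Charcoal knit hoodie", "style": "cozy, muted"},
    {"id": 3, "name": "Red satin blazer", "style": "bold, confident"},
]

def recommend(top_emotions: dict[str, float], k: int = 2) -> list[int]:
    """Ask the model for the k product ids that best fit the mood."""
    prompt = (
        f"A shopper's strongest detected emotions are {top_emotions}. "
        f"From this catalog, return only a JSON list of the {k} product "
        f"ids that best match their mood: {json.dumps(PRODUCTS)}"
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    # A real app would validate the model output before parsing it.
    return json.loads(resp.choices[0].message.content)

print(recommend({"Joy": 0.81, "Excitement": 0.64}))
```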

The speech-to-text functionality then records and analyzes the user's verbal response, and the appropriate action is taken for that specific product based on what they said. Our use of Hume AI throughout the project ensures that accessibility is never a barrier when it comes to moody fits.
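
A rough sketch of how a transcribed response could be routed to a swipe action is below. The keyword lists and function names are our own illustration, not Hume's API; the transcript is assumed to come from the speech analysis step.

```python
# Hedged sketch: mapping a transcribed verbal response to a cart action.
ACCEPT_WORDS = {"yes", "love", "add", "cart", "right"}
REJECT_WORDS = {"no", "skip", "pass", "left"}

def route_response(transcript: str, product_id: int, cart: list[int]) -> str:
    words = set(transcript.lower().split())
    if words & ACCEPT_WORDS:
        cart.append(product_id)  # 'swipe right': add to cart
        return "added"
    if words & REJECT_WORDS:
        return "rejected"        # 'swipe left': discard the product
    return "unclear"             # neither matched: re-prompt the user

cart: list[int] = []
print(route_response("Yes, add it to my cart", 3, cart), cart)
```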

How we built it

We used React for our frontend and Python, with Flask, for our backend. We integrated Hume AI as the empathetic voice agent and leveraged the OpenAI API to match user emotions with personalized product recommendations.
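
As a rough shape of that backend, here is a minimal Flask endpoint of the kind the React frontend could POST detected emotions to. The route name and payload fields are illustrative, not our project's actual API.

```python
# Sketch of the backend shape: one endpoint that accepts emotion scores
# and returns product ids. The handler echoes a placeholder so the
# sketch stays self-contained; the real app would call the OpenAI
# matching step here.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.post("/recommendations")
def recommendations():
    emotions = request.get_json().get("emotions", {})
    return jsonify({"emotions": emotions, "products": [1, 3]})

if __name__ == "__main__":
    app.run(port=5000, debug=True)
```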

Challenges we ran into

Implementing the Hume API was difficult and required extensive experimentation in the playground before we could integrate it into our project.

Accomplishments that we're proud of

We initially lacked access to open-source data for e-commerce apparel, but we overcame this hurdle by scraping the web to compile a robust database of products for users to explore.
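
A simplified sketch of the kind of scraping involved is below, assuming pages expose products in predictable tags; the URL argument and CSS selectors are placeholders, not the retail sites we actually used.

```python
# Illustrative scraper sketch: pull product names and prices from a
# listing page into dicts suitable for a product database.
import requests
from bs4 import BeautifulSoup

def scrape_products(url: str) -> list[dict]:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    products = []
    for card in soup.select(".product-card"):  # placeholder selector
        name = card.select_one(".product-name")
        price = card.select_one(".product-price")
        if name and price:
            products.append({"name": name.get_text(strip=True),
                             "price": price.get_text(strip=True)})
    return products
```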

What we learned

We learned a lot about making API calls and how the front end and back end interact with each other, especially from using Flask for our Python backend.
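
For a self-contained picture of that round trip, here is the client side of the Flask sketch above, shown with Python's requests library standing in for the fetch() call the React frontend would make.

```python
# Client-side view of the frontend-backend round trip: POST emotion
# scores to the Flask endpoint and print the recommended product ids.
import requests

resp = requests.post(
    "http://localhost:5000/recommendations",
    json={"emotions": {"Joy": 0.81}},
    timeout=10,
)
print(resp.json())  # e.g. {"emotions": {"Joy": 0.81}, "products": [1, 3]}
```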

What's next for moody fits

We'd like to open-source our platform and continue adding functionality. We also plan to improve the UX by integrating an empathetic voice agent that verbally presents the recommendations to users, replacing the existing monotone voice.
