Shopping is bogged down by tedious steps between picking up an item and paying for it. We wanted to transform the consumer's shopping experience. Daisy's Closet uses image recognition, voice recognition, and the Internet of Things to make retail as streamlined as possible, with all activity easily trackable on the shopper's own smartphone.

On the retailer's end, we make inventory movement and consumer analytics easily accessible through a real-time web app.

What it does

Daisy's Closet is an IoT shopping experience. Take a clothing item off the rack and try it on; our system automatically detects which item was removed and adds it to your virtual shopping cart, accessible from your mobile device. Alternatively, use voice commands to add clothing to your cart. Complete the final transaction on your mobile device, so no face time with sales associates is necessary.
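A minimal sketch of the cart-update step described above: when a scan shows that a tagged item has left the rack, it moves into the shopper's virtual cart. The function names and item IDs are illustrative, not taken from the real app.

```python
def detect_removed(previous_rack: set, current_rack: set) -> set:
    """Items present in the last scan but missing from the current one."""
    return previous_rack - current_rack

def update_cart(cart: list, previous_rack: set, current_rack: set) -> list:
    """Move any newly removed items into the shopper's virtual cart."""
    for item_id in detect_removed(previous_rack, current_rack):
        cart.append(item_id)
    return cart

# Example: the jacket "JKT-042" was taken off the rack between scans.
cart = update_cart([], {"JKT-042", "TSH-007"}, {"TSH-007"})
# cart now holds the removed jacket
```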

All inventory is tracked for the retailer and reported in an easy-to-use web app.

How we built it

The Kinect takes a snapshot of the shopper, and Project Oxford analyzes the image to extract the QR code. After decoding the QR code, we send the clothing item's metadata through an Azure database to the user's queue on the iOS app.
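To illustrate the metadata hand-off, here is a hedged sketch of turning a decoded QR payload into the record that might be queued for the iOS app. The payload fields (`sku`, `name`, `price_cents`) are assumptions for illustration, not the actual schema.

```python
import json

def parse_qr_payload(payload: str) -> dict:
    """Decode the JSON string assumed to be embedded in the item's QR code."""
    meta = json.loads(payload)
    return {
        "sku": meta["sku"],
        "name": meta["name"],
        "price_cents": int(meta["price_cents"]),
    }

# Example payload as it might come back from the QR decoder.
record = parse_qr_payload(
    '{"sku": "JKT-042", "name": "Denim Jacket", "price_cents": 4999}'
)
```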

To purchase, the shopper uses the Echo Dot: our custom Alexa skill, integrated with AWS, listens for voice commands such as "Buy this jacket for [customer name]" and "Checkout cart," then connects to and updates the user's mobile queue.
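A sketch of how such a skill handler could dispatch on intents, in the usual Lambda-style request shape. The intent and slot names (`BuyItemIntent`, `CheckoutIntent`, `Item`, `Customer`) are hypothetical stand-ins, not the skill's real interaction model.

```python
def handle_request(event: dict, cart: list) -> str:
    """Route an Alexa-style intent to a cart action and return the spoken reply."""
    intent = event["request"]["intent"]
    if intent["name"] == "BuyItemIntent":
        slots = intent["slots"]
        cart.append(slots["Item"]["value"])
        return f'Added {slots["Item"]["value"]} for {slots["Customer"]["value"]}.'
    if intent["name"] == "CheckoutIntent":
        count = len(cart)
        cart.clear()
        return f"Checked out {count} items."
    return "Sorry, I didn't catch that."

# Example: "Buy this jacket for Daisy."
cart = []
reply = handle_request(
    {"request": {"intent": {"name": "BuyItemIntent",
                            "slots": {"Item": {"value": "jacket"},
                                      "Customer": {"value": "Daisy"}}}}},
    cart,
)
```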

All activity is also routed through Firebase, which the retailer-facing web app uses to report inventory movement.
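The aggregation behind the retailer dashboard could look like the sketch below: fold the raw event stream (as it might arrive from Firebase) into per-SKU movement counts. The event shape is an assumption for illustration.

```python
from collections import Counter

def movement_report(events: list) -> Counter:
    """Count how many times each SKU was taken off the rack."""
    return Counter(e["sku"] for e in events if e["action"] == "removed")

# Example event stream: one jacket removed twice (returned once in between).
report = movement_report([
    {"sku": "JKT-042", "action": "removed"},
    {"sku": "JKT-042", "action": "returned"},
    {"sku": "JKT-042", "action": "removed"},
    {"sku": "TSH-007", "action": "removed"},
])
```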

Challenges we ran into

The most difficult part was automatically recognizing articles of clothing: the Kinect camera has to locate and read a QR code on the body of each garment.

Accomplishments that we're proud of

We cater to both the consumer and the retailer with an end-to-end system. The user experience is as organic as possible, with no extra hassle beyond taking the clothes off the rack. The retailer simply opens the web app to see real-time analytics of how products are moving on the floor.

What we learned

We learned how to integrate several input methods for a single purpose. Users can buy from Daisy's Closet either through Kinect image tracking, by physically removing clothes from the sensor-tracked area, or through voice commands with Amazon Alexa.

Future implementations

We want to integrate Apple Pay or other online payment systems to let users pay directly from the app.
