What it does

We built a way to browse Amazon with your voice and your phone camera. You can point your camera at the things around you, ask a trained ML model about their reviews, and visualize the item in augmented reality.

How we built it

  • Scrape Amazon Review Pages: Heavy querying of Amazon product pages gives us access to millions of product opinions from across the site.
  • Bidirectional Attention Flow Model: we trained a question-answering model using bidirectional attention flow (BiDAF), a method introduced about a year ago that is seeing heavy use in academic reading-comprehension work.
  • Speech-to-Text Translation: Using Houndify and macOS, we enable a conversation between you and the Amazon customer of your choice.
  • Google Cloud Image Search: Naturally, we use Google's Vision API to label the objects in the user's camera view. Just tap the button in the corner, and the labeled object is searched for the best matching products!
  • AR Visualization: To bring the webpage to life, we let users visualize not only the product in question in 3D space, but also its Amazon star reviews right alongside it.
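The scraping step boils down to pulling review text out of fetched HTML. Here is a minimal sketch of that parsing stage using only the standard library; the `review-text` class name is a hypothetical placeholder, since real Amazon markup differs and the requests themselves need signing.

```python
from html.parser import HTMLParser

class ReviewParser(HTMLParser):
    """Collect text from <span class="review-text"> elements.
    (The class name is illustrative, not Amazon's actual markup.)"""
    def __init__(self):
        super().__init__()
        self.reviews = []
        self._in_review = False

    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "review-text") in attrs:
            self._in_review = True

    def handle_endtag(self, tag):
        if tag == "span":
            self._in_review = False

    def handle_data(self, data):
        if self._in_review:
            self.reviews.append(data.strip())

sample = '<div><span class="review-text">Great product!</span><span class="other">x</span></div>'
p = ReviewParser()
p.feed(sample)
# p.reviews == ["Great product!"]
```

In practice you would feed this parser the body of each fetched review page and accumulate `reviews` across pages.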
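The core of bidirectional attention flow is attending in both directions between the context (here, review text) and the query (the user's question). A NumPy sketch of that attention layer, following the standard BiDAF formulation (trainable encoders and the output layers are omitted):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidaf_attention(H, U, w):
    """H: (T, d) context encodings, U: (J, d) query encodings,
    w: (3d,) learned similarity weights."""
    T, d = H.shape
    J = U.shape[0]
    # Similarity S[t, j] = w . [h_t; u_j; h_t * u_j]
    S = np.empty((T, J))
    for t in range(T):
        for j in range(J):
            S[t, j] = w @ np.concatenate([H[t], U[j], H[t] * U[j]])
    # Context-to-query: each context word attends over query words
    A = softmax(S, axis=1)           # (T, J)
    U_tilde = A @ U                  # (T, d)
    # Query-to-context: one distribution over context words
    b = softmax(S.max(axis=1))       # (T,)
    h_tilde = b @ H                  # (d,)
    H_tilde = np.tile(h_tilde, (T, 1))
    # Combined representation G = [h; u~; h*u~; h*h~] per context word
    return np.concatenate([H, U_tilde, H * U_tilde, H * H_tilde], axis=1)  # (T, 4d)
```

Each context position ends up with a query-aware 4d vector, which downstream layers use to pick an answer span.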
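Once the Vision API returns label annotations for the camera frame, we still have to decide which label to turn into an Amazon search. A minimal sketch of that selection step, assuming labels arrive as (description, score) pairs; the commented-out client call shows roughly where the real `google-cloud-vision` request would go:

```python
def best_search_query(labels, min_score=0.6):
    """Pick the highest-confidence label from Vision API results
    (a list of (description, score) pairs) to use as a search term."""
    candidates = [(desc, s) for desc, s in labels if s >= min_score]
    if not candidates:
        return None
    return max(candidates, key=lambda pair: pair[1])[0]

# In production the list would come from something like:
#   client = vision.ImageAnnotatorClient()
#   response = client.label_detection(image=image)
#   labels = [(l.description, l.score) for l in response.label_annotations]
labels = [("Shoe", 0.97), ("Footwear", 0.95), ("Sneaker", 0.88)]
# best_search_query(labels) -> "Shoe"
```

The `min_score` threshold keeps low-confidence labels from triggering irrelevant product searches.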

Greatest Challenges

  • Adding textures in AR got a bit frustrating.
  • Amazon is not very welcoming to people scraping its pages...so it was a tedious process. Not to mention, computing the Amazon request signature was a doozy.
  • We used not one, but two computers to cover all of our functionality. As a result, they'd lose their connection to each other whenever we dropped in and out of WiFi.
  • Time. Never enough of it.

What's next for Emporium

  • Ability to purchase products through Emporium
  • Provide multiple Amazon query results in Augmented Reality
  • Attach the Augmented Reality to the real-world location of the item being investigated
