Inspiration

We humans (mostly) view each other through our eyes and what we can see in the physical world. While machines cannot literally see, they can classify images into categories. Let's explore how machines view us.

What it does

The user enters a person's name, which triggers a behind-the-scenes Google image search that retrieves a large number of image URLs. The application then uses the Clarifai API to classify each image, builds a frequency list of the resulting textual tags, and renders a word cloud highlighting how machines might "view" the person in question.
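The tag-aggregation step can be sketched as follows. This is a minimal illustration, not the app's actual code: `tag_frequencies`, the sample tags, and the assumption that the classifier returns one list of tags per image URL are all hypothetical stand-ins for the Clarifai responses.

```python
from collections import Counter

def tag_frequencies(tags_per_image):
    """Flatten per-image tag lists into one overall frequency count."""
    counts = Counter()
    for tags in tags_per_image:
        counts.update(tags)
    return counts

# Illustrative classifier output for three images (not real Clarifai data).
sample = [
    ["portrait", "person", "smile"],
    ["person", "outdoor", "smile"],
    ["person", "suit"],
]

freqs = tag_frequencies(sample)
# jQCloud consumes a list of {"text": ..., "weight": ...} objects,
# so the counts map directly onto word-cloud weights.
words = [{"text": tag, "weight": count} for tag, count in freqs.most_common()]
```

The most frequent tags get the largest weights, which is what makes the word cloud reflect how the classifier "sees" the person across many images.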

How I built it

Python Flask, Google Custom Search, Clarifai API, Heroku, jQCloud, Skeleton

Challenges I ran into

  • Learning how to write a dynamically driven web app.
  • Deployment and environment configurations.

Accomplishments that I'm proud of

Developed a web app solo after pivoting halfway through the hackathon.

What I learned

  • Python Flask web framework
  • Skeleton CSS framework
  • Custom Google Search API
  • Clarifai API
  • jQCloud jQuery plugin
  • Heroku PaaS

What's next for Binary View

Parallelizing the image classification across a larger number of images will yield more interesting and significant results. Additional data visualizations will help reveal how machine algorithms "view" people.
