Billboard Gaze (BB-Gaze)

One of the greatest benefits of online advertising is the ability to track user engagement precisely. With billboards, however, it's not so easy. As autonomous vehicles grow in reach and reliability every day, we wanted to harness their immense array of visual data, both outside and inside the vehicle, to create a product that uses real-time gaze tracking to detect when a driver looks at a passing billboard. This lets companies compile more accurate counts of how many people engage with their roadside advertisements.

How we built it

To build BB-Gaze, we first wrote a Python program that captures webcam images of a person's face while they look at their cursor and simultaneously records the cursor's screen coordinates. We crowd-sourced over 2,200 data points this way, then used the Dlib library for facial landmark detection and cropped each image down to the subject's eyes. Next, we built a convolutional neural network in Keras and trained a model that accepts real-time webcam images of a person and moves the cursor to the point on the screen they are looking at. Finally, we paired this model with a set of road images from a self-driving car, annotated with every billboard the car encountered. By combining these into a demo that plays a sequence of images simulating a drive, we can detect whether a person looks at each billboard that passes.
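The eye-cropping step can be sketched with a small helper. This is a minimal illustration rather than our exact pipeline: `crop_eye` and its padding value are hypothetical, and the landmark array is assumed to follow Dlib's standard 68-point layout, in which points 36–41 are the right eye and 42–47 the left.

```python
import numpy as np

# Index ranges for the eyes in Dlib's 68-point facial landmark model.
RIGHT_EYE = slice(36, 42)
LEFT_EYE = slice(42, 48)

def crop_eye(image, landmarks, eye=RIGHT_EYE, pad=5):
    """Crop a padded bounding box around one eye's landmark points.

    `image` is an H x W (x C) array; `landmarks` is a (68, 2) array of
    (x, y) points, e.g. converted from Dlib's shape predictor output.
    """
    pts = np.asarray(landmarks)[eye]
    x0, y0 = pts.min(axis=0) - pad
    x1, y1 = pts.max(axis=0) + pad
    h, w = image.shape[:2]
    # Clamp the padded box to the image bounds before slicing.
    x0, y0 = max(int(x0), 0), max(int(y0), 0)
    x1, y1 = min(int(x1), w), min(int(y1), h)
    return image[y0:y1, x0:x1]
```

In the real pipeline, the landmarks would come from `dlib.shape_predictor` run on each webcam frame; the cropping itself is just array slicing like the above.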
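A gaze-regression CNN along the lines described could look like the following Keras sketch. The layer sizes, the 36×60 eye-patch input shape, and the `build_gaze_model` name are illustrative assumptions, not our trained architecture; the key idea is a convolutional stack ending in a two-unit regression head for the (x, y) screen coordinate.

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_gaze_model(input_shape=(36, 60, 1)):
    """Small CNN: grayscale eye patch in, (x, y) screen coordinate out."""
    model = keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(2),  # regression head: normalized gaze coordinates
    ])
    # Mean squared error suits coordinate regression.
    model.compile(optimizer="adam", loss="mse")
    return model
```

Training would pair each cropped eye image with the cursor coordinates recorded during data collection, and at inference the predicted coordinates drive the cursor.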
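Once the model predicts a gaze point for each frame of the drive, matching it against the billboard annotations is simple point-in-box geometry. A minimal sketch, assuming the annotations are axis-aligned bounding boxes keyed by frame index (`gaze_hits` is a hypothetical helper, not our demo's actual code):

```python
def gaze_hits(gaze_points, billboards):
    """Count frames where the gaze point falls inside a billboard box.

    gaze_points: iterable of (frame_idx, x, y) predicted gaze coordinates.
    billboards: dict mapping frame_idx -> list of (x0, y0, x1, y1) boxes.
    """
    hits = 0
    for frame, x, y in gaze_points:
        for (x0, y0, x1, y1) in billboards.get(frame, []):
            if x0 <= x <= x1 and y0 <= y <= y1:
                hits += 1
                break  # count each frame at most once
    return hits
```

Summing these hits over a drive gives the engagement count the project is after: how many times a viewer's gaze actually landed on a passing billboard.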

What's next for BB-Gaze

In the future, we want to improve our model so that our eye tracking software can more accurately detect a user's gaze.
