Project Summary

Despite advances in ad analytics, content marketing still relies on blunt tools for measuring audience response. Testing an ad is often expensive and frustrating because you don't know how your audience will actually react. When we began this project, we had a simple goal in mind: make it easier to gauge how your content or product lands with viewers, with the help of rentable EEG headsets.

Our software uses a three-step pipeline to infer emotions from the MUSE-2 headset. While the user wears the MUSE, their EEG data is recorded to a CSV file. That data is then processed with a sliding-window technique to extract features for classification. The features are fed into a gradient-boosted tree model, which classifies the emotion displayed during each window as "Positive", "Negative", or "Neutral" (a sketch of this pipeline follows below). The model was pre-trained on 20 minutes of labelled data collected while participants watched happy, sad, or neutral videos while wearing the MUSE-2. We achieved 99.5% accuracy on our validation set and reliably reproduced similar accuracy during live testing.

We also implemented a demo site, which demonstrates the project's use case for wide-scale market analytics. After collecting MUSE data from users, marketing agents can log into the demo site and upload the collected CSV files to an API that returns the model's predictions (see the second sketch below). With this workflow, agents can gather data from crowds of users, process it rapidly, and store the resulting insights in a database.
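The core of the pipeline is small enough to sketch. Below is a minimal, illustrative version of the sliding-window and classification steps, assuming the MUSE-2 CSV exposes one column per EEG channel (TP9, AF7, AF8, TP10); the window length, stride, per-window statistics, and the synthetic stand-in recording are our illustrative choices, not the project's exact configuration.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

CHANNELS = ["TP9", "AF7", "AF8", "TP10"]  # assumed MUSE-2 CSV column names
LABELS = ["Positive", "Negative", "Neutral"]

def sliding_window_features(df, window=256, stride=64):
    """Slide a fixed-size window over the recording and summarize each
    channel with simple statistics (mean, std, min, max) per window."""
    rows = []
    for start in range(0, len(df) - window + 1, stride):
        chunk = df[CHANNELS].iloc[start:start + window]
        feats = []
        for ch in CHANNELS:
            sig = chunk[ch].to_numpy()
            feats += [sig.mean(), sig.std(), sig.min(), sig.max()]
        rows.append(feats)
    return np.asarray(rows)

# Synthetic stand-in for a real recording so the sketch runs end to end
# (in practice: df = pd.read_csv("session.csv")).
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(2048, 4)), columns=CHANNELS)

# Train a gradient-boosted tree on labelled windows, then classify.
# Placeholder labels here; the real labels come from the video sessions.
X = sliding_window_features(df)
y = rng.choice(LABELS, size=len(X))
model = GradientBoostingClassifier().fit(X, y)
print(model.predict(X[:5]))
```

Gradient-boosted trees are a natural fit for this setup: the per-window summaries form a small tabular feature set, and tree ensembles train quickly on the modest amount of labelled data available.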

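On the demo-site side, the upload flow reduces to a single HTTP call. The sketch below is a hypothetical client, assuming the API accepts a multipart CSV upload and returns JSON; the URL, endpoint path, upload field name, and response shape are illustrative, not the project's actual interface.

```python
import requests

# Placeholder URL; the real demo site's address and endpoint may differ.
API_URL = "https://emotion-demo.example.com/predict"

# Upload a collected MUSE-2 recording and print the returned labels.
with open("session.csv", "rb") as f:
    resp = requests.post(API_URL, files={"recording": ("session.csv", f, "text/csv")})
resp.raise_for_status()
print(resp.json()["predictions"])  # e.g. ["Positive", "Neutral", ...]
```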