Inspiration
Every aspect of society is becoming ever more interwoven with technology and information. From algorithms that streamline shopping experiences and boost company sales, to rush-hour heatmaps of entire cities used to design more efficient modes of transportation, extracting information from people is a centerpiece of modern society.
As a group, we asked ourselves: what is the one thing that is _always_ lost when we look at data? The answer: emotions! As humans, we can easily judge whether someone is sad, happy, confused, or just plain uncomfortable. The emotions of other people matter greatly to us, yet they are often overlooked in data analysis. So we wanted to seamlessly digitize facial emotions and track them over time to improve data analysis for businesses, the arts, and lectures.
What it does
Point any camera connected to the application at a crowd of people for as long as you like. Afterward, head to our website to view the emotional data collected over that period and feed it into whatever knowledge-discovery or customer-satisfaction research you're doing. Just point, run, and learn what emotional data Facebit has to offer!
How we built it
We used a webcam in our demo, but any camera works. The program is written in Python and runs on a computer, with the online website serving as the place to view the collected data. The Python program uses multi-threading to process a constant stream of pictures being taken, emulating a live stream. Each frame of this "live stream" is sent to the Google Cloud Vision API, which detects faces and returns, for each face, likelihood ratings for various facial expressions. The program then sifts through this facial data and calculates an "emotional degree" that can be viewed online. The data is collected on a local server we built for the occasion using PHP and served on our website, which was created using JavaScript.
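The scoring step above can be sketched roughly like this. The Vision API reports each expression as a likelihood rating (from `VERY_UNLIKELY` up to `VERY_LIKELY`); the 0-4 numeric mapping and the `emotional_degree` helper below are our own illustrative naming, not necessarily the exact code from the hackathon:

```python
# Map the Vision API's likelihood ratings onto a simple 0-4 scale.
# (The numeric values here are an illustrative choice, not part of the API.)
LIKELIHOOD_SCORE = {
    "UNKNOWN": 0, "VERY_UNLIKELY": 0, "UNLIKELY": 1,
    "POSSIBLE": 2, "LIKELY": 3, "VERY_LIKELY": 4,
}

def emotional_degree(face):
    """Convert one face annotation (serialized to plain strings)
    into per-emotion scores on a 0-4 scale."""
    return {emotion: LIKELIHOOD_SCORE[face[f"{emotion}_likelihood"]]
            for emotion in ("joy", "sorrow", "anger", "surprise")}

# Example annotation for a single detected face:
face = {"joy_likelihood": "VERY_LIKELY", "sorrow_likelihood": "VERY_UNLIKELY",
        "anger_likelihood": "UNLIKELY", "surprise_likelihood": "POSSIBLE"}
print(emotional_degree(face))  # {'joy': 4, 'sorrow': 0, 'anger': 1, 'surprise': 2}
```

A per-frame score like this is what gets written to the server and graphed over time.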
Challenges we ran into
Our biggest issue was processing the "emotion" data and displaying it online. We had to figure out how to present the data accurately and realistically, in a way that someone could actually understand and use. Displaying it online also proved tricky, as we had to create our own local server to write the data to and then construct a graph from that data, all of which was new to most of us.
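One way to turn raw per-frame scores into something graphable is to average them over fixed time windows. This is a minimal sketch under that assumption; `bucket_averages` is a hypothetical helper for illustration, not the code we shipped:

```python
from collections import defaultdict
from statistics import mean

def bucket_averages(samples, bucket_seconds=60):
    """Group (timestamp_seconds, score) samples into fixed windows and
    average each window, yielding points ready to plot as a time series."""
    buckets = defaultdict(list)
    for ts, score in samples:
        # Snap each timestamp down to the start of its window.
        buckets[ts // bucket_seconds * bucket_seconds].append(score)
    return sorted((start, mean(scores)) for start, scores in buckets.items())

# Four joy scores spread across two one-minute windows:
samples = [(3, 4), (30, 2), (65, 1), (70, 3)]
print(bucket_averages(samples, 60))  # [(0, 3), (60, 2)]
```

Smoothing like this keeps the graph readable even when the camera captures many frames per second.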
Accomplishments that we're proud of
The bulk of what we expected to be hard (like actually detecting emotions) ended up not being as scary as it seemed! The moment we got data back from Google Cloud was astounding. We couldn't believe how detailed the data was and how much we could do with it!
What we learned
Each of us had to learn something new we had never done before! One of us had never even touched Python before coding this. Another tackled data processing in PHP. Another worked with hardware (getting the camera working) for the first time. And last but not least, one of us had their first encounter with any sort of API. We will each definitely continue to explore these new and exciting areas of technology.
What's next for Facebit
Facebit would ideally live on a portable, wireless device that instantly starts streaming pictures and data online. Then, once you turn it off, you can go look at the emotional data online and use it to further your business or research endeavors! We could even use the data to start a database of "emotional reactions" to certain products, lectures, art pieces, or even food! A collection of facial expressions and their degrees would also help refine facial emotion detection.
Built With
- git
- google-cloud
- google-cloud-vision
- javascript
- php
- python