ReactOnFly

Inspiration

Everyone scrolls through their Facebook news feed, reacting to different kinds of posts. Clicking may seem easy, but wouldn't it be even easier if all you had to do was smile? We thought it would be cool to work with Microsoft's Emotion API: with a bit of facial detection, we could measure a person's happiness level and make a natural reaction to a post on their behalf.

What it does

This Chrome extension takes a picture of the user every two seconds, and the Microsoft Emotion API, together with Microsoft Azure storage, determines whether the user is smiling at their screen, that is, at a friend's Facebook post. If the user is recorded smiling for a certain duration, the extension automatically likes the post on screen: the extension will "react on fly!"
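
For illustration, a minimal version of that capture loop might look like the sketch below. The PyGame camera module, the 640x480 resolution, and the "latest.jpg" file name are assumptions for this sketch, not the project's exact code.

```python
# A rough sketch of the capture loop, assuming pygame.camera is available and the
# first listed webcam is the one we want; "latest.jpg" is a placeholder file name.
import time

import pygame
import pygame.camera

pygame.camera.init()
camera_name = pygame.camera.list_cameras()[0]
cam = pygame.camera.Camera(camera_name, (640, 480))
cam.start()

while True:
    frame = cam.get_image()                 # grab the current webcam frame
    pygame.image.save(frame, "latest.jpg")  # overwrite the snapshot sent for analysis
    time.sleep(2)                           # one picture every two seconds
```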

How we built it

We used PyGame to take a photo of the user every two seconds. Each photo is uploaded to Microsoft Azure, which uses the Microsoft Cognitive Services Emotion API to determine whether the user is smiling. If the user is smiling, the service returns a true value that is picked up by the Chrome extension, which polls our Python Flask application every five seconds to detect changes in emotion. When a happy face is detected, the Facebook post is automatically liked on behalf of the user.
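
A minimal sketch of the Flask service the extension polls is below. The endpoint name, the 0.6 happiness threshold, the placeholder subscription key, the region-specific Emotion API URL, and the "latest.jpg" snapshot file are assumptions for illustration; the real app also routes the photo through Azure storage.

```python
# A minimal sketch of the Flask service the Chrome extension polls. The URL, key,
# file name, and happiness threshold are placeholders, not the project's exact code.
import requests
from flask import Flask, jsonify

app = Flask(__name__)

EMOTION_URL = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
SUBSCRIPTION_KEY = "<cognitive-services-key>"  # placeholder

def is_smiling(image_path, threshold=0.6):
    """Send the latest snapshot to the Emotion API and check the happiness score."""
    headers = {
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",
    }
    with open(image_path, "rb") as image:
        response = requests.post(EMOTION_URL, headers=headers, data=image.read())
    faces = response.json()
    # No face in frame counts as "not smiling"; otherwise check the first face.
    return bool(faces) and faces[0]["scores"]["happiness"] >= threshold

@app.route("/smiling")
def smiling():
    # Polled by the Chrome extension every five seconds.
    return jsonify({"smiling": is_smiling("latest.jpg")})

if __name__ == "__main__":
    app.run(port=5000)
```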

Challenges we ran into

The only programming language common to all of our group members was Python, so we implemented most of the functionality in it. We originally planned to use OpenCV to access the camera, but it took too long to build, so we chose PyGame instead. We also spent a lot of time tracking down the Azure SDK for Python and figuring out how to deploy our application to the Azure cloud. When we finally started consolidating all the small pieces of functionality, our PyGame program stopped taking photos, but we soon fixed it (phew).

Accomplishments that we're proud of

We are glad that we worked together and had such a great time. We ran into a lot of trouble, but we did not give up and solved each problem one by one.

What we learned

We learned how to use the Microsoft Cognitive Services Emotion API to analyze people's facial expressions. We also learned how to use the Facebook API to like a post automatically and how to develop a Chrome extension. All of us had fun working together.

What's next for ReactOnFly

We plan to add more functionality to make ReactOnFly more user-friendly, based on the feedback we receive from our users.

Built With

Python, Flask, PyGame, Microsoft Azure, Microsoft Cognitive Services Emotion API, Facebook API
