Inspiration
Our inspiration came from our own discomfort with being on camera during video-conferenced lectures. We wanted to create this application to make online lectures more student-friendly, which we believe will improve students' mental health.
What it does
The application detects each student's emotion from their face and replaces the face with an emoji depicting that emotion in real time.
How we built it
We used machine learning models from the face-api.js library, together with Node.js and Express.js, to handle the video-to-emoji conversion. We built our classroom/video-broadcasting website with React on top of the Twilio API starter code; this front-end application fetches the emoji representations of faces from the back-end.
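The core of the conversion is mapping face-api.js expression scores to an emoji. As a rough sketch (not our exact code): face-api.js can produce an `expressions` object of per-emotion probabilities via `faceapi.detectSingleFace(video).withFaceExpressions()`, and a hypothetical helper like `expressionToEmoji` below could pick the emoji for the highest-scoring emotion.

```javascript
// Emoji for each expression label that face-api.js reports.
const EMOJI = {
  neutral: "😐",
  happy: "😀",
  sad: "😢",
  angry: "😠",
  fearful: "😨",
  disgusted: "🤢",
  surprised: "😲",
};

// Hypothetical helper: given an `expressions` object of
// { label: probability } pairs, return the emoji for the
// highest-scoring expression (neutral face as a fallback).
function expressionToEmoji(expressions) {
  const [best] = Object.entries(expressions).sort((a, b) => b[1] - a[1]);
  return best ? EMOJI[best[0]] ?? "😐" : "😐";
}

// Example: a detection that is mostly "happy".
console.log(expressionToEmoji({ neutral: 0.05, happy: 0.9, sad: 0.05 })); // → 😀
```

The front end then renders this emoji over (or instead of) the student's video tile.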
Challenges we ran into
Some challenges we ran into included dividing up the work, learning the technologies we used, and balancing the hackathon with our classwork.
Accomplishments that we're proud of
We're proud that we were able to replace students' faces with emojis that accurately represent their emotions.
What we learned
We learned how to use React, Node.js, and Twilio, as well as JavaScript libraries such as face-api.js.
What's next for Banana-Call
We will pitch this idea to Zoom and BBCollaborate because we think this application can enrich students' experience with online lectures.
CockroachDB #FreeTShirt
Credit to Twilio for the base application; their GitHub repository is https://github.com/twilio/twilio-video-app-react.
