Inspiration

Teaching methods have remained largely unchanged for years, and technology has yet to reach its full potential in the classroom. One day in class, it occurred to us that there must be a correlation between students' behaviour in the classroom and their level of comprehension.

What it does

We leveraged Apple's existing face detection APIs and combined them with the newly added Core ML features to track students' emotions based on their facial cues. The app can follow and analyze roughly ten students at once and provide information in real time through our dashboard.

How we built it

The iOS app integrates Apple's Core ML framework to run a CNN that detects emotions from facial cues. The model is used in combination with Apple's Vision API, which identifies and extracts each student's face before classification. The resulting data is then propagated to Firebase, where it is analyzed and displayed on the dashboard in real time. A rough sketch of the detection-and-classification step is shown below.
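
As an illustration of that pipeline, the sketch below detects faces with Vision, crops each face, and classifies the crop with a Core ML model wrapped in `VNCoreMLModel`. The `EmotionClassifier` model name and the overall structure are assumptions for illustration, not the project's actual code.

```swift
import Vision
import CoreGraphics

// Hypothetical sketch of the detect-then-classify pipeline.
// "EmotionClassifier" stands in for whatever Core ML model the app bundles.
final class EmotionPipeline {
    private let emotionModel: VNCoreMLModel

    init() throws {
        // Wrap the compiled Core ML model so Vision can drive it.
        emotionModel = try VNCoreMLModel(for: EmotionClassifier().model)
    }

    /// Detects faces in a frame, then classifies each face crop.
    /// Returns (bounding box in pixels, top emotion label) pairs.
    func process(frame: CGImage, completion: @escaping ([(CGRect, String)]) -> Void) {
        let faceRequest = VNDetectFaceRectanglesRequest { [weak self] request, _ in
            guard let self = self,
                  let faces = request.results as? [VNFaceObservation] else { return }

            var results: [(CGRect, String)] = []
            for face in faces {
                // Vision's bounding box is normalized with a bottom-left origin;
                // convert to pixel coordinates and flip for CGImage cropping.
                var box = VNImageRectForNormalizedRect(face.boundingBox,
                                                       frame.width, frame.height)
                box.origin.y = CGFloat(frame.height) - box.origin.y - box.height
                guard let crop = frame.cropping(to: box) else { continue }

                // Run the emotion classifier on the cropped face.
                let classify = VNCoreMLRequest(model: self.emotionModel) { req, _ in
                    if let top = (req.results as? [VNClassificationObservation])?.first {
                        results.append((box, top.identifier))
                    }
                }
                try? VNImageRequestHandler(cgImage: crop).perform([classify])
            }
            completion(results)
        }
        try? VNImageRequestHandler(cgImage: frame).perform([faceRequest])
    }
}
```

Running the classifier on cropped faces rather than the full frame keeps the model's input small and lets the same pipeline scale to several students per frame.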

Challenges we ran into

Throughout this project, we ran into several issues around improving the accuracy of the facial-emotion results. We also had trouble reliably extracting faces and tracking the same students over the length of a session. On the dashboard side, we struggled with displaying the incoming data in real time.
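
For the tracking problem, one possible approach (a sketch assuming Vision's built-in object tracker is used; we don't claim this is the team's exact solution) is to seed a `VNTrackObjectRequest` from each detected face and advance it frame by frame with a `VNSequenceRequestHandler`:

```swift
import Vision
import CoreGraphics

// Hypothetical sketch: keep a stable identity per student by seeding a
// Vision object tracker from the initial face detection and advancing it
// on every subsequent frame.
final class FaceTracker {
    private let sequenceHandler = VNSequenceRequestHandler()
    private var trackers: [UUID: VNTrackObjectRequest] = [:]

    /// Start tracking a newly detected face.
    func register(face: VNFaceObservation) {
        let request = VNTrackObjectRequest(detectedObjectObservation: face)
        request.trackingLevel = .accurate
        trackers[face.uuid] = request
    }

    /// Advance every tracker on the next frame; returns current boxes keyed by student id.
    func update(with frame: CGImage) -> [UUID: CGRect] {
        try? sequenceHandler.perform(Array(trackers.values), on: frame)

        var boxes: [UUID: CGRect] = [:]
        for (id, request) in trackers {
            guard let latest = request.results?.first as? VNDetectedObjectObservation else { continue }
            // Feed the latest observation back in so the tracker stays locked on this face.
            request.inputObservation = latest
            boxes[id] = latest.boundingBox
        }
        return boxes
    }
}
```

Vision limits how many trackers can run simultaneously, which is one practical reason to cap the number of students followed per session.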

Accomplishments that we're proud of

We are proud that we were able to build a fully real-time solution. Above all, we are happy to have met and worked with such a great group of people.

What we learned

Ozzie learned more about the Core ML and Vision frameworks.

Haider gained more experience with front-end development as well as working on a team.

Nakul gained experience with real-time graphing and helped develop the dashboard.

What's next for Flatline

In the future, Flatline could grow its dashboard features to provide more insight for teachers. The accuracy of the results could also be improved by training a model on emotions more closely tied to learning and students' behaviour.
