Inspiration

As college classes get bigger, it can be hard for instructors to take attendance accurately. Yet this information is critical for them to reflect on how they're doing (and how successful they are at engaging the class). In Class offers a solution to this problem: it gives instructors a way to collect accurate data on their class, and also lets students reflect on that data.

What it does

Our app automates the process of taking attendance in classrooms by using face recognition as students enter the classroom. Teachers can keep track of their students' attendance through the mobile app, and students also have the ability to verify their own attendance. The app additionally tracks each student's mood throughout the class. This data is considered private and can only be viewed by the student themselves.

How we built it

The front end of this app was built with React Native, and the face recognition was built using the Python face_recognition library. We store our data in Firebase Firestore.
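
As a rough sketch of how the recognition step works with the face_recognition library (the photo paths and student names below are illustrative placeholders, not our actual data):

```python
# Sketch of the attendance check: compare faces in a camera frame against
# pre-computed encodings of enrolled students. Paths/names are placeholders.
import face_recognition

# One reference photo per enrolled student.
students = {"alice": "photos/alice.jpg", "bob": "photos/bob.jpg"}
known_encodings = {}
for name, path in students.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        known_encodings[name] = encodings[0]

def identify(frame_path, tolerance=0.6):
    """Return the names of enrolled students recognized in a camera frame."""
    frame = face_recognition.load_image_file(frame_path)
    names = list(known_encodings.keys())
    present = []
    for encoding in face_recognition.face_encodings(frame):
        matches = face_recognition.compare_faces(
            [known_encodings[n] for n in names], encoding, tolerance=tolerance
        )
        present.extend(n for n, m in zip(names, matches) if m)
    return present
```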

Challenges we ran into

A challenge that we ran into was setting up the face recognition. This was our first time working with this Python library, and we had trouble getting the environment configured correctly to run it. After a few hours of browsing we finally got it running, but we lost a good chunk of time at the beginning of the hackathon.

Setting up mood recognition was another challenge. It takes some time to process each frame of the video and pass it to the Google Cloud Vision API, and since there are multiple frames per second, this can make the app laggy. We also haven't figured out the best way to interpret and store the data. One option is to make a database call every time a user's mood changes, but that would be inefficient and would only slow the app down even more.
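
One direction we considered looks roughly like the sketch below: sample an occasional frame instead of every frame, ask Cloud Vision for emotion likelihoods, and only touch the database when the mood actually changes. The sampling strategy, Firestore layout, and helper names here are assumptions, not our exact implementation.

```python
# Sketch of the mood check: classify an occasional frame with Cloud Vision and
# write to Firestore only on a mood change. Collection names are assumptions.
from google.cloud import firestore, vision

vision_client = vision.ImageAnnotatorClient()
db = firestore.Client()

LIKELIHOODS = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
               "POSSIBLE", "LIKELY", "VERY_LIKELY")

def detect_mood(frame_bytes):
    """Return the most likely emotion for the first face Vision finds, if any."""
    response = vision_client.face_detection(image=vision.Image(content=frame_bytes))
    if not response.face_annotations:
        return None
    face = response.face_annotations[0]
    scores = {
        "joy": face.joy_likelihood,
        "sorrow": face.sorrow_likelihood,
        "anger": face.anger_likelihood,
        "surprise": face.surprise_likelihood,
    }
    mood, likelihood = max(scores.items(), key=lambda kv: kv[1])
    return mood if LIKELIHOODS[likelihood] in ("LIKELY", "VERY_LIKELY") else None

def record_mood(student_id, mood, last_mood):
    """Write a mood entry only when it differs from the previous reading."""
    if mood and mood != last_mood:
        db.collection("moods").document(student_id).collection("entries").add(
            {"mood": mood, "at": firestore.SERVER_TIMESTAMP}
        )
    return mood or last_mood
```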

In addition, this was also our first time working with React Native, so there was a big learning curve in understanding how mobile development works. We had a hard time connecting our app to the database, styling the components correctly, passing data between components, and navigating between screens in the app.

Accomplishments that we're proud of

  • Working facial recognition (well, kinda)
  • Working mood recognition
  • Taking our first step into mobile development

What's next for our project

  • On the backend side, we will continue to refine our face recognition and mood detection algorithms so they produce a more accurate set of data.
  • On the frontend side, the app is still lacking some native functionality (e.g., we were not sure how to use the native back button to navigate through the app), so we'll get more familiar with everything React Native has to offer.

Built With

  • React Native
  • Python (face_recognition)
  • Firebase Firestore
  • Google Cloud Vision API