Inspiration

We’ve all had classes where it felt like no one knew what was going on, including the teacher. We wanted to build something that could help teachers “read the room” better by detecting how students are feeling, using AI.

What it does

CAIRE analyzes student facial expressions through a webcam or image dataset and predicts emotions like happy, sad, or neutral. It shows these results in a dashboard so teachers can see how the class is feeling in real time.

How we built it

We used the Extended Cohn-Kanade (CK+) dataset and a Hugging Face Vision Transformer (ViT) model for facial emotion recognition. We wrote Python scripts to run predictions and built an interactive dashboard with Streamlit and Plotly. We also added a live webcam demo.
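
The prediction-to-dashboard step can be sketched as below. This is an illustrative helper, not our exact code: `summarize_class_mood` is a hypothetical name, and it assumes per-face predictions in the Hugging Face image-classification pipeline's output shape (a score-sorted list of label/score dicts per face); the actual label set depends on the model checkpoint.

```python
from collections import Counter

def summarize_class_mood(per_face_predictions):
    """Aggregate per-face emotion predictions into class-level percentages.

    Each element mimics one face's output from a Hugging Face
    image-classification pipeline: a list of {"label": ..., "score": ...}
    dicts sorted by score, highest first.
    """
    # Take the top-scoring label for each detected face.
    top_labels = [preds[0]["label"] for preds in per_face_predictions if preds]
    counts = Counter(top_labels)
    total = sum(counts.values())
    if not total:
        return {}  # no faces detected in this frame
    return {label: round(100 * n / total, 1) for label, n in counts.items()}

# Example: three faces detected in one frame.
frame_predictions = [
    [{"label": "happy", "score": 0.91}, {"label": "neutral", "score": 0.06}],
    [{"label": "happy", "score": 0.75}, {"label": "sad", "score": 0.20}],
    [{"label": "neutral", "score": 0.62}, {"label": "happy", "score": 0.30}],
]
print(summarize_class_mood(frame_predictions))
# {'happy': 66.7, 'neutral': 33.3}
```

A summary dict like this is easy to feed straight into a Plotly bar or pie chart inside Streamlit.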

Challenges we ran into

We had to convert raw pixel data into usable images, format webcam input for the model, and keep everything smooth and simple for users. Managing multiple dashboards on different ports was also tricky at first.
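
The first two challenges above can be sketched roughly as follows. Both helpers are illustrative, not our exact code, and the 48x48 grayscale shape is an assumption about the raw pixel export; the real dataset rows may differ.

```python
import numpy as np
from PIL import Image

def pixels_to_image(pixel_string, size=(48, 48)):
    """Convert one space-separated grayscale pixel string (a raw CSV row)
    into an RGB PIL image a ViT classifier can consume.

    The 48x48 shape is an assumption about the export format.
    """
    arr = np.array(pixel_string.split(), dtype=np.uint8).reshape(size)
    return Image.fromarray(arr, mode="L").convert("RGB")

def frame_to_image(bgr_frame):
    """Format a webcam frame for the model: OpenCV captures BGR arrays,
    so reverse the channel order to RGB before inference."""
    return Image.fromarray(bgr_frame[:, :, ::-1].copy())

demo = pixels_to_image(" ".join(["128"] * 48 * 48))
print(demo.size, demo.mode)  # (48, 48) RGB
```

Keeping these conversions in small, pure functions made it easier to reuse the same prediction code for both the dataset and the live webcam path.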

Accomplishments that we're proud of

We built a working pipeline from data to live predictions, created a clean dashboard, and got the webcam feature running smoothly. The app is fast, local, and easy to use.

What we learned

We learned how to use Hugging Face models, handle image inputs, debug real-time webcam apps, and design dashboards that are accessible to non-coders.

What's next for Classroom AI for Real-time Emotion (CAIRE)

We’d love to add posture and voice analysis, make CAIRE work in virtual classrooms like Zoom, and add real-time emotion charts. Our goal is to give teachers better feedback at the exact moment they need it.

Built With

Python, Streamlit, Plotly, Hugging Face Transformers
