Inspiration
Online classes make it hard to read the room, participate, and stay engaged. Instructors often realize students were lost only after a quiz or exam. Our team built zoomED to give teachers the same intuition they have in person: live awareness of attention, participation, and confusion signals during class.
What it does
zoomED monitors live Zoom sessions and turns raw meeting activity into actionable teaching insights. It:
- Tracks attention trends using computer vision (gaze-based attention scoring)
- Streams chat and participation events in real time
- Detects engagement drops and highlights at-risk moments
- Uses AI agents to generate engagement summaries for teachers, gentle nudges for students to return to class, and adaptive quiz/poll suggestions based on real-time transcripts
- Displays everything in a live instructor dashboard for immediate intervention
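The gaze-based attention scoring above can be sketched as a simple heuristic over normalized eye landmarks like those MediaPipe Face Mesh produces. The landmark fields and the 0.35/0.65 "looking at screen" thresholds here are illustrative assumptions, not our exact model:

```javascript
// Minimal sketch of a gaze-based attention heuristic over normalized
// (0..1) eye landmarks, e.g. from MediaPipe Face Mesh. Field names and
// thresholds are illustrative assumptions.
function gazeRatio(irisX, eyeOuterX, eyeInnerX) {
  // Where the iris sits between the eye corners: 0 = outer, 1 = inner.
  return (irisX - eyeOuterX) / (eyeInnerX - eyeOuterX);
}

function attentionScore(leftEye, rightEye) {
  const ratios = [leftEye, rightEye].map((e) =>
    gazeRatio(e.irisX, e.outerX, e.innerX)
  );
  // A roughly centered iris (~0.5) suggests the student is facing the screen.
  const centered = ratios.every((r) => r > 0.35 && r < 0.65);
  return centered ? 1 : 0;
}
```

Per-frame scores like this are noisy on their own, which is why they get smoothed and fused with chat/participation signals downstream.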
How we built it
We built a multi-service real-time system:
- Zoom client app built with the Zoom Meeting SDK, integrated with MediaPipe Face Mesh for attention signals
- WebSocket event pipeline to stream attention/chat/participation data
- Node.js/Express backend to aggregate live meeting state
- Multi-agent AI layer (Anthropic-powered) for summarization, nudges, and quiz generation
- React + Vite dashboard to visualize engagement and recommendations in real time
- JWT authentication endpoint for secure Zoom SDK session access
Challenges we ran into
- Synchronizing multiple noisy real-time signals (CV + chat + participation) into one reliable engagement view
- Zoom RTMS (Realtime Media Streams) access and setup issues, even with an ex-Zoom engineer helping onsite
- Keeping WebSocket streams stable and low-latency across services
- Tuning attention scoring so it’s useful without being overly sensitive
- Designing AI outputs to be actionable for instructors, not just descriptive
- Managing the complexity of running four local services during rapid demo iteration
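The signal-synchronization and sensitivity-tuning challenges above come down to fusing noisy per-interval signals into one smoothed score. A minimal sketch, where the blend weights, smoothing factor, and drop threshold are illustrative assumptions rather than our tuned values:

```javascript
// Fuse noisy per-interval signals (CV attention, chat rate, participation)
// into one smoothed engagement score with a drop detector. All constants
// here are illustrative assumptions, not tuned production values.
function makeEngagementTracker({ alpha = 0.3, dropThreshold = 0.4 } = {}) {
  let smoothed = null;
  return function update({ attention, chatRate, participation }) {
    // Weighted blend of three normalized (0..1) signals.
    const raw = 0.5 * attention + 0.25 * chatRate + 0.25 * participation;
    // Exponential smoothing damps frame-to-frame CV noise, trading a
    // little latency for far fewer false "engagement drop" alerts.
    smoothed = smoothed === null ? raw : alpha * raw + (1 - alpha) * smoothed;
    return { score: smoothed, drop: smoothed < dropThreshold };
  };
}
```

Raising `alpha` makes the tracker react faster but noisier; lowering it is one way to keep attention scoring "useful without being overly sensitive."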
Accomplishments that we're proud of
- End-to-end live pipeline from Zoom session -> engagement signal -> AI recommendation -> instructor dashboard
- Real-time attention event streaming working during live calls
- Multi-agent architecture that produces different types of classroom interventions
- A practical demo that feels immediately useful for educators, not just technically impressive
- Modular architecture that can scale to richer analytics and interventions
- While tailored to the education sector, zoomED could be expanded into workplace settings as well (as TreeHacks mentors pointed out, thank you for your insights!)
What we learned
- Real-time educational feedback is as much a product-design problem as an AI problem
- Combining multimodal signals gives better engagement insight than any single metric
- Fast iteration loops (instrumentation + observability) are critical for live systems
- Building for trust, transparency, and instructor control is essential in edtech AI
What's next for zoomED
- Personalize interventions by class style, subject, and learner profile
- Add longitudinal analytics across sessions (weekly trends, concept-level struggle maps)
- Improve model calibration and fairness across diverse camera and classroom conditions
- Pilot with real instructors and measure outcomes like participation lift and retention gains
- Integrate an "away" status so students aren't badgered with notifications and questions when away from their devices
Built With
- claude
- express.js
- javascript
- node.js
- react
- websocket
- zoom