Inspiration

The idea started when I explored the Resources tab and analyzed existing project directions. Many concepts were already openly available and relatively easy to reproduce thanks to video tutorials. Instead of repeating common implementations, I became interested in the concept of The Real-Time Teacher, which surprisingly had very few examples online. Most related projects focused on Gemini 2 and were limited to generating educational material with voice narration. I realized that adding live video fundamentally changes the learning experience by enabling contextual awareness and adaptive interaction.

This insight was also shaped by my personal teaching experience: I have been teaching mathematics, physics, and programming for over four years. I remembered sessions where I joined Google Meet from a tablet and my camera turned off after I started screen sharing; I kept guiding the student anyway, almost like an invisible assistant. That experience made me rethink how an AI teacher could exist: present, adaptive, and supportive, without necessarily replacing human presence.

What it does

AI Classroom is a real-time adaptive AI teacher that interacts through live voice, video, and visual explanations. It listens to the student, watches for engagement and distraction, generates visual explanations on the fly, and dynamically adjusts the lesson to behavior and context. Rather than behaving like a chatbot, it acts as a continuous learning partner, guiding, adapting, and responding in real time.
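As a rough illustration of the behavior-aware adaptation described above, here is a minimal TypeScript sketch of how an engagement score might be derived from observation signals and mapped to a teaching response. The signal names, weights, and thresholds are hypothetical, not the actual implementation:

```typescript
// Hypothetical observation signals the system might extract from live
// video and audio context (names and weights are illustrative only).
interface ObservationSignals {
  gazeOnScreen: number;    // fraction of recent frames with gaze on screen, 0..1
  voiceActivity: number;   // fraction of recent audio with student speech, 0..1
  responseDelayMs: number; // how long the student takes to respond
}

// Collapse the signals into a single engagement score in [0, 1].
function engagementScore(s: ObservationSignals): number {
  const delayPenalty = Math.min(s.responseDelayMs / 10_000, 1); // cap at 10 s
  const raw =
    0.5 * s.gazeOnScreen + 0.3 * s.voiceActivity + 0.2 * (1 - delayPenalty);
  return Math.max(0, Math.min(1, raw));
}

// The lesson loop can use the score to pick an adaptation strategy.
function adaptLesson(score: number): "continue" | "simplify" | "re-engage" {
  if (score > 0.7) return "continue";  // student is focused: keep going
  if (score > 0.4) return "simplify";  // attention dipping: slow down
  return "re-engage";                  // distracted: change approach
}
```

In a real session the signals would be refreshed continuously, so the strategy can shift mid-lesson rather than only between prompts.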

How I built it

AI Classroom was built entirely in Google AI Studio using Gemini. Core technologies:

  1. Gemini Live API (gemini-2.5-flash-native-audio-preview-12-2025) for real-time voice and video interaction
  2. Gemini 3 API (gemini-3-pro-image) for generating visual explanations
  3. A behavior and engagement detection algorithm that adapts teaching to student focus and mood
  4. Frontend: React 19 + TypeScript, Tailwind CSS, with the Web Audio API and MediaDevices API for live capture
  5. Gemini function calling, which lets the AI control the learning interface: opening tools, generating visuals, and managing lesson flow dynamically
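To make item 5 concrete, here is a sketch of how UI-control tools might be exposed to the model and dispatched on the frontend. The tool names and parameters (`open_whiteboard`, `set_lesson_step`) are hypothetical examples, not the project's actual tools; the declaration shape follows Gemini's JSON-Schema-based function declarations:

```typescript
// Hypothetical function declarations passed to Gemini so the model can
// drive the learning interface via function calling.
const uiFunctionDeclarations = [
  {
    name: "open_whiteboard",
    description: "Open the whiteboard so a visual explanation can be shown.",
    parameters: {
      type: "object",
      properties: {
        topic: { type: "string", description: "Topic the visual should explain" },
      },
      required: ["topic"],
    },
  },
  {
    name: "set_lesson_step",
    description: "Advance or rewind the current lesson flow.",
    parameters: {
      type: "object",
      properties: {
        step: { type: "integer", description: "Zero-based lesson step index" },
      },
      required: ["step"],
    },
  },
];

// When the model emits a function call, the frontend routes it to a handler.
type Handler = (args: Record<string, unknown>) => void;

const handlers: Record<string, Handler> = {
  open_whiteboard: (args) => console.log("opening whiteboard for", args.topic),
  set_lesson_step: (args) => console.log("jumping to step", args.step),
};

// Returns false for unknown tool names so unrecognized calls can be ignored.
function dispatchFunctionCall(
  name: string,
  args: Record<string, unknown>
): boolean {
  const handler = handlers[name];
  if (!handler) return false;
  handler(args);
  return true;
}
```

In the live session, each declared function becomes an action the model can take mid-conversation instead of describing it in text, which is what makes the interface feel driven by the teacher rather than by the student's clicks.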

Challenges we ran into

The main challenge was creating an experience that feels alive rather than scripted. Synchronizing live voice interaction, video context, and dynamic UI changes required careful design to keep latency low and the interaction flow natural. Another challenge was tuning adaptive behavior detection so the AI feels supportive rather than intrusive.

Accomplishments that we're proud of

I wanted to build more than a chatbot. I built a real-time AI teacher that listens, watches, and adapts as learning happens. AI Classroom uses live interaction, adaptive visuals, and behavior-aware teaching to respond to focus and mood instead of waiting for prompts. It feels less like using AI and more like learning with a responsive partner. Most importantly, this is a product I built for myself, something I truly want to use.

What we learned

I learned that real-time interaction transforms AI from a tool into an experience. Voice, video context, and behavior awareness allow AI to guide learning instead of simply answering questions. Designing AI is not just about model capabilities, but about creating adaptive interactions that feel natural and responsive, something that became possible thanks to Gemini’s real-time multimodal capabilities.

What's next for AI Classroom

Building a real product takes time, and AI Classroom is still evolving. Even during development, some parts of the system were manually refined to ensure a smooth learning experience and demonstrate the full vision. The next steps focus on turning the prototype (MVP) into a fully autonomous and scalable product:

Short-term (next 1–2 months)

  • Automating remaining manual workflows and stabilizing real-time interaction
  • Improving behavior detection for distraction and student mood
  • Refining adaptive teaching logic and lesson flow

Mid-term (3–6 months)

  • Persistent student memory and personalized learning paths
  • Better multimodal reasoning using deeper video context
  • Expanding subject support and teaching styles

Long-term (6–12 months)

  • Integration with real educational platforms (e.g., LMS environments)
  • Collaborative learning sessions with multiple students
  • Developing AI Classroom into a continuous learning companion rather than a session-based tool

AI Classroom is moving from a prototype into a real product designed for everyday learning.
