🧬 EduGenome AI

Decoding How Students Learn


💡 Inspiration

Education has always been centered around one simple outcome: evaluation. Exams, grades, percentages, ranks, GPA — all of them try to answer a single question:

“Did the student learn the content?”

But during our own learning journeys, we realized that this question is incomplete.

We noticed something that almost every student and teacher experiences, but very few systems acknowledge:
two students with the same score often learn in completely different ways.


One student may:

  • Understand concepts quickly but lose focus easily
  • Panic under time pressure yet deeply understand the topic
  • Learn best through visuals but struggle with text-heavy material

Another student with the same score may:

  • Be consistent, methodical, and slow
  • Have strong memory but weak abstraction
  • Perform well only when confidence is high

Yet traditional education systems treat both students as identical.

The deeper issue is that education systems focus on outcomes, not behavior.


In physical classrooms, good teachers unconsciously read behavioral signals:

  • Facial expressions
  • Eye movement
  • Hesitation
  • Body language
  • Engagement levels

But when learning moved online, all of this behavioral context disappeared.

Students now learn behind screens where:

  • Confusion is invisible
  • Fatigue is unnoticed
  • Learning style is unknown
  • Anxiety goes undetected

This silent gap between what students learn and how they learn became the central inspiration behind EduGenome AI.


We asked ourselves:

What if learning systems could understand students the way good teachers do — by observing behavior, not just answers?


That question gave birth to EduGenome AI.


🧠 What It Does

EduGenome AI is an AI-powered behavioral learning intelligence platform that constructs a Learning Genome for each student — a dynamic, real-time, multi-dimensional profile describing how a student learns.

Instead of focusing purely on correctness or scores, EduGenome AI analyzes learning behavior in real time and transforms it into meaningful cognitive and behavioral traits.


🔹 Core Capabilities

EduGenome AI:

  • Tracks real-time behavioral signals during learning
  • Uses eye tracking, response timing, and interaction analysis
  • Applies machine learning models to compute 24 learning traits
  • Visualizes those traits through a dynamic Genome Wheel
  • Generates personalized learning insights

🧬 The Learning Genome

The Learning Genome consists of 24 traits, grouped into four domains:


🧠 Cognitive Traits

How the brain processes information
(e.g., pattern recognition, memory retention, abstract thinking)


🎭 Behavioral Traits

How the student behaves emotionally and psychologically
(e.g., focus stability, persistence, confidence drift)


🎨 Learning Style Traits

How the student prefers to learn
(e.g., visual, auditory, textual, interactive)


⚡ Performance Traits

How the student performs under pressure and difficulty
(e.g., fatigue rate, speed-accuracy balance, difficulty adaptation)


Each trait is scored on a 0–100 scale and updates continuously as the student learns.
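The writeup does not pin down the exact update rule, but the continuous 0–100 update can be pictured as blending the running score with each new observation. A minimal sketch, assuming a hypothetical exponential-moving-average update with smoothing factor `alpha`:

```python
def update_trait(current: float, observed: float, alpha: float = 0.2) -> float:
    """Blend a freshly observed value into a running 0-100 trait score.

    `alpha` controls how quickly the score reacts to new behavior; the
    result is clamped so the trait always stays on the 0-100 scale.
    """
    blended = (1.0 - alpha) * current + alpha * observed
    return max(0.0, min(100.0, blended))


# Example: a focus-stability score of 70 drifting down during a weaker session.
score = 70.0
for observed in (55.0, 50.0, 60.0):
    score = update_trait(score, observed)
```

A small `alpha` keeps traits stable across sessions; a larger one lets a single session move them faster.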


🌍 Real-World Example

A student scoring “average” might actually:

  • Have strong conceptual understanding
  • Lose focus after 15 minutes
  • Learn best with diagrams
  • Experience rapid fatigue
  • Struggle with confidence after mistakes

EduGenome AI uncovers these hidden patterns and converts them into clarity, insight, and action.


🛠️ How We Built It

EduGenome AI is built as a full-stack, real-time AI system designed for scalability, low latency, and interpretability.


🔹 Frontend

  • React.js for modular, component-based UI
  • Tailwind CSS for rapid, clean interface design
  • D3.js for the Genome Wheel visualization
  • WebRTC for secure webcam access
  • Mediapipe for real-time eye tracking

The frontend handles:

  • User interaction
  • Quiz delivery
  • Webcam processing (local)
  • Real-time visualization
  • Insight presentation

🔹 Backend

  • FastAPI for high-performance, async APIs
  • Python for ML integration
  • Pydantic for strict validation
  • Uvicorn as the ASGI server

The backend manages:

  • Behavioral data ingestion
  • Feature extraction
  • ML inference
  • Trait scoring
  • API orchestration
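The flow above (ingestion → features → inference → scoring) can be sketched framework-free. The real backend uses FastAPI endpoints with Pydantic models, but the shape of one request's orchestration, with hypothetical field names and a toy scoring rule standing in for the LightGBM models, looks roughly like:

```python
from dataclasses import dataclass


@dataclass
class BehaviorBatch:
    """One ingested batch of raw signals (hypothetical fields; the real
    service validates these with Pydantic models instead)."""
    student_id: str
    response_times: list[float]  # seconds per question
    gaze_samples: list[float]    # normalized gaze positions

    def __post_init__(self) -> None:
        # Reject malformed payloads up front, as strict validation would.
        if not self.response_times:
            raise ValueError("empty batch")


def extract_features(batch: BehaviorBatch) -> dict[str, float]:
    """Turn raw signals into the numeric features the models consume."""
    mean_rt = sum(batch.response_times) / len(batch.response_times)
    return {
        "mean_response_time": mean_rt,
        "gaze_sample_count": float(len(batch.gaze_samples)),
    }


def run_models(features: dict[str, float]) -> dict[str, float]:
    """Stand-in for inference; the real system runs joblib-loaded
    LightGBM models here. Toy rule: faster answers -> higher score."""
    speed = 100.0 - 10.0 * features["mean_response_time"]
    return {"processing_speed": max(0.0, min(100.0, speed))}


def handle_batch(batch: BehaviorBatch) -> dict[str, float]:
    """Orchestrate one request: features -> inference -> trait scores."""
    return run_models(extract_features(batch))


traits = handle_batch(BehaviorBatch(student_id="s1",
                                    response_times=[2.0, 3.0],
                                    gaze_samples=[0.5, 0.6]))
```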

🔹 AI & Machine Learning Layer

EduGenome AI’s intelligence comes from multiple ML models working together:

  • LightGBM for predicting 24 learning traits
  • Scikit-learn for confusion detection and fatigue prediction
  • Custom feature engineering (40+ behavioral features)
  • Joblib for fast model loading

Raw behavioral signals are converted into structured features such as:

  • Response time variance
  • Eye movement stability
  • Blink rate
  • Hesitation delta
  • Error recovery speed

These features feed ML models that output interpretable trait scores.
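Two of the listed features can be computed directly from raw event timings. A hedged sketch: "hesitation delta" is interpreted here as the average gap between a question appearing and the student's first interaction, which may differ from the pipeline's actual definition:

```python
from statistics import fmean, pvariance


def response_time_variance(response_times: list[float]) -> float:
    """Spread of per-question answer times (seconds squared)."""
    return pvariance(response_times)


def hesitation_delta(first_interaction: list[float],
                     question_shown: list[float]) -> float:
    """Average pause between a question appearing and the first
    interaction. One plausible reading of 'hesitation delta'."""
    gaps = [i - s for i, s in zip(first_interaction, question_shown)]
    return fmean(gaps)


# Toy session with three questions:
rt_var = response_time_variance([2.0, 4.0, 3.0])
hes = hesitation_delta([1.2, 0.8, 1.0], [0.0, 0.0, 0.0])
```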


🔹 Visualization Engine

The Genome Wheel, built with D3.js, is the centerpiece of EduGenome AI.

It:

  • Displays 24 radial arcs (one per trait)
  • Uses color coding for trait categories
  • Animates changes in real time
  • Supports hover and interaction
  • Communicates complex AI outputs intuitively

⚠️ Challenges We Ran Into


1. Behavioral Data Is Noisy

Human behavior is unpredictable. Eye tracking data, in particular, can be inconsistent due to lighting, camera quality, or user movement.

We addressed this by:

  • Applying smoothing techniques
  • Designing fallback logic
  • Relying on patterns, not single signals
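As one concrete example of the smoothing step, a rolling median over the gaze signal suppresses single-frame glitches caused by lighting or head movement. A minimal sketch (the window size is a hypothetical choice):

```python
from statistics import median


def smooth_gaze(samples: list[float], window: int = 5) -> list[float]:
    """Rolling-median filter: each output is the median of the last
    `window` raw samples, so one outlier frame cannot spike the signal."""
    out = []
    for i in range(len(samples)):
        start = max(0, i - window + 1)
        out.append(median(samples[start:i + 1]))
    return out


# A tracking glitch at index 2: gaze jumps to 9.0 for a single frame.
raw = [0.50, 0.52, 9.00, 0.51, 0.49, 0.50]
smoothed = smooth_gaze(raw)
```

The glitch never reaches the feature pipeline, which is why patterns over windows are more trustworthy than any single signal.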

2. Mapping Behavior to Psychology

Converting raw signals into psychological traits required careful design. We had to:

  • Avoid oversimplification
  • Balance rule-based logic with ML prediction
  • Ensure traits remained interpretable

3. Real-Time Performance

Rendering live animations, running ML inference, and tracking behavior simultaneously required optimization to avoid lag.


4. Privacy and Ethics

Webcam usage raises privacy concerns. We ensured:

  • All eye tracking happens locally
  • No images or video are stored
  • Only anonymous numerical features are processed

🏆 Accomplishments That We’re Proud Of

  • ✅ Created a brand-new educational concept: Learning Genome
  • ✅ Integrated real-time eye tracking in a hackathon project
  • ✅ Built a scientific yet intuitive visualization
  • ✅ Designed 24 meaningful learning traits
  • ✅ Implemented a full AI data pipeline
  • ✅ Delivered a live real-time demo
  • ✅ Balanced psychology, AI, and UX
  • ✅ Built a system extensible beyond the hackathon

Most importantly, we didn’t just build an app —
we built a new way of understanding learners.


📘 What We Learned

  • Learning is behavioral, not just cognitive
  • AI is most powerful when it explains, not replaces
  • Visualization bridges the gap between complexity and understanding
  • Ethical design matters from day one
  • Strong ideas win hackathons more reliably than flashy features do
  • Education technology must center humans, not metrics

🔮 What’s Next for EduGenome AI

EduGenome AI is only at the beginning of its journey.


🚀 Planned Future Enhancements

  • 🧠 Expand to 100+ cognitive and emotional traits
  • 🤖 Create AI Learning Twins
  • 🥽 Integrate AR/VR learning environments
  • 🎧 Add voice emotion analysis
  • 📚 Integrate with LMS platforms
  • 📈 Track learning evolution over months and years
  • 🧬 Detect early learning challenges
  • ✨ Use LLMs to generate genome-based personalized content

Our vision is to make behavior-aware learning the global standard.

Built With

  • ai-assisted-prototyping
  • github
  • lovable-ai
  • no-code-ai-app-builder
  • prompt-engineering
  • ux-design
  • vercel