Inspiration

Link to presentation (PPT): https://drive.google.com/file/d/1IGsFo71twWQ_OFajoJbS9xKGLf-sFPgO/view?usp=sharing

During a visit to my nani’s house, I watched my four-year-old brother cry inconsolably for a long time. Despite our efforts, none of us could understand what he needed. It was only later that we realized he was simply hungry but couldn’t express it. This moment struck a chord—seeing how a young child’s unspoken needs could cause distress made me wonder: What if we could develop a system that reads brain signals and translates them into understandable emotions and basic needs?

Later, at a care center for children with special needs, I observed many individuals struggling to communicate their feelings and needs. This inspired me to create a solution that bridges the gap between unspoken emotions and caregivers’ understanding.


What Syntra Does

Syntra is an assistive AI system designed to interpret EEG brainwave data and translate it into clear, caregiver-friendly insights about an individual's emotional states and basic needs. Using simple, intuitive visuals—emojis, icons, and straightforward text—it provides real-time predictions of emotional states such as happy, sad, calm, or anxious, and of basic needs such as hungry or tired.

This system aims to empower caregivers and families to better understand and respond to those who cannot verbally communicate, fostering more compassionate and effective care.


How We Built It

Data Processing & Machine Learning

  • Collected raw EEG signals from multiple channels (32 channels, 8000+ data points).
  • Preprocessed EEG features using signal filtering, normalization, and feature extraction techniques in Python with NumPy and scikit-learn.
  • Trained a robust Support Vector Machine (SVM) classifier to accurately predict emotional and need states from the features.
  • Saved the trained model as a .pkl file for deployment.
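The pipeline above can be sketched roughly as follows. This is a minimal, self-contained illustration, not the project's exact code: the synthetic random data stands in for real EEG recordings, and the per-channel mean/variance features are placeholder examples of feature extraction, not the features Syntra actually uses.

```python
import numpy as np
import joblib
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical shapes: 200 trials x 32 channels x 250 samples of raw EEG.
rng = np.random.default_rng(0)
raw = rng.standard_normal((200, 32, 250))
labels = rng.integers(0, 4, size=200)  # e.g. happy / sad / hungry / tired

# Placeholder feature extraction: per-channel mean and variance.
features = np.concatenate([raw.mean(axis=2), raw.var(axis=2)], axis=1)

# Normalize the features and fit an SVM classifier, then save it as a .pkl file.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
model.fit(features, labels)
joblib.dump(model, "syntra_svm.pkl")
```

Bundling the scaler and classifier into one scikit-learn pipeline means the saved `.pkl` applies the same normalization at prediction time that was used during training.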

Backend Development

  • Developed a lightweight backend server using Flask that exposes an API for real-time predictions.
  • The backend accepts EEG data uploads, processes them through the ML pipeline, and returns the interpreted emotional state or need.
  • Ensured the API is optimized for quick responses, making real-time interactions feasible.
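A Flask endpoint of this kind might look like the sketch below. The route name `/predict`, the JSON payload shape, and the label mapping are all illustrative assumptions; in the real app the trained SVM would be loaded from the saved `.pkl` file, but here a tiny stand-in model is fitted inline so the sketch runs on its own.

```python
import numpy as np
from flask import Flask, jsonify, request
from sklearn.svm import SVC

app = Flask(__name__)

# Stand-in for loading the trained model from the saved .pkl file.
rng = np.random.default_rng(0)
model = SVC().fit(rng.standard_normal((40, 64)), rng.integers(0, 4, 40))
LABELS = {0: "happy", 1: "sad", 2: "hungry", 3: "tired"}  # illustrative mapping

@app.route("/predict", methods=["POST"])
def predict():
    # The frontend uploads extracted EEG features as JSON: {"features": [...]}
    features = np.asarray(request.get_json()["features"], dtype=float)
    state = int(model.predict(features.reshape(1, -1))[0])
    return jsonify({"prediction": LABELS.get(state, "unknown")})

# app.run(port=5000)  # uncomment to serve locally
```

The frontend's JavaScript would then POST the extracted features to `/predict` and render the returned label as an emoji or icon.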

Frontend Design

  • Crafted a clean, accessible web interface using HTML, CSS, and JavaScript.
  • Designed a simple upload system for EEG data files, paired with clear visualization of predictions via emojis, icons, and minimal text.
  • Implemented dynamic charts and feedback elements to make the data and predictions easy to interpret for caregivers and family members.

Deployment & Hosting

  • Hosted the frontend on GitHub Pages, ensuring free, reliable access for users worldwide.

Challenges We Overcame

  • Data Collection & Testing: Limited access to live EEG hardware initially made testing challenging.
  • EEG Signal Understanding: Learning how different brainwaves correlate with emotional states required extensive research and experimentation.
  • Mapping Abstract Data: Converting complex brain signals into simple, actionable insights demanded careful feature engineering and model tuning.
  • Keeping It Practical: Balancing technical sophistication with accessibility meant designing an interface that’s both user-friendly and informative.

Achievements & Milestones

  • Developed a fully functional ML pipeline capable of classifying emotions and needs from EEG data.
  • Created a caregiver-centric web dashboard that simplifies complex brain data into intuitive predictions.
  • Transformed a deeply personal inspiration into a tangible prototype that can significantly impact lives.
  • Demonstrated how AI can serve as a bridge for communication, especially for those who are non-verbal or have difficulty expressing themselves.

Key Learnings

  • The importance of meticulous preprocessing and feature extraction in EEG signal classification.
  • The critical role of empathy and accessibility in designing AI systems for vulnerable populations.
  • How to effectively integrate Python-based machine learning models with a simple, static frontend via RESTful APIs.
  • That real-world AI solutions must prioritize human needs and usability over technical complexity.

What's Next for Syntra?

  • Real-Time EEG Integration: Incorporate portable EEG hardware for live, continuous monitoring and prediction.
  • Expanded Emotional & Needs Categories: Broaden the system’s vocabulary to include more nuanced emotional states and physical needs.
  • Collaborations & Field Testing: Partner with care centers and therapists to gather feedback, refine accuracy, and validate usability in real-world settings.
  • Mobile App Development: Extend Syntra into a mobile application for caregivers and families, making it accessible anytime, anywhere.
  • Multilingual & Voice Output: Add support for multiple languages and enable voice output, allowing caregivers to hear the translated needs aloud for immediate response.

Technical Stack Overview

  • Backend: Python, NumPy, scikit-learn, Flask API for real-time predictions.
  • Frontend: HTML, CSS, JavaScript for an intuitive, accessible user interface.
  • Hosting: Frontend deployed on GitHub Pages; backend hosted on a lightweight server environment.

This project embodies a heartfelt intersection of AI technology and human empathy, aiming to give voice to those who need it most. With continued development and collaboration, Syntra holds the potential to revolutionize how caregivers understand and support individuals with communication challenges.


The frontend is live on GitHub Pages. Check it out!
