Inspiration

The internet is not built equally for everyone. Individuals with ASD, ADHD, and Dyslexia often face sensory overload, distraction, and readability challenges while browsing. We were inspired to create a solution that doesn’t force users to adapt to technology, but instead makes technology adapt to the user’s brain in real time. Neuro-Vibe was born to make digital experiences more inclusive, calming, and personalized.

⚙️ What it does

Neuro-Vibe Assistant is a neuro-adaptive Chrome extension that intelligently adjusts web content based on the user’s emotional and cognitive state.

- Detects stress, distraction, and focus levels using facial cues, voice, and screen activity
- Dynamically modifies the UI (colors, fonts, layout) to reduce sensory overload
- Provides focus tools for ADHD (spotlights, distraction blocking)
- Enhances readability for Dyslexia (font changes, spacing, text-to-speech)
- Activates calming modes for ASD (low-arousal visuals, reduced stimuli)
- Automatically switches to protective modes during sensory overload

🛠️ How we built it

We combined modern AI and web technologies to create a real-time adaptive system:

- Gemini 2.0/2.5 Multimodal Live API for processing audio, video, and screen input simultaneously
- MediaPipe Face Landmarker for detecting facial expressions and behavioral signals
- Chrome Extension (Manifest V3) for seamless browser integration
- JavaScript (ES6+), HTML5, CSS3 for UI and interaction
- WebSockets for real-time communication with AI models
- Modular architecture with components for media handling, AI processing, and UI adaptation

⚡ Challenges we ran into

- Handling real-time multimodal data without latency
- Ensuring user privacy while processing camera and microphone inputs
- Designing non-intrusive UI changes that help without distracting
- Accurately mapping biomarkers to emotional states
- Managing browser performance with continuous monitoring

🏆 Accomplishments that we're proud of

- Built a fully working real-time neuro-adaptive system
- Successfully integrated multimodal AI (vision + audio + screen)
- Created meaningful features for ASD, ADHD, and Dyslexia support
- Designed a user-friendly and calming interface
- Delivered a project with a strong social-impact and accessibility focus

📚 What we learned

- Multimodal AI can significantly improve accessibility when applied thoughtfully
- Real-time systems require careful optimization and efficient architecture
- Accessibility is not just a feature; it should be a core design principle
- Small UI changes can have a huge impact on user comfort and focus
- Ethical AI and privacy considerations are critical in human-centered applications

🚀 What's next for Neuro-Vibe

- Add personalized learning models for each user
- Integrate wearable device data (heart rate, stress levels)
- Expand to mobile apps and desktop environments
- Introduce offline/low-latency AI models
- Build a dashboard for tracking focus and well-being trends
- Collaborate with healthcare and accessibility experts for validation
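The detection-to-adaptation loop described above can be sketched as a small decision layer plus a CSS override builder. This is an illustrative sketch, not Neuro-Vibe's actual code: the function names, settings fields, and threshold values are our own assumptions.

```javascript
// Illustrative sketch only — pickAdaptationMode, buildReadabilityCss, and
// all thresholds below are assumed names/values, not the project's real API.

// Map per-frame signal scores (0..1) from the detection layer to a UI mode.
function pickAdaptationMode({ stress, distraction, readingDifficulty }) {
  if (stress > 0.8) return "protective";             // sensory overload: strip stimuli
  if (stress > 0.5) return "calming";                // low-arousal visuals
  if (distraction > 0.6) return "focus";             // spotlight, block distractions
  if (readingDifficulty > 0.5) return "readability"; // font, spacing, text-to-speech
  return "default";
}

// Build a CSS override block for the readability mode.
function buildReadabilityCss({ letterSpacing = 0.12, lineHeight = 1.8 } = {}) {
  return [
    "body, body * {",
    "  font-family: 'OpenDyslexic', sans-serif !important;",
    `  letter-spacing: ${letterSpacing}em !important;`,
    `  line-height: ${lineHeight} !important;`,
    "}",
  ].join("\n");
}

// In a Manifest V3 content script, the override could be injected like this:
// const style = document.createElement("style");
// if (pickAdaptationMode(scores) === "readability") {
//   style.textContent = buildReadabilityCss();
//   document.head.appendChild(style);
// }
```

Keeping the mode decision separate from the style injection mirrors the modular architecture above: the detection layer only emits scores, and each adaptation mode owns its own DOM changes.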
