Inspiration

The internet is not designed for every brain. For individuals with ASD, ADHD, and Dyslexia, everyday browsing can quickly become overwhelming due to cluttered layouts, distracting elements, and poor readability. While features like dark mode exist, there is no truly intelligent system that adapts in real time to a user’s mental and sensory state. This inspired us to create Neuro-Vibe Assistant—a tool that makes the web feel calm, accessible, and personalized for neurodivergent users.

What it does

Neuro-Vibe Assistant is a neuro-adaptive Chrome extension that transforms web browsing based on the user’s real-time cognitive and emotional state.

  • Detects sensory overload, distraction, or frustration using multimodal inputs (camera, audio, screen).
  • Dynamically adjusts UI elements like colors, layout, and animations.
  • Provides tailored support:
      • ASD: low-stimulation UI, calm tones, reduced clutter
      • ADHD: focus mode, distraction blocking, guided attention
      • Dyslexia: readability enhancements, font changes, text-to-speech
  • Automatically activates protective features like dark mode, tab muting, and simplified layouts.
  • Supports multilingual interaction for accessibility across users.

How we built it
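To illustrate the kind of mapping this adaptation involves, here is a minimal sketch of translating a detected state into CSS overrides that a content script could apply. All names here (`PROFILES`, `adaptationsFor`) are illustrative, not the extension's actual code:

```javascript
// Hypothetical sketch: map a detected cognitive/sensory state to CSS
// overrides that a Chrome content script could write onto the page.
const PROFILES = {
  // Sensory overload: mute colors, dim slightly, stop animations.
  overload:   { filter: "saturate(60%) brightness(85%)", animation: "none" },
  // Distraction: desaturate the page to reduce competing stimuli.
  distracted: { filter: "grayscale(40%)",                animation: "none" },
  // Neutral: leave the page untouched.
  neutral:    { filter: "none",                          animation: "" },
};

// Pick the override set for a state, falling back to "neutral".
function adaptationsFor(state) {
  return PROFILES[state] || PROFILES.neutral;
}

// A content script would then apply the result, e.g.:
// document.documentElement.style.filter = adaptationsFor(state).filter;
```

Keeping the state-to-style mapping in a plain lookup table like this makes each profile easy to tune per condition without touching the detection logic.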

We combined modern AI with browser technologies to create a real-time adaptive system:

  • AI Engine: Gemini 2.0/2.5 Multimodal Live API for processing audio, video, and screen data simultaneously.
  • Computer Vision: MediaPipe Face Landmarker to detect facial cues like stress, attention loss, or fatigue.
  • Frontend: HTML5, CSS3, and JavaScript for dynamic UI adaptation.
  • Extension Framework: Chrome Extension Manifest V3 for seamless browser integration.
  • Real-time Communication: WebSockets for continuous multimodal data streaming.

Challenges we ran into

  • Handling real-time multimodal data without performance lag.
  • Ensuring user privacy while using camera and microphone inputs.
  • Accurately interpreting emotional and cognitive states from limited signals.
  • Designing UI changes that help users without being intrusive or distracting.
  • Synchronizing multiple inputs (audio, video, screen) effectively.

Accomplishments that we're proud of

  • Built a fully functional neuro-adaptive Chrome extension.
  • Successfully integrated multimodal AI for real-time responsiveness.
  • Created personalized interventions for multiple neurodivergent conditions.
  • Designed a system that improves accessibility beyond traditional tools.
  • Delivered a meaningful solution with real-world impact.

What we learned

  • Multimodal AI can significantly enhance accessibility when used responsibly.
  • Simplicity in UI design is critical for neurodivergent users.
  • Real-time systems require careful optimization and efficient data handling.
  • Ethical considerations (privacy, consent) are just as important as technical features.
  • Building inclusive tech requires empathy-driven design thinking.

What's next for Neuro-Vibe

  • Improve emotion detection accuracy with better models and datasets.
  • Add wearable integration (e.g., heart rate sensors) for deeper biomarker analysis.
  • Expand to mobile browsers and desktop applications.
  • Introduce user customization and learning preferences.
  • Partner with accessibility organizations and educators for real-world deployment.
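One common way to handle continuous camera streaming over WebSockets without performance lag, as described above, is to gate outgoing frames to a target rate before they reach the socket. The sketch below shows that idea under assumed names (`makeFrameGate` is hypothetical, not the shipped implementation):

```javascript
// Hypothetical sketch: gate outgoing video frames to a maximum rate so a
// continuous WebSocket stream doesn't flood the backend or stall the page.
function makeFrameGate(maxFps) {
  const minIntervalMs = 1000 / maxFps;
  let lastSentMs = -Infinity; // no frame sent yet

  // Returns true if a frame captured at `nowMs` should be transmitted.
  return function shouldSend(nowMs) {
    if (nowMs - lastSentMs >= minIntervalMs) {
      lastSentMs = nowMs;
      return true;
    }
    return false; // drop this frame; a newer one will follow shortly
  };
}

// Usage in a capture loop (sketch):
//   const gate = makeFrameGate(5); // at most 5 frames per second
//   if (gate(performance.now())) ws.send(frameBytes);
```

Dropping frames at the source like this keeps latency bounded: the model always sees a recent frame rather than a growing backlog of stale ones.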

Built With

  • accessibility
  • assistive-tech
  • chrome
  • computer-vision
  • css3
  • es6
  • face-landmarker
  • gemini-2.0
  • gemini-2.5
  • html5
  • javascript
  • manifest-v3
  • mediapipe
  • multimodal-ai
  • websocket