Inspiration
Neurological and mental health conditions affect millions of people worldwide, yet early detection often comes too late. Subtle signs of stress, depression, or cognitive decline are usually overlooked until they escalate into serious issues. I was inspired to build an AI-powered companion that detects these early signals, connects individuals with caregivers, and delivers personalized cognitive wellness guidance.
What it does
NeuroLens is an AI-driven platform that:
- Detects early signs of neurological and mental health issues using speech patterns, facial expressions, and daily interaction data.
- Connects users with caregivers through a secure dashboard and empathetic AI chatbot.
- Personalizes cognitive wellness plans with actionable insights, lifestyle tips, and progress tracking.
Key features include:
- Real-time cognitive health score
- Interactive mood/emotion diary
- 3D brain visualization where regions light up based on user state
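The real-time cognitive health score can be thought of as a fusion of per-modality signals. Here is a minimal sketch of one way to do that; the weights, function name, and graceful-degradation behavior are illustrative assumptions, not the actual implementation:

```python
# Hypothetical sketch: fuse per-modality scores (each in [0, 1]) into a
# single 0-100 cognitive health score. Weights are illustrative only.
MODALITY_WEIGHTS = {"speech": 0.4, "face": 0.35, "interaction": 0.25}

def cognitive_health_score(scores: dict) -> float:
    """Weighted average of available modality scores, scaled to 0-100.

    Missing modalities are skipped and the remaining weights renormalized,
    so the score degrades gracefully when e.g. the camera is off.
    """
    available = {m: s for m, s in scores.items() if m in MODALITY_WEIGHTS}
    if not available:
        raise ValueError("no modality scores provided")
    total_weight = sum(MODALITY_WEIGHTS[m] for m in available)
    fused = sum(MODALITY_WEIGHTS[m] * s for m, s in available.items())
    return round(100 * fused / total_weight, 1)
```

A fusion step like this is what lets a single dashboard number summarize speech, facial, and interaction inputs at once.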
How I built it
- Frontend: Next.js + Tailwind CSS for a responsive, modern UI with real-time graphs and animations
- Backend: FastAPI for data processing and AI model inference
- AI models:
  - Speech emotion recognition trained on the RAVDESS dataset
  - Facial expression detection (CNN + OpenCV)
  - LLM-powered chatbot for personalized, empathetic conversations
- Database: Supabase (PostgreSQL) for secure data storage
- Deployment: Vercel (frontend) + AWS (backend)
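To illustrate the kind of low-level input a speech emotion model consumes, here is a hedged NumPy sketch that computes two classic frame-level features, RMS energy and zero-crossing rate. A real RAVDESS pipeline would likely add richer features (MFCCs, pitch); the function name and frame sizes are assumptions for illustration:

```python
import numpy as np

def frame_features(signal: np.ndarray, frame_len: int = 2048, hop: int = 512):
    """Per-frame RMS energy and zero-crossing rate for a mono audio signal.

    These are two of the simplest prosodic cues used in speech emotion
    recognition; louder, faster speech tends to raise both.
    """
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        # RMS energy: overall loudness of the frame
        rms = float(np.sqrt(np.mean(frame ** 2)))
        # Zero-crossing rate: fraction of adjacent sample pairs changing sign
        zcr = float(np.mean(np.abs(np.diff(np.sign(frame))) > 0))
        frames.append((rms, zcr))
    return frames
```

Sequences of such feature vectors are what a trained classifier maps to emotion labels before the backend fuses them with the other modalities.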
Challenges I ran into
- Combining multimodal AI inputs (speech + face + text) into one seamless platform
- Maintaining data privacy and security for sensitive health data
- Balancing AI accuracy with real-time performance
- Designing a UI that feels both scientifically robust and friendly for everyday users
Accomplishments that I'm proud of
- Built a working prototype that detects emotions from both speech and facial inputs
- Designed an AI chatbot that provides dynamic, personalized wellness support
- Created a 3D gamified brain visualization to make health insights engaging
- Developed a caregiver dashboard for secure patient–family interaction
What I learned
- The power of multimodal AI in capturing subtle human signals
- How to translate complex neuroscience concepts into accessible applications
- The importance of designing with privacy, empathy, and usability in mind
What’s next
- Integrating wearable sensor data (EEG, heart rate, sleep) for richer insights
- Training models on larger, more diverse datasets for better accuracy
- Implementing HIPAA/GDPR compliance for real-world clinical use
- Partnering with healthcare providers and mental health organizations for pilots
Built With
- amazon-web-services
- cnn-models
- fastapi
- llm
- next.js
- opencv
- postgresql
- python
- ravdess-dataset
- supabase
- tailwind-css
- vercel