Inspiration

Our inspiration for Autism-connect came from the growing need for personalized, accessible support for individuals with autism in social situations. We recognized that many people on the autism spectrum struggle with interpreting social cues and navigating complex social interactions. By leveraging AI and mobile technology, we saw an opportunity to create a discreet, real-time tool that could provide immediate support and guidance, empowering users to build confidence and improve their social skills.

What it does

Autism-connect is an AI-driven mobile web application that serves as a personalized digital social coach for individuals with autism. The app uses computer vision to detect facial expressions and body language, and natural language processing (NLP) to analyze live conversations. It provides real-time, subtle feedback to users during social interactions, helping them interpret social cues and respond appropriately. Key features include:

  1. Real-time facial expression detection
  2. NLP-based conversation analysis
  3. Customizable, non-intrusive feedback delivery
  4. Progress tracking and historical analysis
  5. Personalized user profiles with customizable interface options
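
The first three features above meet in a single mapping step: face-api.js returns an expressions object with probabilities for `happy`, `sad`, `angry`, `fearful`, `disgusted`, `surprised`, and `neutral`, and the app converts the dominant expression into a short, non-intrusive cue. A minimal illustrative sketch (the cue wording and the 0.5 confidence threshold are our own assumptions, not the production logic):

```javascript
// Map a face-api.js-style expressions object to a subtle on-screen cue.
// The expression keys match what face-api.js returns from
// detectSingleFace(...).withFaceExpressions(); the cue text and the
// threshold below are illustrative assumptions.
const CUES = {
  happy: "They seem pleased — a smile back works well.",
  sad: "They may be upset — a gentler tone could help.",
  angry: "They may be frustrated — consider slowing down.",
  fearful: "They may be anxious — reassure or give space.",
  disgusted: "They may be uncomfortable — consider a topic change.",
  surprised: "They seem surprised — pausing gives them time.",
  neutral: "No strong signal — keep the conversation going.",
};

function cueForExpressions(expressions, threshold = 0.5) {
  // Pick the expression with the highest probability.
  const [top, score] = Object.entries(expressions).reduce((best, cur) =>
    cur[1] > best[1] ? cur : best
  );
  // Below the threshold we stay quiet rather than overwhelm the user.
  if (score < threshold) return null;
  return { expression: top, cue: CUES[top] };
}
```

Keeping a threshold under which the app says nothing at all is what makes the feedback non-intrusive: a weak or ambiguous reading produces no cue rather than a noisy guess.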

How we built it

We built Autism-connect using a modern, scalable tech stack:

  1. Frontend: React.js with TailwindCSS for a responsive, mobile-first design
  2. Backend: Node.js with Express.js
  3. Database: MongoDB Atlas for cloud-hosted data storage
  4. Authentication: Auth0 for secure user management
  5. AI/ML: face-api.js for facial expression detection, Hugging Face Transformers for NLP
  6. Sensitive Data Encryption: Midnight blockchain
  7. Deployment: Amazon AWS (EC2, Lambda, S3) for scalable cloud hosting

System Architecture

Challenges we ran into

  1. Balancing real-time performance with accuracy in AI models on mobile devices
  2. Ensuring user privacy and data security while providing personalized feedback
  3. Designing an intuitive, accessible interface for neurodiverse users
  4. Implementing non-intrusive feedback mechanisms that don't overwhelm users
  5. Integrating multiple AI models (facial expression and NLP) seamlessly

Accomplishments that we're proud of

  1. Successfully implemented real-time facial expression detection and conversation analysis
  2. Created a mobile-first, accessible design tailored for neurodiverse users
  3. Integrated blockchain technology for secure, encrypted storage of sensitive user data
  4. Developed a scalable architecture capable of handling real-time processing and feedback
  5. Designed a customizable feedback system that adapts to individual user preferences

What we learned

  1. The importance of user-centered design in creating accessible technology
  2. Techniques for optimizing AI model performance on mobile devices
  3. Strategies for ensuring data privacy and security in healthcare-adjacent applications
  4. The complexities of interpreting and providing feedback on social interactions
  5. The potential of AI and mobile technology to support individuals with autism

What's next for Autism-connect

  1. Expand the range of detectable social cues and expressions
  2. Implement more advanced NLP models for deeper conversation analysis
  3. Develop a companion app for caregivers or therapists to track user progress
  4. Integrate with wearable devices for more discreet feedback delivery
  5. Conduct extensive user testing and gather feedback from the autism community
  6. Explore partnerships with autism support organizations and educational institutions
  7. Implement machine learning to personalize feedback based on individual user patterns and preferences
