Inspiration
The term "Children from the Stars" is often used to describe children with Autism Spectrum Disorder (ASD)—individuals who possess beautiful, unique inner worlds but may find the unspoken rules of Earth's social interactions confusing. While their systemizing abilities are often strong, the Theory of Mind (ToM)—the ability to attribute mental states to oneself and others—can be a significant challenge.
I was inspired to build Lumi to bridge this gap. I wanted to move away from sterile, clinical therapy tools and create a warm, gamified "Space Adventure" that breaks down complex facial expressions into learnable, discrete patterns. My goal was to create a safe, predictable digital environment where these children can practice social-emotional skills at their own pace, without the anxiety of real-time social pressure.
What it does
Lumi is a dual-interface web application designed for two distinct users: the Explorer (Child) and the Guardian (Parent).
For the Child (The Explorer): Lumi turns emotion learning into a space exploration game.
- Face Planet (Puzzle Mode): A deconstructionist approach where children assemble facial features (eyes and mouths) to match specific emotions, learning that "Happy" is a system of raised eyebrows and an upward curve.
- Story Nebula (Scenario Mode): Applies "Theory of Mind" by presenting social scenarios (e.g., "Your balloon popped") and asking the child to infer the correct emotional reaction.
- Mirror Star (Imitation Mode): An AR-style camera interface where children practice mimicking expressions. It uses a confidence meter to provide positive reinforcement for their efforts.
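The "emotion as a system of features" idea behind Face Planet can be sketched as a simple lookup. This is an illustrative sketch only; the type names and feature values are invented, not Lumi's actual code.

```typescript
// Hypothetical sketch of Face Planet's feature-to-emotion mapping:
// each emotion is treated as a discrete "recipe" of facial features,
// so "Happy" = raised eyebrows + an upward mouth curve.

type Eyes = "raised" | "droopy" | "furrowed" | "wide";
type Mouth = "upCurve" | "downCurve" | "tight" | "open";
type Emotion = "Happy" | "Sad" | "Angry" | "Surprised";

const recipes: Record<Emotion, { eyes: Eyes; mouth: Mouth }> = {
  Happy: { eyes: "raised", mouth: "upCurve" },
  Sad: { eyes: "droopy", mouth: "downCurve" },
  Angry: { eyes: "furrowed", mouth: "tight" },
  Surprised: { eyes: "wide", mouth: "open" },
};

// Check the child's assembled face against the target emotion's recipe.
function matchesEmotion(target: Emotion, eyes: Eyes, mouth: Mouth): boolean {
  const r = recipes[target];
  return r.eyes === eyes && r.mouth === mouth;
}
```

Keeping the mapping discrete like this is what makes the puzzle predictable: the same pieces always produce the same answer.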
For the Parent (The Lighthouse): A comprehensive dashboard to monitor emotional growth.
- Proficiency Radar: Visualizes the child's strengths and weaknesses across four core emotions (Happy, Sad, Angry, Surprised).
- Training Log: Incorporates the ABC Model (Antecedent-Behavior-Consequence). Parents can log daily moods and identify specific triggers (like noise, routine changes, or hunger) to correlate them with learning progress.
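A Training Log entry in the ABC model might be shaped like the sketch below. The field names and trigger list are assumptions for illustration; only the ABC structure and the trigger examples (noise, routine changes, hunger) come from the description above.

```typescript
// Hypothetical shape of one Training Log entry using the ABC model
// (Antecedent-Behavior-Consequence), plus a parent-rated daily mood.

type Trigger = "noise" | "routineChange" | "hunger" | "other";

interface AbcEntry {
  date: string;            // ISO date, e.g. "2024-05-01"
  antecedent: Trigger;     // what happened before the behavior
  behavior: string;        // observed behavior / mood description
  consequence: string;     // what followed
  mood: 1 | 2 | 3 | 4 | 5; // parent-rated daily mood
}

// Count how often each trigger appears, to surface likely correlations
// between specific antecedents and difficult days.
function triggerCounts(log: AbcEntry[]): Record<Trigger, number> {
  const counts: Record<Trigger, number> = {
    noise: 0, routineChange: 0, hunger: 0, other: 0,
  };
  for (const e of log) counts[e.antecedent] += 1;
  return counts;
}
```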
How I built it
Lumi was built with a focus on performance, accessibility, and sensory safety.
- Core Stack: I paired with Gemini-3-pro as a coding assistant, and we used React 19 with TypeScript for a robust, type-safe architecture. The build toolchain is powered by Vite for lightning-fast development.
- Sensory-Safe UI: We used Tailwind CSS to implement a strict "Morandi" color palette (low-saturation, muted tones like Soft Star Blue and Sage Green). This was a critical design choice to prevent the sensory overstimulation that is common in ASD.
- Data Visualization: We integrated Recharts to render the Radar Charts in the parent dashboard, making complex progress data easy to read.
- State Management: We used React's "Lifted State" pattern to synchronize the child's gameplay data instantly with the parent's dashboard without needing a complex backend for this MVP.
- Hardware Access: Utilized the browser's navigator.mediaDevices API to access the webcam for the Magic Mirror feature.
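The camera hookup can be sketched with the standard getUserMedia call. The constraint values and helper names below are illustrative assumptions, not Lumi's actual code.

```typescript
// Sketch of the Magic Mirror camera hookup via the standard
// navigator.mediaDevices.getUserMedia API.

const mirrorConstraints: MediaStreamConstraints = {
  // Front-facing camera at a modest resolution keeps things fast.
  video: { facingMode: "user", width: { ideal: 640 }, height: { ideal: 480 } },
  audio: false, // no audio needed for expression practice
};

async function startMirror(video: HTMLVideoElement): Promise<MediaStream> {
  // Throws if the user denies permission or no camera is present.
  const stream = await navigator.mediaDevices.getUserMedia(mirrorConstraints);
  video.srcObject = stream;
  await video.play();
  return stream;
}

function stopMirror(stream: MediaStream): void {
  // Always release the camera when leaving the Mirror Star screen.
  stream.getTracks().forEach((t) => t.stop());
}
```

Releasing the tracks on exit matters: leaving the camera light on after the child navigates away would undermine the "safe, predictable" feel of the app.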
Challenges I ran into
- The "CSS Collapse": There were significant layout issues where the game canvas would collapse or the parent dashboard would lose its scrollability on mobile devices. I had to let Gemini refactor the main layout from standard block elements to a flex-col strategy with h-screen so the app worked correctly across screen sizes without hiding crucial navigation buttons.
- Balancing Engagement vs. Stimulation: Designing for neurodiversity is a delicate balance. I wanted the app to be fun (using animations and particles), but too much movement can be distressing for the target audience, so Gemini and I carefully tuned the "Star Background" and animation speeds to be calming rather than chaotic.
- Simulating AI: For the MVP, creating a responsive "Magic Mirror" experience that felt rewarding without heavy machine learning models was tricky. We implemented a simulated confidence algorithm to demonstrate the user flow before integrating the heavy TensorFlow.js models.
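One plausible way to fake a responsive confidence meter without any ML is to ease the displayed score toward a target with a little jitter so the needle feels "alive". This is an assumed sketch of the idea, not the actual simulated algorithm:

```typescript
// Plausible sketch (not the real implementation) of a simulated
// confidence meter: the score eases toward a target value each tick,
// with a small random wobble, then clamps to the 0..100 range.

function nextConfidence(
  current: number,   // current meter value, 0..100
  target: number,    // e.g. 85 while the child holds the expression
  easing = 0.15,     // fraction of the remaining gap closed per tick
  jitter = 3,        // max random wobble per tick, in either direction
): number {
  const eased = current + (target - current) * easing;
  const wobbled = eased + (Math.random() * 2 - 1) * jitter;
  return Math.min(100, Math.max(0, wobbled));
}
```

Calling this on every animation frame produces a meter that climbs smoothly while the child tries, which is enough to validate the reward loop before dropping in a real model.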
Accomplishments that I'm proud of
- Bilingual Support: I successfully implemented a full internationalization system (English and Chinese) from day one, making the app accessible to a wider demographic.
- The "Lighthouse" Dashboard: I am particularly proud of the Parent Dashboard. Transforming raw game inputs into a clean Radar Chart and a detailed Training Log makes the app genuinely useful for tracking behavioral trends over time.
- Seamless Navigation: I created a fluid experience where a user can switch between the "Child's Space" and the "Parent's Lighthouse" instantly, while each mode keeps its own distinct aesthetic.
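A dictionary-based lookup is the simplest way to get bilingual support working from day one. The sketch below illustrates the pattern; the keys and strings are invented, not Lumi's actual message catalog.

```typescript
// Minimal sketch of a dictionary-based EN/ZH lookup with safe fallbacks.

type Locale = "en" | "zh";

const messages: Record<Locale, Record<string, string>> = {
  en: { "mode.facePlanet": "Face Planet", "emotion.happy": "Happy" },
  zh: { "mode.facePlanet": "表情星球", "emotion.happy": "开心" },
};

function t(locale: Locale, key: string): string {
  // Fall back to English, then to the raw key, so a missing
  // translation never crashes the UI mid-session.
  return messages[locale][key] ?? messages.en[key] ?? key;
}
```

The fallback chain matters more than it looks: for an audience that depends on predictability, a blank label or a crash is far worse than an untranslated string.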
What I learned
- Accessibility is more than screen readers: I learned that accessibility also encompasses sensory accessibility. Choosing the right colors and fonts (Quicksand) is just as important as code structure.
- React 19 Features: I got hands-on experience with the latest React patterns and efficient state management in a TypeScript environment.
- The importance of Feedback Loops: In gamification, immediate visual feedback (like the shaking head for a wrong answer or the bouncing star for a correct one) is crucial for retaining the attention of children with ASD.
What's next for Lumi
- Real-time AI Integration: I plan to replace the simulated camera logic with Google Gemini API or TensorFlow.js FaceMesh to provide real-time, accurate feedback on the child's facial expressions.
- Cloud Sync: Integrating a backend (like Firebase or Supabase) to allow parents to save progress across devices.
- Voice Analysis: Adding a module to help children recognize emotional tone of voice, not just facial expressions.
- Therapist Export: Adding a feature to export the "Training Log" as a PDF to share with therapists and educators.