Inspiration

When we think of the senses, we think of sight, sound, or touch. But some of the most critical senses for human development are completely invisible. Proprioception (how our brain maps our body in 3D space) and vestibular processing (how we understand gravity and balance) are often the first to misfire in children with Sensory Processing Disorder (SPD), Autism, or Dyspraxia.

Because these struggles are invisible, diagnosis is frequently delayed past adolescence—long after the brain's critical neuroplastic window (ages 7-12) has closed. We wanted to build a speculative, future-forward tool that gamifies early detection, capturing biometric markers during the critical childhood window when occupational therapy can actually rewire the nervous system.

What it does

L.U.M.A. (Latent Unconscious Motor Assessment) is a clinical-grade telemetry tool disguised as a premium, atmospheric indie game. While a child plays, the app transforms commercial smartphone hardware into a medical diagnostic suite to invisibly quantify neuromotor deficits.

The application is split into a safe "Gameplay Loop" for the child, and a dual "Telemetry Hub" containing a simplified Parent Dashboard (focused on a Sensory Harmony Score) and an advanced Therapist Dashboard (focused on raw data exports and protocol configurations).

How we built it

We engineered L.U.M.A. as a modern web application optimized for mobile hardware. The entire UI and game loop were built using ReactJS and Tailwind CSS to ensure a hyper-responsive, cinematic "HD-2D" aesthetic without the overhead of a heavy game engine. We utilized Firebase to create a secure authentication gateway for the Therapist Dashboard.

To prove the speculative "Custom Protocol Engine," we built dynamic React state sliders that allow clinicians to override the game's physics in real-time. By connecting standard HTML range inputs to our React state, therapists can dynamically alter variables like fluid viscosity and capacitive touch thresholds, generating hyper-personalized sensory treatments.
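The slider-to-physics pipeline can be sketched as a plain function: slider inputs write a protocol config into React state, and the game loop applies that config each physics tick. This is a minimal illustration under our own assumptions; the parameter names (`fluidViscosity`, `touchThreshold`) and the exponential damping model are hypothetical stand-ins, not L.U.M.A.'s exact implementation.

```typescript
// Hypothetical protocol config that the therapist's range sliders write
// into React state (e.g. via setState on input change).
interface ProtocolConfig {
  fluidViscosity: number; // >= 0; higher values damp motion more strongly
  touchThreshold: number; // minimum normalized touch pressure to register
}

interface BodyState {
  x: number;  // position
  vx: number; // velocity
}

// One physics tick: viscosity damps velocity exponentially over the timestep,
// so moving a slider changes the "feel" of the game in real time.
function stepPhysics(body: BodyState, dt: number, cfg: ProtocolConfig): BodyState {
  const damping = Math.exp(-cfg.fluidViscosity * dt);
  const vx = body.vx * damping;
  return { x: body.x + vx * dt, vx };
}
```

With `fluidViscosity` at 0 the body glides freely; raising the slider makes the world feel thick and syrupy, which is the kind of hyper-personalized sensory adjustment the Protocol Engine exposes.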

Challenges we ran into

The biggest challenge was the visual "identity crisis" between making a fun game and a serious medical tool. We had to aggressively unify our art direction, stripping out generic vector assets and replacing them with a dark, atmospheric pixel-art aesthetic that matched the clinical seriousness of the telemetry dashboards.

Furthermore, balancing complex mathematical state-tracking within React's render lifecycle—while keeping the game running smoothly at 60 FPS—required precise optimization and component architecture.
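One common pattern for this problem (a sketch of the general technique, not necessarily L.U.M.A.'s exact code) is to accumulate high-frequency telemetry in a mutable buffer outside React state and flush it to state only periodically, so the 60 FPS loop never triggers a re-render per sample:

```typescript
// Accumulate per-frame samples in a plain buffer (in React this would live
// in a useRef) and invoke onFlush in batches, keeping setState calls rare.
function createSampleBuffer(
  flushEvery: number,
  onFlush: (samples: number[]) => void
) {
  let buffer: number[] = [];
  return {
    push(sample: number) {
      buffer.push(sample);
      if (buffer.length >= flushEvery) {
        onFlush(buffer); // e.g. setState(prev => [...prev, ...buffer])
        buffer = [];
      }
    },
  };
}
```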

Accomplishments that we're proud of

We are incredibly proud of the Clinical Telemetry Engine and the mathematical models driving it. Instead of just making a game, we successfully mapped specific hardware inputs to clinical outputs:

  • Vestibular Processing: Calculating Root Mean Square Error ($RMSE$) via the accelerometer to track balance dysregulation.
  • Musculoskeletal Force: Tracking Capacitive Touch Area Variance ($A = \pi \cdot \text{radiusX} \cdot \text{radiusY}$) to detect severe Dyspraxia based on how hard a child presses the screen.
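The two metrics above reduce to short, testable functions. The exact target signal and sampling windows are assumptions here; `radiusX`/`radiusY` correspond to the standard Touch API's contact-ellipse radii.

```typescript
// Root Mean Square Error of accelerometer samples against a target value
// (e.g. the ideal balance reading): sqrt(mean((x - target)^2)).
function rmse(samples: number[], target: number): number {
  const mse = samples.reduce((s, x) => s + (x - target) ** 2, 0) / samples.length;
  return Math.sqrt(mse);
}

// Contact-patch area from the Touch API's ellipse radii: A = pi * rX * rY.
function touchArea(radiusX: number, radiusY: number): number {
  return Math.PI * radiusX * radiusY;
}

// Population variance of a series of touch areas; high variance in press
// area over a session is the Dyspraxia marker described above.
function variance(xs: number[]): number {
  const mean = xs.reduce((s, x) => s + x, 0) / xs.length;
  return xs.reduce((s, x) => s + (x - mean) ** 2, 0) / xs.length;
}
```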

We are also extremely proud of our Hardware Safeguards. To protect children from sensory overload during rotational vestibular tests, we programmed an Auto-Abort feature: if the device tilt exceeds $30^{\circ}$, the simulation instantly shuts down to prevent nausea. Additionally, 100% of the JSON telemetry is processed on-device and persisted to local storage, ensuring strict patient data privacy with no cloud processing.
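The Auto-Abort check can be sketched as a pure function over the device orientation angles (`beta`/`gamma` from `DeviceOrientationEvent`, in degrees). Combining the two axes as a Euclidean magnitude is a simplifying assumption for illustration:

```typescript
const TILT_LIMIT_DEG = 30; // safety threshold from the Hardware Safeguards

// Approximate total tilt from vertical using the front-back (beta) and
// left-right (gamma) orientation angles; abort the simulation past 30 degrees.
function shouldAbort(betaDeg: number, gammaDeg: number): boolean {
  const tilt = Math.sqrt(betaDeg ** 2 + gammaDeg ** 2);
  return tilt > TILT_LIMIT_DEG;
}
```

In the app, a listener on the `deviceorientation` event would call this every sample and tear down the simulation the first time it returns `true`.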

What we learned

We learned that the hardware required for the next frontier of the "Quantified Self" movement already exists in our pockets. By creatively combining raw capacitive touch data, standard accelerometers, and variance-based statistical metrics, we can extract medical-grade insights from everyday commercial hardware without needing specialized, expensive medical equipment.

What's next for L.U.M.A. - Latent Unconscious Motor Assessment

Our immediate next step for scaling L.U.M.A. is expanding the Custom Protocol Engine. While our MVP utilizes parameter sliders, v2.0 will feature a complete visual node-builder, allowing licensed occupational therapists to drag-and-drop custom sensory levels tailored to a specific patient's diagnostic needs.

Beyond that, we plan to deploy L.U.M.A. into early education environments. By running periodic 10-minute play sessions in schools, we can democratize early detection, identifying neurodivergent children who heavily "mask" their symptoms in public and ensuring they receive support within that critical neuroplastic window.
