We discovered that productivity tools measure time, tasks, and output. But they ignore the body. KineticSense introduces a new sense: the ability to understand what your movement is telling you about how you work.
We translate movement into meaning.
Step 1 — Capture: Instrument the body. Collect raw movement data across proprioceptive, interoceptive, and kinesthetic channels.
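A minimal sketch of what the capture step could look like in code. The `read_accelerometer()` function and the window size are assumptions standing in for a real sensor driver, not part of the prototype:

```python
from collections import deque

WINDOW = 50  # samples per classification window (assumption, not from the project)

def read_accelerometer():
    # Placeholder for a real sensor driver; returns a fixed (x, y, z) reading in g.
    return (0.01, -0.02, 0.98)

def capture_window(n=WINDOW):
    """Collect n raw movement samples into a fixed-size buffer."""
    buf = deque(maxlen=n)
    for _ in range(n):
        buf.append(read_accelerometer())
    return list(buf)

samples = capture_window()
print(len(samples))  # 50
```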
Step 2 — Classify: Train the model to distinguish regulation states (cognitive engagement, boredom drift, stress, nervous energy, and overload) from noise.
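The classification step could be sketched as a simple variance threshold over a movement window. The state labels come from the list above, but the thresholds and the variance heuristic are illustrative assumptions, not the trained model described in the pitch:

```python
import statistics

# Hypothetical thresholds separating regulation states from sensor noise.
NOISE_FLOOR = 0.005
OVERLOAD = 0.5

def classify(window):
    """Map a window of acceleration magnitudes to a coarse regulation state."""
    var = statistics.pvariance(window)
    if var < NOISE_FLOOR:
        return "noise"           # too still to be a meaningful signal
    if var > OVERLOAD:
        return "overload"        # large, erratic movement
    return "nervous_energy"      # mid-range fidgeting

print(classify([0.0, 0.0, 0.0]))    # noise
print(classify([0.0, 2.0, -2.0]))   # overload
```

A real version would replace the thresholds with a trained classifier over multi-channel features.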
Step 3 — Translate: Convert classified movement into meaning. Close the loop with haptic feedback. The body that generates the signal receives the response.
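Closing the loop could look like the sketch below: a state-to-pattern map drives a haptic actuator. The `buzz()` function and the pattern names are hypothetical stand-ins for real actuator calls:

```python
# Hypothetical mapping from classified state to a haptic response.
FEEDBACK = {
    "overload": "long_pulse",        # cue to pause and reset
    "nervous_energy": "short_pulse",
    "boredom_drift": "double_tap",
}

def buzz(pattern):
    # Placeholder for an actuator driver; just records the pattern here.
    return f"haptic:{pattern}"

def translate(state):
    """Convert a classified movement state into haptic feedback, if any."""
    pattern = FEEDBACK.get(state)
    return buzz(pattern) if pattern else None

print(translate("overload"))  # haptic:long_pulse
print(translate("noise"))     # None
```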
We used Figma Make to build a prototype that measures fidgeting and delivers haptic feedback tuned to the user's goals across different social and performance scenarios.
Built With
- figma