SyncMove
Extra-Sensory Perception Through Movement Visualization
A design proposal for Figbuild 2026: "Extra-Sensory Perception"
The Challenge
By some estimates, humans possess around 34 distinct senses, not just the basic five. Yet we remain largely unaware of the hidden senses we depend on daily: proprioception (body position awareness), the vestibular system (balance), and the sense of direction.
Current fitness and wellness apps reduce movement to quantifiable metrics: steps, calories, distance. But this approach strips away the rich, intangible sensory experience of movement itself.
Our mission is to visualize and amplify the invisible: to transform the unmeasurable aspects of human motion into tangible, creative feedback that lets users feel their own proprioceptive and spatial awareness.
SyncMove is the answer: a spatial audio application that translates your body's movements into real-time generative music, turning proprioception into an audible creative instrument.
Vision
SyncMove will be a generative music application that creates a real-time, evolving soundtrack dictated entirely by your physical movement.
Planned Features
- Step-Sync Beat: Every footstep will trigger a percussive beat.
- Spatial Melody: The physical orientation of the phone (tilt, roll, and heading) will shape the melody, pitch, and synth effects.
- Interactive Canvas: While moving, the UI will act as a dynamic instrument. Users can tap the screen (inspired by Patatap and Mikutap) to layer localized sounds, vocal chops, or synth stabs over their generated beat.
How SyncMove Works
1. Design Execution and UX
Our visual prototype demonstrates a high-contrast, dark-mode interface with vibrant, tactile elements designed for peripheral vision while walking. Every interaction, from tap sounds to the beat visualization, maps directly to physical movement data, creating an intuitive connection between body and interface.
2. Our Goal
Every design decision serves the core mission: amplifying hidden senses. The color scheme evokes energy and movement. The spatial layout mirrors the 3D orientation being tracked. Even the "Select your vibe" feature was designed to let users choose how intimately they engage with their proprioceptive feedback.

SyncMove tells a compelling human story: it transforms a mundane daily walk into a mindful, creative experience. Rather than measuring movement quantitatively, it celebrates movement qualitatively, making users conscious of the proprioceptive and vestibular signals their bodies are constantly generating.

The innovation lies in the sensor-fusion methodology: mapping 3D acceleration, gyroscopic tilt, and magnetic heading to MIDI parameters in a way that feels musical rather than chaotic. The creative leap is positioning the human body not as a data generator but as a live instrument: a fundamentally different framing of human movement.
Technical Approach: From Invisible to Audible
Our strategy leverages smartphone inertial sensors to detect and visualize proprioceptive feedback in real time. By combining these sensors through custom sensor fusion algorithms, we translate body movement into MIDI data that the music engine interprets creatively.
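SyncMove is still a visual prototype, so no engine code exists yet. The TypeScript sketch below shows one way this fusion could work: a complementary filter that blends gyroscope rates with accelerometer tilt estimates and scales the fused angle to a MIDI control value. Every name and constant here is illustrative, not part of the prototype.

```typescript
// Sketch: complementary-filter sensor fusion (hypothetical, not the shipped engine).
// Blends fast-but-drifting gyroscope rates with noisy-but-stable accelerometer
// tilt estimates, then scales the fused angle to a 0-127 MIDI CC value.

interface ImuSample {
  accel: { x: number; y: number; z: number }; // m/s^2, includes gravity
  gyroRate: number;                           // deg/s around one axis
  dt: number;                                 // seconds since last sample
}

const ALPHA = 0.98; // trust the gyro short-term, the accelerometer long-term

let fusedTiltDeg = 0;

function fuseTilt(s: ImuSample): number {
  // Tilt implied by where gravity points in the accelerometer frame.
  const accelTiltDeg =
    Math.atan2(s.accel.x, Math.hypot(s.accel.y, s.accel.z)) * (180 / Math.PI);
  // Integrate the gyro, then nudge the result toward the accelerometer estimate.
  fusedTiltDeg =
    ALPHA * (fusedTiltDeg + s.gyroRate * s.dt) + (1 - ALPHA) * accelTiltDeg;
  return fusedTiltDeg;
}

function tiltToMidiCC(tiltDeg: number): number {
  // Map +/-90 degrees of tilt onto the 0-127 MIDI controller range.
  const clamped = Math.max(-90, Math.min(90, tiltDeg));
  return Math.round(((clamped + 90) / 180) * 127);
}
```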
Accelerometer: Detecting Impact (Proprioceptive Awareness)
The accelerometer detects the micro-jerk of each footstep, translating the body's proprioceptive feedback of foot impact into drum beats. By isolating dynamic acceleration from gravitational bias, we can trigger beats in perfect sync with the user's natural stride.
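As a rough illustration of that isolation step, here is a hedged TypeScript sketch: a slow low-pass filter tracks the gravity baseline, the fast residual above a tuned threshold counts as a footstep, and a refractory window prevents double-triggers. The thresholds are placeholder guesses, not measured values.

```typescript
// Sketch: step detection from accelerometer magnitude (illustrative values).
// A slow low-pass filter estimates the gravity bias; the fast residual above
// a threshold is treated as the impact spike of a footstep.

const GRAVITY_SMOOTHING = 0.9;    // low-pass coefficient for the gravity estimate
const IMPACT_THRESHOLD = 2.5;     // m/s^2 of dynamic acceleration, tuned by testing
const MIN_STEP_INTERVAL_MS = 250; // ignore re-triggers faster than ~4 steps/s

let gravityEstimate = 9.81;
let lastStepAt = 0;

function onAccelSample(x: number, y: number, z: number, nowMs: number,
                       triggerBeat: () => void): void {
  const magnitude = Math.hypot(x, y, z);
  // Track the slow-moving gravity component...
  gravityEstimate =
    GRAVITY_SMOOTHING * gravityEstimate + (1 - GRAVITY_SMOOTHING) * magnitude;
  // ...so the fast residual is the footstep impact.
  const dynamic = magnitude - gravityEstimate;
  if (dynamic > IMPACT_THRESHOLD && nowMs - lastStepAt > MIN_STEP_INTERVAL_MS) {
    lastStepAt = nowMs;
    triggerBeat(); // fire a percussive hit in the audio engine
  }
}
```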
Gyroscope: Tracking Rotation (Postural Awareness)
Gyroscopic data reveals how the user's body is oriented in space. Horizontal roll maps to frequency filters (low-pass/high-pass), while vertical pitch controls spatial effects (reverb, delay). This makes postural awareness audible through evolving timbre.
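A minimal sketch of what this mapping might look like in a browser build using the Web Audio API; the node graph, parameter ranges, and exponential curve are our assumptions, not a finished design.

```typescript
// Sketch: mapping device orientation to timbre with the Web Audio API.
// Roll sweeps a low-pass filter cutoff; pitch (forward/back tilt) sets a
// wet/dry reverb balance. A synth or sample source would connect into `filter`.

const ctx = new AudioContext();
const filter = new BiquadFilterNode(ctx, { type: "lowpass" });
const dryGain = new GainNode(ctx);
const wetGain = new GainNode(ctx);
const reverb = new ConvolverNode(ctx); // impulse response loaded elsewhere

filter.connect(dryGain).connect(ctx.destination);
filter.connect(reverb).connect(wetGain).connect(ctx.destination);

function onOrientation(rollDeg: number, pitchDeg: number): void {
  // Exponential sweep from 200 Hz to 8 kHz feels more musical than linear.
  const t = (Math.max(-90, Math.min(90, rollDeg)) + 90) / 180; // 0..1
  filter.frequency.setTargetAtTime(200 * Math.pow(40, t), ctx.currentTime, 0.05);
  // Tilting the phone up fades the reverb in.
  const wet = (Math.max(-90, Math.min(90, pitchDeg)) + 90) / 180;
  wetGain.gain.setTargetAtTime(wet, ctx.currentTime, 0.05);
  dryGain.gain.setTargetAtTime(1 - wet, ctx.currentTime, 0.05);
}
```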
Magnetometer: Directional Heading (Sense of Direction)
Compass heading reveals which direction the user is walking. Each new direction triggers a shift in the underlying chord progression, making directional awareness part of the musical narrative.
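One simple way to realize this, sketched below with hypothetical chord voicings: quantize the 360-degree heading into four sectors and change chords only when the walker crosses into a new one, so turning a corner literally changes the harmony.

```typescript
// Sketch: quantizing compass heading into chord changes (hypothetical mapping).
// The heading is split into four sectors centered on the cardinal directions;
// entering a new sector advances the underlying harmony.

const CHORDS: number[][] = [
  [60, 64, 67], // C major  (heading ~north)
  [57, 60, 64], // A minor  (~east)
  [65, 69, 72], // F major  (~south)
  [67, 71, 74], // G major  (~west)
];

let currentSector = -1;

function onHeading(headingDeg: number,
                   playChord: (notes: number[]) => void): void {
  // Offset by 45 degrees so each sector is centered on a cardinal direction.
  const sector = Math.floor(((headingDeg + 45) % 360) / 90) % 4;
  if (sector !== currentSector) {
    currentSector = sector;
    playChord(CHORDS[sector]); // MIDI note numbers for the new chord
  }
}
```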
Design Challenges: Making the Invisible Visible
The fundamental challenge was conceptualizing how to visualize proprioception, something completely intangible and invisible. How do you show body position awareness? How do you translate the unmeasurable into meaningful creative feedback?
Visual Representation
We needed to create visual metaphors that make abstract 3D spatial data intuitive. Our solution: particle effects, color shifts, and frequency visualizers that correlate directly to the specific type of movement being detected (impact, rotation, heading).
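To make that correlation concrete, here is an illustrative TypeScript sketch in which each movement type drives its own visual channel; the mappings and constants are placeholders, not the prototype's actual values.

```typescript
// Sketch: routing each movement type to its own visual channel (assumed design).
// Impacts spawn particle bursts, rotation shifts the canvas hue, and heading
// tints the frequency visualizer.

interface VisualState {
  particleBurst: number;  // 0..1 burst intensity, decays each frame
  hueDeg: number;         // 0..360 canvas hue
  spectrumHueDeg: number; // base color of the frequency visualizer
}

function updateVisuals(state: VisualState,
                       impact: number,  // dynamic acceleration, m/s^2
                       rollDeg: number,
                       headingDeg: number): VisualState {
  return {
    // Footstep impact kicks the particle system, then fades.
    particleBurst: Math.min(1, state.particleBurst * 0.9 + impact / 10),
    // Rotation sweeps the hue so posture is readable in peripheral vision.
    hueDeg: (rollDeg + 90) * 2,       // +/-90 deg roll -> 0..360 hue
    spectrumHueDeg: headingDeg % 360, // compass direction tints the spectrum
  };
}
```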
Avoiding Cognitive Overload
Users must be able to feel their movements instantly while remaining focused on their physical journey. We designed a minimal interface with high-contrast, vibrant feedback on a dark canvas: information that can be perceived peripherally without demanding visual attention.
Accomplishments in the Visual Prototype
We successfully created a visual prototype that makes proprioception perceivable, transforming an invisible, unmeasurable sense into a tangible, creative experience.
What We Learned
Through designing the visual prototype, we've validated a powerful insight: making the invisible visible creates immediate emotional engagement.
When users see their body's movements instantly transformed into music, they become hyperaware of their proprioceptive feedback, something they usually ignore. This awareness transforms mundane movement into a creative act, fundamentally changing how people perceive their own embodied experience.
We've also learned that intentionality in design is crucial when working with intangible concepts. Every color, every animation, every interaction had to earn its place by reinforcing the core mission. Nothing could be arbitrary.
Moving forward, the technical challenges of sensor fusion and digital signal processing will require deep expertise, but the conceptual foundation has been validated: technology can amplify our hidden senses and turn the unmeasurable into something tangible and meaningful.
What's Next
Our immediate next steps are to:
- Develop the Audio Engine: Partner with sound designers to create the generative music algorithms that respond intuitively to movement data.
- Validate Proprioceptive Feedback: User test with real walkers to ensure the auditory feedback genuinely heightens proprioceptive awareness.
Future Directions
- Smartwatch Integration: Free users' hands by moving sensor tracking to a wearable device.
- Multiplayer / Social: Allow users in different locations to sync their movements and co-create music in real-time.
- Expanded Sensory Modes: Explore other hidden senses (vestibular balance, directional awareness) and create soundscapes tailored to each.
Built With
- canva
- claude
- figma
