Inspiration
The challenge prompt asked us to design a tool that tracks, measures, or visualizes an aspect of human sensory experience that is usually invisible or overlooked, while also allowing users to detect, enhance, or manipulate those sensory inputs for better health and self-understanding.
We began by asking: what sensory signal do humans rely on constantly, but rarely notice until it fails?
The answer was spatial orientation and balance: the complex interaction between our visual system, vestibular system, and body awareness (proprioception). On Earth, these systems quietly work together to help us maintain balance and understand where we are in space.
However, in microgravity environments, these signals begin to conflict. Astronauts often experience space motion sickness, spatial disorientation, and difficulty stabilizing their gaze, especially during early phases of spaceflight.
What makes this challenge especially interesting is that the sensory conflict itself is largely invisible and difficult to measure directly. Astronauts feel disoriented, but the underlying sensory mismatch is hard to visualize or track.
This led us to design AstroBalance, a speculative training tool that visualizes and measures sensory alignment between visual cues, head motion, and spatial orientation.
By turning an intangible sensory experience (orientation stability) into measurable visual feedback, AstroBalance allows users to detect, track, and improve their sensory coordination through interactive training exercises.
In doing so, the system transforms a normally invisible sensory process into actionable insights about how humans adapt to extreme environments like space.
What it does
AstroBalance is an interactive training platform designed to help astronauts improve spatial awareness and sensory coordination in microgravity environments.
The system visualizes the relationship between gaze stability, body orientation, and environmental reference frames, helping astronauts learn to manage conflicting sensory signals.
AstroBalance includes five progressive exercises, each designed to target a specific sensory challenge astronauts face in space:
- Exercise 1, Visual Stabilization Training: Astronauts move their heads against a shifting horizon while keeping their eyes on a fixed target, training gaze stabilization under motion.
- Exercise 2, Frame Alignment: Astronauts align their body orientation with a rotating reference frame, gradually bringing a drifting frame into alignment with a stationary one to improve spatial awareness.
- Exercise 3, Dynamic Orientation Tracking: Astronauts maintain their orientation relative to moving environmental cues.
- Exercise 4, Multi-Axis Motion Coordination: Motion along multiple axes at once challenges the astronaut's ability to interpret conflicting sensory signals.
- Exercise 5, Integrated Training Simulation: A full sensory challenge combining gaze control, frame alignment, and motion tracking.
Through these exercises, AstroBalance tracks and visualizes an otherwise invisible sensory process: how well a person’s visual focus, head movement, and spatial orientation remain aligned.
How we built it
We built AstroBalance as an interactive sensory training simulation that transforms invisible spatial-orientation signals into visual feedback that astronauts can interpret and train with.
Our approach focused on three main components: sensory modeling, visualization design, and real-time feedback systems.
Sensory Modeling Through Interactive Exercises: Each training module was designed to represent a specific sensory conflict astronauts experience in microgravity, where visual, vestibular, and body-orientation signals no longer align. We modeled these conflicts using dynamic visual reference frames:
- Moving horizons simulate how visual orientation cues shift in space.
- Rotating and drifting frames represent unstable spatial reference points.
- Fixed gaze targets simulate tasks that require precise visual focus while the body moves.
By combining these elements, the exercises recreate the challenge of maintaining orientation when traditional gravitational cues disappear.
Astronaut Point-of-View Interface: The interface was designed from an astronaut POV, similar to looking through a helmet visor or spacecraft HUD. The central display acts as the primary training field, where astronauts interact with moving reference frames and targets. Around this field, minimal UI panels provide real-time performance data without overwhelming the user.
Key UI elements include:
- Alignment Accuracy Indicator: measures how closely the user’s body orientation matches the reference frame.
- Gaze Stability Tracker: measures how consistently the user maintains focus on a fixed visual point.
- Orientation Feedback Bar: visualizes deviations between expected and actual spatial alignment.
The UI uses high-contrast shapes, simple geometric indicators, and minimal text to ensure information remains readable even during rapid motion.
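Since the prototype was built in Figma, these indicators are design concepts rather than running code. As a minimal sketch of how a future implementation might score two of them, the function names, units, and thresholds below are all our own illustrative assumptions:

```python
import math

def alignment_accuracy(body_angle_deg: float, frame_angle_deg: float) -> float:
    """Score in [0, 1]: 1.0 when body orientation matches the reference frame."""
    # Smallest signed difference between the two angles, in degrees.
    diff = (frame_angle_deg - body_angle_deg + 180.0) % 360.0 - 180.0
    # Map 0 degrees of deviation to 1.0 and 45+ degrees to 0.0 (linear falloff).
    return max(0.0, 1.0 - abs(diff) / 45.0)

def gaze_stability(gaze_points: list[tuple[float, float]]) -> float:
    """Score in [0, 1]: higher when recent gaze samples cluster tightly."""
    cx = sum(x for x, _ in gaze_points) / len(gaze_points)
    cy = sum(y for _, y in gaze_points) / len(gaze_points)
    # RMS scatter of the gaze samples around their centroid, in screen units.
    rms = math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2
                        for x, y in gaze_points) / len(gaze_points))
    return 1.0 / (1.0 + rms)
```

A real system would feed these from head-tracking and eye-tracking hardware; the 45-degree falloff and the RMS-based stability curve are placeholder choices.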
Real-Time Sensory Feedback: To make the invisible sensory process measurable, we translate user interaction into visual performance metrics. For example:
- When the astronaut aligns their body with the rotating frame, the alignment accuracy score increases.
- When the user maintains gaze on the central dot during head movement, gaze stability improves.
- When orientation drifts away from the reference frame, the UI shows immediate visual feedback.
This continuous feedback loop allows the system to track and visualize the stability of sensory coordination over time.
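A working version of this feedback loop would likely smooth raw per-frame scores so the on-screen indicators read steadily instead of flickering. The following is a hypothetical sketch; the exponential-smoothing factor is an assumption, not part of the Figma prototype:

```python
def smooth(previous: float, sample: float, alpha: float = 0.2) -> float:
    """Exponentially smoothed score: each new sample nudges the running value."""
    return (1.0 - alpha) * previous + alpha * sample

# Simulated per-frame update: raw alignment scores in, smoothed UI value out.
score = 0.0
for raw in [1.0, 1.0, 0.0, 1.0]:  # the 0.0 is one frame of lost alignment
    score = smooth(score, raw)
```

The single dropped frame dips the displayed score only slightly, which is the behavior the immediate-feedback indicators described above would need.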
Progressive Training Architecture: The exercises are structured to gradually increase sensory complexity:
- Gaze stabilization
- Spatial frame alignment
- Dynamic orientation tracking
- Multi-axis motion coordination
- Full integrated sensory challenge
By the final exercise, astronauts must simultaneously stabilize their gaze, align their body orientation, and track moving reference frames, closely simulating the sensory demands of working in microgravity.
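One way a future implementation could encode this progression is as a small curriculum structure that only advances when performance clears a threshold. Everything below (the field names, flags, and the 0.8 threshold) is an illustrative assumption layered on the five exercises described above:

```python
from dataclasses import dataclass

@dataclass
class Exercise:
    name: str
    moving_horizon: bool    # shifting visual orientation cues
    rotating_frame: bool    # unstable spatial reference points
    motion_axes: int        # simultaneous axes of motion

CURRICULUM = [
    Exercise("Visual Stabilization Training",  True,  False, 1),
    Exercise("Frame Alignment",                False, True,  1),
    Exercise("Dynamic Orientation Tracking",   True,  True,  1),
    Exercise("Multi-Axis Motion Coordination", True,  True,  2),
    Exercise("Integrated Training Simulation", True,  True,  3),
]

def next_exercise(current_index: int, accuracy: float,
                  threshold: float = 0.8) -> int:
    """Advance only once the trainee's accuracy clears the threshold."""
    if accuracy >= threshold and current_index + 1 < len(CURRICULUM):
        return current_index + 1
    return current_index
```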
Through this architecture, AstroBalance demonstrates how an invisible human sense (spatial orientation stability) can be visualized, measured, and trained through interactive feedback.
Challenges we ran into
One of the biggest challenges was translating an invisible sensory experience into something visual and measurable.
Balance and spatial awareness are normally internal sensations, so we had to design visual metaphors (frames, horizons, and alignment indicators) that represent how well different sensory systems are working together.
Another challenge was maintaining realism while keeping the interface readable. If the environment became too chaotic, the simulation lost clarity. If it was too simple, it no longer represented the sensory conflicts astronauts experience in space.
Balancing these two aspects required careful design of motion, feedback systems, and visual hierarchy.
Accomplishments that we're proud of
We’re proud that AstroBalance:
- Successfully visualizes an invisible sensory process: spatial orientation stability
- Designs five progressive training exercises that simulate sensory conflict
- Creates a clean astronaut-style interface focused on clarity and usability
- Demonstrates how sensory data could be translated into actionable feedback for training
Most importantly, we built a concept that shows how digital sensory training tools could complement existing astronaut preparation methods.
What we learned
Through this project, we learned how complex and interconnected the human sensory systems for balance and orientation truly are.
We explored how the visual system, vestibular system, and proprioception interact, and how removing gravity disrupts those relationships.
We also learned how important interface design is when presenting invisible data: turning internal sensory signals into something users can quickly interpret requires thoughtful visual communication.
What's next for AstroBalance
VR and AR Integration: Transform AstroBalance into a fully immersive virtual reality training environment, allowing astronauts to experience more realistic microgravity disorientation scenarios while interacting with 3D spatial cues.
Adaptive Training Algorithms: Implement AI-driven difficulty adjustment that adapts exercises in real time based on user performance, gradually increasing sensory complexity as the astronaut improves.
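As a first step toward that adaptive behavior, even a simple staircase rule could nudge a continuous difficulty parameter toward the level where the trainee hovers near a target accuracy. This sketch is purely hypothetical; the target accuracy and gain are assumed values, not results from the prototype:

```python
def adapt_difficulty(difficulty: float, accuracy: float,
                     target: float = 0.75, gain: float = 0.1) -> float:
    """Staircase-style update: raise difficulty when the trainee beats the
    target accuracy, ease off when they fall below it."""
    adjusted = difficulty + gain * (accuracy - target)
    return min(1.0, max(0.0, adjusted))  # clamp to the [0, 1] range
```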
Applications Beyond Space: Extend AstroBalance to support pilot training, rehabilitation for balance disorders, and vestibular therapy, where similar sensory coordination challenges occur.
Research on Human Sensory Adaptation: Use AstroBalance as a research tool to study how humans adapt to sensory conflict environments, potentially contributing to research in human factors and space medicine.
Built With
- figma
- figma-design
- figma-make
