InkSight — Touch Reality. Reveal Insight.

Inspiration

Modern Mixed Reality has unlocked the ability to see and interact with digital objects in our physical environment. However, one fundamental limitation still exists: precision interaction. Most MR interactions rely on hand tracking or controllers, which are excellent for general input but lack the precision required for technical inspection, engineering analysis, and detailed spatial interaction.

When Logitech introduced the MX Ink stylus, it represented something different — not just an input device, but a precision instrument for Mixed Reality.

This inspired us to ask a powerful question:

What if we could inspect and understand the physical world the same way developers inspect code?

In software debugging, we probe variables, trace execution, and reveal hidden states. But in the physical world, we cannot see forces, stress, airflow, or structural relationships directly.

InkSight was born from the idea of turning reality itself into an inspectable system.


What We Built

InkSight is a Mixed Reality engineering inspection and visualization platform powered by the Logitech MX Ink stylus and Meta Quest.

It allows users to probe, analyze, and annotate physical or virtual objects with precision.

Using the stylus, users can:

  • Select and isolate machine components
  • Annotate directly in 3D space
  • Trigger exploded views of complex assemblies
  • Visualize hidden structural relationships
  • Interact with objects at sub-centimeter precision

The stylus transforms from a simple input device into a precision probe for reality.


What We Learned

Building InkSight taught us several key technical and design principles:

1. Precision changes interaction paradigms

Traditional MR input uses gestures mapped to raycasts or volumetric selection. However, the stylus allows direct point-based interaction defined mathematically as:

[ P_{interaction} = O_{stylus} + t \cdot D_{stylus} ]

Where:

  • ( O_{stylus} ) is the stylus tip origin
  • ( D_{stylus} ) is the stylus direction vector
  • ( t ) is the intersection distance

This enables highly accurate spatial selection, both at the physical tip (t = 0) and at a distance along the ray.
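As a rough illustration, this parametric form maps directly onto vector arithmetic in Unity C#. The sketch below is ours, and all names in it are illustrative rather than taken from any published API:

```csharp
using UnityEngine;

// Minimal sketch of P_interaction = O_stylus + t * D_stylus.
// All names here are illustrative, not InkSight's actual types.
public static class StylusMath
{
    // Evaluates the interaction point at distance t along the stylus ray.
    public static Vector3 InteractionPoint(Vector3 stylusOrigin,
                                           Vector3 stylusDirection,
                                           float t)
    {
        return stylusOrigin + t * stylusDirection.normalized;
    }
}
```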


2. Spatial computing benefits from tool-based interaction

Hands are expressive, but tools are precise.

The stylus acts as an extension of intent, allowing:

  • Reduced interaction ambiguity
  • Faster selection time
  • Increased spatial accuracy

This significantly improves user efficiency.


3. Mixed Reality is most powerful when augmenting understanding, not replacing reality

Instead of creating fully virtual environments, InkSight enhances the user's perception of existing objects.

This aligns with the philosophy:

[ Augmented\ Understanding = Reality + Context + Interaction ]


How We Built It

Hardware

  • Meta Quest headset with Passthrough
  • Logitech MX Ink stylus

Software Stack

  • Unity Engine
  • Meta XR SDK
  • OpenXR framework
  • Logitech MX Ink input integration

Core System Architecture

InkSight uses a modular interaction pipeline; a minimal wiring sketch follows the layer breakdown below:

1. Stylus Input Layer

  • Captures stylus pose
  • Tracks position and orientation

2. Interaction Layer

  • Raycasting system
  • Collision detection
  • Object selection logic

3. Visualization Layer

  • Highlights components
  • Displays overlays
  • Renders annotations
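
To make the layering concrete, here is a minimal wiring sketch in Unity C#. Every interface and class name below is an assumption for illustration, not InkSight's actual API:

```csharp
using UnityEngine;

// Illustrative layer contracts (names are assumptions, not InkSight's API).
public interface IStylusInput   { Pose GetStylusPose(); }          // 1. input layer
public interface IInteraction   { GameObject Pick(Pose stylus); }  // 2. interaction layer
public interface IVisualization { void Highlight(GameObject go); } // 3. visualization layer

// Drives one frame of the pipeline: pose -> selection -> feedback.
public class InspectionPipeline : MonoBehaviour
{
    // MonoBehaviour fields because Unity cannot serialize interface references.
    public MonoBehaviour inputSource; // must implement IStylusInput
    public MonoBehaviour interactor;  // must implement IInteraction
    public MonoBehaviour visualizer;  // must implement IVisualization

    void Update()
    {
        Pose pose = ((IStylusInput)inputSource).GetStylusPose();
        GameObject target = ((IInteraction)interactor).Pick(pose);
        if (target != null)
            ((IVisualization)visualizer).Highlight(target);
    }
}
```

Keeping the layers behind small interfaces like this makes input, selection, and feedback independently replaceable.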

Stylus Ray Interaction

We implemented ray-based interaction using Unity’s physics system:

[ Ray(t) = Origin + t \cdot Direction ]

When the stylus ray intersects an object:

  • The object is highlighted
  • Interaction events are triggered
  • Visualization overlays activate
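
A condensed sketch of this flow, assuming a hypothetical `Highlightable` component that stands in for our overlay and event logic:

```csharp
using UnityEngine;

// Hypothetical stand-in: in the real system this would drive outlines,
// overlays, and interaction events.
public class Highlightable : MonoBehaviour
{
    public void SetHighlighted(bool on) { /* toggle outline / overlay here */ }
}

// Sketch of the selection flow: raycast from the stylus, highlight on hit.
public class StylusSelector : MonoBehaviour
{
    public Transform stylusTip;    // tracked MX Ink tip transform (assumed)
    public float maxDistance = 2f; // assumed maximum selection range in meters

    private Highlightable current;

    void Update()
    {
        Highlightable next = null;
        if (Physics.Raycast(stylusTip.position, stylusTip.forward,
                            out RaycastHit hit, maxDistance))
        {
            next = hit.collider.GetComponentInParent<Highlightable>();
        }

        if (next != current)
        {
            if (current != null) current.SetHighlighted(false);
            if (next != null)    next.SetHighlighted(true);
            current = next;
        }
    }
}
```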

Exploded View System

We implemented component separation using positional offsets:

[ P_{new} = P_{original} + Offset \cdot ExplosionFactor ]

This allows dynamic inspection of internal components.
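
A minimal sketch of this offsetting, assuming each part is a direct child of the assembly and is pushed outward from the assembly's local origin (names and the spread distance are illustrative):

```csharp
using UnityEngine;

// Sketch of P_new = P_original + Offset * ExplosionFactor for each part.
public class ExplodedView : MonoBehaviour
{
    [Range(0f, 1f)] public float explosionFactor = 0f; // 0 = assembled, 1 = fully exploded
    public float maxSpread = 0.3f;                     // assumed maximum offset in meters

    private Vector3[] restPositions; // P_original for each part
    private Vector3[] offsets;       // Offset direction scaled by maxSpread

    void Start()
    {
        int n = transform.childCount;
        restPositions = new Vector3[n];
        offsets = new Vector3[n];

        for (int i = 0; i < n; i++)
        {
            Transform part = transform.GetChild(i);
            restPositions[i] = part.localPosition;
            // Push each part outward from the assembly's local origin.
            offsets[i] = part.localPosition.normalized * maxSpread;
        }
    }

    void Update()
    {
        for (int i = 0; i < restPositions.Length; i++)
        {
            transform.GetChild(i).localPosition =
                restPositions[i] + explosionFactor * offsets[i];
        }
    }
}
```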


Challenges We Faced

1. Precision tracking alignment

Ensuring the stylus tip aligned correctly with virtual raycasts required careful calibration between:

  • Stylus transform
  • Camera transform
  • World coordinate system

Even small offsets created interaction errors.

We solved this using local coordinate correction and transform normalization, as sketched below.
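
As an example of the kind of correction involved, a raw tracked pose can be normalized into a world-space tip pose by applying a fixed local offset. The offset values below are placeholders, not measured MX Ink calibration data:

```csharp
using UnityEngine;

// Sketch: apply a local calibration offset so the virtual ray starts at the
// physical tip. tipOffset / tipRotation are placeholder values, not real
// MX Ink calibration constants.
public class StylusCalibration : MonoBehaviour
{
    public Transform trackedStylus; // raw pose from the tracking layer
    public Vector3 tipOffset = new Vector3(0f, 0f, 0.12f); // assumed tip offset (m)
    public Quaternion tipRotation = Quaternion.identity;   // assumed axis correction

    // World-space pose of the corrected stylus tip.
    public Pose GetTipPose()
    {
        Vector3 position = trackedStylus.TransformPoint(tipOffset);
        Quaternion rotation = trackedStylus.rotation * tipRotation;
        return new Pose(position, rotation);
    }
}
```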


2. Interaction stability

Hand and stylus micro-movements caused jitter in ray interaction.

We implemented smoothing using interpolation:

[ P_{smooth} = lerp(P_{previous}, P_{current}, \alpha) ]

This improved stability and usability.
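
In Unity this maps directly onto `Vector3.Lerp`. A minimal sketch, where the smoothing constant alpha is a tuning value (lower values smooth more but add latency):

```csharp
using UnityEngine;

// Sketch of P_smooth = lerp(P_previous, P_current, alpha).
public class StylusSmoother : MonoBehaviour
{
    [Range(0f, 1f)] public float alpha = 0.25f; // tuning value: lower = smoother, laggier

    private Vector3 smoothed;
    private bool initialized;

    public Vector3 Smooth(Vector3 current)
    {
        if (!initialized)
        {
            smoothed = current; // seed with the first sample to avoid a jump
            initialized = true;
        }
        smoothed = Vector3.Lerp(smoothed, current, alpha);
        return smoothed;
    }
}
```

A frame-rate-independent variant would scale alpha by Time.deltaTime, which matters if the smoothing runs in Update rather than at a fixed rate.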


3. Designing intuitive spatial interaction

We needed to ensure interactions felt natural and responsive without overwhelming the user.

We focused on:

  • Clear visual feedback
  • Immediate response to input
  • Minimal cognitive load

Impact and Future Potential

InkSight demonstrates how precision tools like MX Ink can fundamentally change how we interact with Mixed Reality.

This has applications in:

  • Engineering inspection
  • Technical education
  • Industrial maintenance
  • Medical visualization
  • Scientific research

InkSight represents a step toward a future where humans can directly inspect, analyze, and understand the physical world through spatial computing.


Conclusion

InkSight transforms the Logitech MX Ink stylus into a precision instrument for Mixed Reality inspection.

It bridges the gap between physical and digital understanding, enabling users to probe reality itself.

Touch reality. Reveal insight.
