🚀 Inspiration

With the rapid rise of spatial computing and XR, natural handwriting inside immersive environments is becoming essential. Traditional drawing tools capture strokes, but they don't understand intent.

I wanted to explore how MX Ink could evolve from a simple stylus into an intelligent spatial productivity tool. The vision was to combine natural 3D writing with real-time AI understanding to make interactions inside Meta Quest smarter and more meaningful.


✨ What it does

This project enables users to draw naturally in a Meta Quest 3D environment using MX Ink and instantly generate an AI-powered summary of their handwritten content.

With a single key press, the system analyzes the drawing using the Hugging Face Inference API and returns a meaningful interpretation.

🔑 Key Capabilities

  • โœ๏ธ Real-time handwriting capture in 3D space
  • ๐Ÿค– AI-based drawing interpretation and summarization
  • ๐ŸŽฎ Smooth spatial navigation (W/A/S/D controls)
  • ๐Ÿงน One-click canvas clearing with live status feedback
  • โšก Fast and responsive XR interaction loop

🛠 How we built it

The immersive environment was developed in Unity, with interaction logic written in C#. Development and API integration were handled in VS Code, and the Hugging Face Inference API powers the AI summarization.

🔄 System Workflow

  1. Capture drawing strokes from MX Ink input
  2. Render strokes dynamically on the Unity canvas
  3. Convert the drawing into an image payload
  4. Send the image to the Hugging Face model
  5. Display the generated summary in real time
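Steps 3–5 above hinge on handing the rendered canvas to the Hugging Face Inference API, which accepts a raw-bytes POST to `https://api-inference.huggingface.co/models/{model}` with a Bearer token for image-to-text models. A minimal request builder in plain C# might look like this; the `HfClient` class, the model name, and the token are illustrative placeholders, not the exact code used in this project:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;

static class HfClient
{
    // Sketch of step 4: wrap the PNG bytes exported from the Unity canvas
    // in a raw-bytes POST that the Hugging Face Inference API expects.
    public static HttpRequestMessage BuildSummaryRequest(
        byte[] pngBytes, string model, string apiToken)
    {
        var request = new HttpRequestMessage(
            HttpMethod.Post,
            $"https://api-inference.huggingface.co/models/{model}");
        request.Headers.Authorization =
            new AuthenticationHeaderValue("Bearer", apiToken);
        request.Content = new ByteArrayContent(pngBytes);
        request.Content.Headers.ContentType =
            new MediaTypeHeaderValue("application/octet-stream");
        return request;
    }
}
```

Sending it is then a single `await new HttpClient().SendAsync(request)`; the JSON response carries the generated caption that gets displayed in step 5.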

โš ๏ธ Challenges we ran into

Building a smooth spatial-AI pipeline came with several challenges:

  • Ensuring fluid and natural stroke rendering in 3D
  • Converting freehand drawings into AI-ready input
  • Managing real-time API latency
  • Synchronizing movement and drawing interactions
  • Maintaining performance inside the XR environment

Each challenge pushed us to optimize both the interaction pipeline and the AI workflow.
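One common technique for keeping stroke rendering fluid is thinning noisy stylus samples before they reach the renderer, dropping points that sit closer than a minimum step to the last kept point. The sketch below shows the idea in plain C# (using `System.Numerics` so it runs outside Unity); the `StrokeFilter` name and threshold are illustrative, not the project's actual implementation:

```csharp
using System.Collections.Generic;
using System.Numerics;

static class StrokeFilter
{
    // Keep only samples at least minStep apart from the previously kept
    // point. This thins jittery stylus input so a line renderer draws a
    // smooth stroke instead of a dense, flickering cluster of vertices.
    public static List<Vector3> Decimate(IEnumerable<Vector3> samples, float minStep)
    {
        var kept = new List<Vector3>();
        foreach (var p in samples)
        {
            if (kept.Count == 0 ||
                Vector3.Distance(kept[kept.Count - 1], p) >= minStep)
            {
                kept.Add(p);
            }
        }
        return kept;
    }
}
```

A side benefit is that fewer points per stroke also shrinks the image payload sent to the AI model, which helps with the API-latency challenge above.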


๐Ÿ† Accomplishments that we're proud of

  • ✅ Successfully integrated MX Ink + Meta Quest + AI
  • ✅ Achieved real-time drawing summarization
  • ✅ Built a responsive and intuitive XR experience
  • ✅ Designed a scalable foundation for spatial AI tools
  • ✅ Demonstrated practical productivity use cases

📚 What we learned

This project provided deep hands-on experience in:

  • XR interaction design in Unity
  • Real-time stylus input handling
  • Working with the Hugging Face inference pipeline
  • Performance optimization in immersive apps
  • Designing AI-assisted spatial workflows

Most importantly, we learned how AI + spatial computing together unlock powerful new interaction paradigms.


🔮 What's next

We see strong potential to evolve this into a full spatial productivity platform. Planned improvements include:

  • ✨ Improved handwriting recognition accuracy
  • 🖐 Gesture-based controls using MX Ink
  • 👥 Multi-user collaborative whiteboard
  • ⚡ On-device AI for ultra-low latency
  • 📒 Integration with note-taking and productivity tools
  • 🧠 Context-aware AI understanding of diagrams

MX Ink + AI is just getting started.

Built With

  • c#
  • hugging-face-inference-api
  • interaction
  • meta-quest
  • mx-ink
  • toolkit
  • unity
  • vs-code
  • xr