🚀 Inspiration
With the rapid rise of spatial computing and XR, natural handwriting inside immersive environments is becoming essential. Traditional drawing tools capture strokes, but they don't understand intent.
I wanted to explore how MX Ink could evolve from a simple stylus into an intelligent spatial productivity tool. The vision was to combine natural 3D writing with real-time AI understanding to make interactions inside Meta Quest smarter and more meaningful.
✨ What it does
This project enables users to draw naturally in a Meta Quest 3D environment using MX Ink and instantly generate an AI-powered summary of their handwritten content.
With a single key press, the system analyzes the drawing using the Hugging Face Inference API and returns a meaningful interpretation.
🔑 Key Capabilities
- ✍️ Real-time handwriting capture in 3D space
- 🤖 AI-based drawing interpretation and summarization
- 🎮 Smooth spatial navigation (W/A/S/D controls)
- 🧹 One-click canvas clearing with live status feedback
- ⚡ Fast and responsive XR interaction loop
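The W/A/S/D navigation above can be sketched as a small Unity component. This is a minimal illustration, not the project's actual code: the class name, speed value, and use of Unity's legacy Input axes are all assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of W/A/S/D spatial navigation in Unity.
// Assumes the legacy Input Manager axes ("Horizontal" = A/D, "Vertical" = W/S).
public class SpatialNavigator : MonoBehaviour
{
    [SerializeField] private float moveSpeed = 2f; // meters per second (illustrative)

    void Update()
    {
        float strafe  = Input.GetAxis("Horizontal"); // A/D
        float forward = Input.GetAxis("Vertical");   // W/S

        // Move relative to the rig's own orientation, scaled by frame time.
        Vector3 delta = (transform.forward * forward + transform.right * strafe)
                        * moveSpeed * Time.deltaTime;
        transform.position += delta;
    }
}
```

Attached to the camera rig, this keeps movement frame-rate independent via `Time.deltaTime`.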
🛠️ How we built it
The immersive environment was developed in Unity, with interaction logic written in C#. Development and API integration were handled in VS Code, and the Hugging Face Inference API powers the AI summarization.
🔄 System Workflow
- Capture drawing strokes from MX Ink input
- Render strokes dynamically on the Unity canvas
- Convert the drawing into an image payload
- Send the image to the Hugging Face model
- Display the generated summary in real time
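The capture-to-summary steps above can be sketched as a single Unity coroutine. This is a hedged sketch, not the project's implementation: the class name, render-capture approach, and token field are assumptions, and the model URL is a placeholder since the submission does not name a specific model.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Illustrative sketch of the workflow: capture the canvas as an image,
// POST it to the Hugging Face Inference API, and log the summary.
public class DrawingSummarizer : MonoBehaviour
{
    [SerializeField] private Camera captureCamera; // camera framing the drawing canvas
    [SerializeField] private string apiToken;      // Hugging Face API token (assumption)

    // Placeholder endpoint; substitute a real image-to-text model.
    private const string ModelUrl =
        "https://api-inference.huggingface.co/models/<image-to-text-model>";

    public void OnSummarizeKeyPressed() => StartCoroutine(Summarize());

    private IEnumerator Summarize()
    {
        // Convert the drawing into an image payload by rendering the canvas
        // camera into a texture and encoding it as PNG.
        var rt = new RenderTexture(512, 512, 24);
        captureCamera.targetTexture = rt;
        captureCamera.Render();
        RenderTexture.active = rt;
        var tex = new Texture2D(512, 512, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, 512, 512), 0, 0);
        tex.Apply();
        captureCamera.targetTexture = null;
        RenderTexture.active = null;
        byte[] png = tex.EncodeToPNG();

        // Send the image bytes to the Hugging Face model.
        UnityWebRequest req = UnityWebRequest.Put(ModelUrl, png);
        req.method = UnityWebRequest.kHttpVerbPOST;
        req.SetRequestHeader("Authorization", "Bearer " + apiToken);
        req.SetRequestHeader("Content-Type", "application/octet-stream");
        yield return req.SendWebRequest();

        // Display (here: log) the generated summary.
        Debug.Log(req.result == UnityWebRequest.Result.Success
            ? req.downloadHandler.text
            : req.error);
    }
}
```

In a real scene the `Debug.Log` call would instead update a world-space UI text element so the summary appears next to the drawing.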
⚠️ Challenges we ran into
Building a smooth spatial-AI pipeline came with several challenges:
- Ensuring fluid and natural stroke rendering in 3D
- Converting freehand drawings into AI-ready input
- Managing real-time API latency
- Synchronizing movement and drawing interactions
- Maintaining performance inside the XR environment
Each challenge pushed us to optimize both the interaction pipeline and the AI workflow.
🏆 Accomplishments that we're proud of
- ✅ Successfully integrated MX Ink + Meta Quest + AI
- ✅ Achieved real-time drawing summarization
- ✅ Built a responsive and intuitive XR experience
- ✅ Designed a scalable foundation for spatial AI tools
- ✅ Demonstrated practical productivity use cases
📚 What we learned
This project provided deep hands-on experience in:
- XR interaction design in Unity
- Real-time stylus input handling
- Working with the Hugging Face inference pipeline
- Performance optimization in immersive apps
- Designing AI-assisted spatial workflows
Most importantly, we learned how AI + spatial computing together unlock powerful new interaction paradigms.
🔮 What's next
We see strong potential to evolve this into a full spatial productivity platform. Planned improvements include:
- ✨ Improved handwriting recognition accuracy
- ✋ Gesture-based controls using MX Ink
- 👥 Multi-user collaborative whiteboard
- ⚡ On-device AI for ultra-low latency
- 📝 Integration with note-taking and productivity tools
- 🧠 Context-aware AI understanding of diagrams
MX Ink + AI is just getting started.