Inspiration
Electromagnetism and vector calculus describe forces that shape the physical world — yet they are taught through flat equations and static diagrams. Students are expected to imagine how invisible fields behave in three dimensions, even though intuition is built through motion, space, and touch.
This project was inspired by a simple question:
What if students could hold invisible forces in their hands?
MIT’s vrmit platform already enables rich 3D visualization of fields in mixed reality. We wanted to take it one step further by connecting those fields to physical interaction — turning abstract physics into something tangible.
What it does
Our system lets users explore electric and magnetic fields in XR by grabbing and manipulating 3D physics objects.
Wearing a Meta Quest 3 headset, users hold and rotate virtual bar magnets and horseshoe magnets with their controllers while seeing electromagnetic vector fields update in real time. As they move and re-orient the objects, the field lines, vectors, and particles respond dynamically in 3D space, building spatial intuition for how forces behave in the real world.
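The math behind the live visualization is the standard point-dipole approximation: a bar magnet's field at a point is determined by its dipole moment and the displacement from the magnet. Our actual implementation lives in vrmit's Godot backend; the sketch below is a hypothetical Python/NumPy version of that formula, just to show what gets re-evaluated at every grid point when the user rotates a magnet.

```python
import numpy as np

MU0_OVER_4PI = 1e-7  # μ0 / 4π in T·m/A

def dipole_field(r, m):
    """Magnetic field of a point dipole with moment m (A·m²) at displacement r (m).

    B(r) = (μ0 / 4π) * (3 (m·r̂) r̂ - m) / |r|³
    """
    r = np.asarray(r, dtype=float)
    m = np.asarray(m, dtype=float)
    dist = np.linalg.norm(r)
    r_hat = r / dist
    return MU0_OVER_4PI * (3.0 * np.dot(m, r_hat) * r_hat - m) / dist**3
```

Rotating the virtual magnet amounts to rotating `m`; every arrow in the field visualization is then just `dipole_field` sampled at that arrow's position.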
Alongside this, we also designed and fabricated physical proxy objects that mirror the virtual ones, laying the groundwork for future fully tangible interaction.
How we built it
We built on MIT’s open-source vrmit backend in Godot, which provides real-time vector field computation and rendering. Our work focused on:
- Creating and integrating 3D magnet objects for XR interaction
- Implementing electric and magnetic field visualizations
- Designing intuitive grab-and-rotate interactions for learning
- Prototyping physical proxy objects and QR-tagged props for future tracking
Our goal was to bridge physical objects and XR through QR-based tracking. While we built and tested this pipeline, full alignment between real and virtual objects proved challenging within the hackathon timeframe.
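The core of the QR-based pipeline is a chain of rigid transforms: the marker's pose is detected in the headset camera's frame, and composing it with the camera's pose in the world gives the physical prop's world pose. A minimal sketch of that composition, assuming 4×4 homogeneous transform matrices (function names here are illustrative, not from vrmit):

```python
import numpy as np

def pose_matrix(rotation, translation):
    """Build a 4x4 rigid transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def marker_world_pose(camera_in_world, marker_in_camera):
    """Chain transforms: marker frame -> camera frame -> world frame."""
    return camera_in_world @ marker_in_camera
```

Getting each individual transform is easy; keeping the composed result stable frame-to-frame, with noisy marker detections and a moving headset, is where the hackathon clock ran out.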
Challenges we faced
The hardest challenge was aligning the physical and virtual worlds in real time. Achieving stable, low-latency tracking of physical objects while keeping the XR visualization smooth pushed the limits of our hardware and software pipeline.
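Concretely, the stability-versus-latency tension is a filtering trade-off: smoothing noisy marker poses reduces jitter but adds lag. A one-line exponential filter, shown below as an illustrative Python sketch (not our Godot code), makes the trade-off explicit through a single parameter:

```python
import numpy as np

def smooth_position(prev, measured, alpha=0.3):
    """Exponential smoothing of a tracked position.

    alpha near 1 trusts new measurements (low latency, more jitter);
    alpha near 0 smooths heavily (stable, but visibly laggy in XR).
    """
    return (1.0 - alpha) * np.asarray(prev, dtype=float) \
           + alpha * np.asarray(measured, dtype=float)
```

No single `alpha` felt right for both slow, deliberate magnet placement and quick re-orientation, which is part of why full alignment stayed out of reach in the hackathon timeframe.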
We also had to balance scientific accuracy with visual clarity — ensuring the fields were both correct and easy to understand for learners.
What we learned
We learned that embodied interaction dramatically improves intuition for abstract physics. Even when interacting through XR controllers, being able to rotate, move, and inspect field-generating objects in 3D made electromagnetic concepts far more intuitive than static diagrams.
We also saw how powerful open XR platforms like vrmit can be as foundations for future tangible learning tools.
What’s next
With more time, we plan to fully bridge physical objects and XR by completing the QR-based tracking pipeline. Beyond that, we want to:
- Improve tracking accuracy and alignment
- Add more physics objects (charges, currents, dipoles)
- Support multi-user classrooms
- Expand into other vector-based subjects like fluid flow, gravity, and electric potential
Our long-term goal is to make XR a standard tool for learning spatial mathematics and physics — not just something you look at, but something you can touch.
Built With
- godot