Inspiration

The inspiration came from a project I worked on, Memories, at the Creating Realities Hackathon at USC, Los Angeles, in March 2018. Memories won a prize for Best Content for its use of photogrammetry to document a series of memories in a mixed-reality photo album; that project can be seen at https://devpost.com/software/creating-realities-memories. The "Mira Me" project also combines my love of space exploration with my commitment to STEAM initiatives. I have permission from the author of Epic Space Adventure to use his book as creative inspiration for a story that helps kids learn about our galaxy. Plus, the kids get to "be in" the story, fueling their enthusiasm.

What it does

The story will be 100% projected from the Magic Leap headset onto a blank storybook. The reader still has the visceral experience of reading a physical book, but the story 'magically' appears on the pages as they are turned. I imagine a parent and child reading together, both in headsets, or perhaps the parent following along on a screen or phone display; that level of detail is TBD.

How We built it

We built the scene animations in Unity and sourced the character and its animations from Mixamo. The character's face was modeled with FaceGen from three photos: a front view and two profiles. FaceGen is a photogrammetry application for Windows 7, 8, or 10, written by Todd Colletti.

FaceGen allows you to create 3D models of human faces from 1-3 photographs (passport style, and optionally, left & right profiles). FaceGen incorporates a statistically derived database created from the geometry of more than 150 prototypical human faces. Using variance analysis, the application selects a model that best matches the calculated geometry derived from key features referenced from each supplied photo.
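The statistical fitting described above can be illustrated in miniature. The following is a hedged toy sketch, not FaceGen's actual algorithm: a face is encoded as a mean shape plus a weighted sum of orthonormal principal components derived from a face database, and the best-fit weights for an observed set of landmark coordinates are found by projecting the observation onto each component. All names and numbers here are hypothetical.

```python
# Toy illustration of statistical face-model fitting (not FaceGen's code).
# A face is a flat vector of landmark coordinates: mean shape + weighted
# principal components. With orthonormal components, the least-squares
# weights are simple dot-product projections of the observation.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fit_face(observed, mean, components):
    """Project (observed - mean) onto each orthonormal component."""
    residual = [o - m for o, m in zip(observed, mean)]
    return [dot(c, residual) for c in components]

def reconstruct(mean, components, weights):
    """Rebuild a face vector from the fitted weights."""
    face = list(mean)
    for w, comp in zip(weights, components):
        face = [f + w * c for f, c in zip(face, comp)]
    return face

# Hypothetical 4-landmark model with two orthonormal components.
mean = [1.0, 2.0, 3.0, 4.0]
components = [
    [1.0, 0.0, 0.0, 0.0],   # e.g. a "face width" direction
    [0.0, 1.0, 0.0, 0.0],   # e.g. a "jaw length" direction
]
observed = [1.5, 1.75, 3.0, 4.0]   # landmarks measured from the photos

weights = fit_face(observed, mean, components)
print(weights)                                 # [0.5, -0.25]
print(reconstruct(mean, components, weights))  # recovers the observation
```

In a real system the landmark vectors have thousands of coordinates and the component basis comes from a database such as the 150-face set FaceGen uses, but the projection idea is the same.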

The book, the physical surface onto which the holographic scenes are projected, was made by hand.

Challenges We ran into

Creating a book-interface metaphor that works with a real book on a Magic Leap One device with Unity is a complex problem. It requires a combination of technical solutions, some of which depend on additional support within the Magic Leap Lumin OS SDK and Unity.

  • The Magic Leap One is designed to work optimally with mixed-reality objects more than five feet from the wearer; the device has difficulty maintaining ‘state’ when control objects are closer than that. Because our use case requires the control object, the book, to be at arm's length, this is challenging. However, successful arm's-length experiences do exist, such as Moon Bloom.
  • The Magic Leap device running Lumin OS with Unity is limited to ‘static’ control surfaces, which must remain stationary relative to the headset in order to maintain the mixed-reality illusion. The simple act of turning a page violates this ‘static’ requirement.
    ◦ Recognizing this limitation, a team of platform engineers within Magic Leap has developed a set of libraries (GLDS0) that extend the Lumin OS and provide real-time adaptive mesh data to Unity, so that simultaneous changes in headset orientation and control-surface orientation can be accommodated at runtime.
    ◦ The GLDS0 libraries are experimental and have not been commercialized, so integrating them into an application requires significant engineering, much more than one engineer can accomplish in a 48-hour period.
    ◦ The GLDS0 libraries are also extensive, adding considerable complexity to an already complex set of engineering problems. Compiling them into an application template without content takes four times longer than a basic Lumin OS + Unity build, and that compile time grows exponentially as models, animations, audio, and interactive control elements are added. This made it problematic to iterate and resolve bottlenecks within a time-bound hackathon.
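At its core, the 'static control surface' limitation is a transform-bookkeeping problem: holographic content authored relative to the book must be re-composed with the book's current tracked pose on every frame, or it drifts the moment a page moves. A minimal 2D sketch of that idea follows; the pose representation and function names are hypothetical, and real Magic Leap/Unity tracking is far more involved.

```python
import math

# Minimal 2D pose math: a pose is (rotation angle in radians, tx, ty).
# Content authored in the book's local frame stays "glued" to the book
# only if its world position is recomputed from the book's *current*
# tracked pose each frame -- the crux of a non-static control surface.

def apply_pose(pose, point):
    """Transform a point from the surface's local frame into the world frame."""
    angle, tx, ty = pose
    x, y = point
    wx = math.cos(angle) * x - math.sin(angle) * y + tx
    wy = math.sin(angle) * x + math.cos(angle) * y + ty
    return (wx, wy)

# A scene element sits 1 unit "into" the open book, in book coordinates.
element_local = (1.0, 0.0)

book_pose_frame1 = (0.0, 5.0, 0.0)          # book lying flat
book_pose_frame2 = (math.pi / 2, 5.0, 0.0)  # book/page rotated 90 degrees

# Recomputing per frame keeps the element attached to the moving surface.
print(apply_pose(book_pose_frame1, element_local))  # (6.0, 0.0)
print(apply_pose(book_pose_frame2, element_local))  # approx. (5.0, 1.0)
```

A static-surface assumption amounts to caching the frame-1 result and never updating it, which is exactly why a turning page breaks the illusion without mesh updates like those GLDS0 is described as providing.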

Accomplishments that We're proud of

We are proud of the team we built and of the teamwork and collaboration demonstrated throughout the project. We are also proud of how much work we got done in such a short period; unfortunately, the technical difficulties we ran into made it hard to exhibit the artwork or the full intent of the idea. Everyone kept a positive attitude and had fun, making learning and growing as a team the focus, and everyone expressed gratitude for the opportunity to work with the Magic Leap. It was awesome!

What We Learned

We all learned the challenges of developing for Mixed Reality. Most of us had experience with Virtual Reality but little, if any, with Mixed Reality. In VR you have complete control over the environment; creating realities within actual reality is considerably more challenging, and also considerably more exciting.

What's next for "Mira Me!"

To work closely with the Magic Leap support team through the grant program and take this project on the road in the Mira Me truck!

Built With

Unity, Mixamo, FaceGen, Magic Leap One (Lumin OS)

Updates

Todd Colletti posted an update

Some very good, spirited work from the team here at the Magic Leap Hackathon sponsored by AT&T. Interaction with real-world objects is non-trivial but exciting, especially with regard to animations that must change context as the point of view changes with the headgear. The Magic Leap headset is able to adjust its approximation of the mesh that describes control surfaces, allowing content developers to rely on the Lumin OS for most of the heavy lifting when placing elements of a scene. It’s something that must be seen firsthand to appreciate what might be described as a ‘state change’ between the common form (traditional VR) and what Magic Leap represents. Extraordinary technology for an entirely new level of original content.
