AT&T and Magic Leap Hackathon Application

Experience with Mixed Reality and/or Spatial Computing

Greg Madison is a speculative designer and experimentalist who has been involved in spatial computing since 2004. As a former magician, he combines his expertise in new technologies with his experience as a designer of grand illusions, defying the laws of physics to invent new interfaces.

Being a finalist in the 2009 Microsoft Imagine Cup in the "Human Machine Design Interface" category allowed him to be noticed and to join the R&D team of an IT company.

Jasmine Roberts is an XR software engineer and designer who has been involved in spatial computing since 2013. Before foraying into augmented and virtual reality, Jasmine studied materials physics and electrical engineering, where she solved abstract and computational dimensionality problems. She brings her mathematical prowess and knowledge of 3D space to her spatial computing work. Jasmine is on the organizing committee for MIT's Reality, Virtually Hackathon and is a former Mozilla XR Studio and Oculus LaunchPad member.

Application idea

For this hackathon we would like to create an immersive tool that allows the user to perceive digital signals. We seek to turn invisible data into something legible. This data includes, but is not limited to, ambient light data, network signal data, magnetic field data, and other types of data that can be detected by sensors on mobile devices. With this tool the user could potentially visualize the radiation field emitted by a microwave oven, find where the Wi-Fi signal is strongest, or learn where the light is best for placing a plant.

At a macroscopic level, interaction design involves macroscale, tangible objects; however, fundamental physical interactions like gravity, electromagnetism, and the strong and weak forces constitute our material world. We know how macroscopic objects like tables, windows, and lights affect our lives, but we do not have a sense of how these invisible forces affect our lives.

The tool we intend to build leverages the sensing of both the Magic Leap and iOS devices. The mobile phone serves as a stand-in for a sensor that gathers world data. Using Magic Leap's tracking, we can determine where the mobile phone is in space. Using the Unity engine, we can interpret the data and make meaningful visualizations of these invisible data fields.
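The core data structure this implies is a stream of sensor readings, each tagged with the phone's tracked world-space position. A minimal sketch in Python (names and values are illustrative; the real pipeline would read the phone's pose from Magic Leap tracking and the value from an iOS sensor API):

```python
from dataclasses import dataclass

@dataclass
class SpatialSample:
    """One sensor reading tagged with the phone's tracked position."""
    x: float      # world-space position of the phone (meters)
    y: float
    z: float
    kind: str     # e.g. "wifi_rssi", "ambient_light", "magnetic_field"
    value: float  # raw sensor reading

# A toy stream of samples as the user sweeps the phone through the room.
samples = [
    SpatialSample(0.0, 1.0, 0.0, "wifi_rssi", -42.0),
    SpatialSample(2.0, 1.0, 1.0, "wifi_rssi", -67.0),
    SpatialSample(0.0, 1.0, 0.0, "ambient_light", 320.0),
]

# Filter the stream down to one signal type before visualizing it.
wifi = [s for s in samples if s.kind == "wifi_rssi"]
```

Keeping every reading in one uniform record makes it easy to add new sensor types later without changing the visualization layer.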

This tool could expand into wearables, with which we could quantify where users feel peaceful and show a heat map of emotions within a given space.

Feasibility of application idea

Our application is feasible, and we have already begun thinking through the tools, frameworks, and systems we would need to fully implement our prototype. iOS phones provide the reception data and position, while the Magic Leap affords visualization and interaction.

Can the idea be built with minimum functionality in the time allowed?

The minimum viable product (MVP) can be built within the allotted time. For the MVP of this hackathon (to align with both AT&T's and Magic Leap's missions), we will first visualize network connectivity so users can see signal strength and know where to position a device (e.g., a wireless router) for the strongest signal.
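For visualizing signal strength between sampled points, a standard starting point is the log-distance path-loss model, which estimates RSSI as a function of distance from the transmitter. A sketch (the calibration constants `rssi_1m` and `path_loss_exp` are assumed values that would be fit to real measurements):

```python
import math

def rssi_at_distance(d_m, rssi_1m=-40.0, path_loss_exp=2.0):
    """Estimate Wi-Fi RSSI (dBm) at distance d_m from the router using
    the log-distance path-loss model:

        RSSI(d) = RSSI(1 m) - 10 * n * log10(d)

    where n is the path-loss exponent (2.0 in free space, higher
    indoors). d_m is clamped to avoid log10(0) at the source.
    """
    return rssi_1m - 10.0 * path_loss_exp * math.log10(max(d_m, 0.01))

# At 1 m we recover the reference value; at 10 m the free-space model
# predicts a 20 dB drop.
near = rssi_at_distance(1.0)   # -40.0
far = rssi_at_distance(10.0)   # -60.0
```

Comparing the model's prediction against the phone's measured RSSI at each tracked position would let the MVP interpolate signal strength in the unsampled parts of the room.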

We will need to parse the data, determine the relevant user interactions and interface, and fold the other sensor information into the same pass. Each spatial voxel of the room will be annotated with digital information about the state of the world at that point.
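The per-voxel annotation can be sketched by quantizing each sampled position into a grid cell and averaging the readings that fall inside it. A minimal sketch, assuming 25 cm voxels and Wi-Fi RSSI readings (cell size and values are illustrative):

```python
from collections import defaultdict

def voxel_key(x, y, z, size=0.25):
    """Quantize a world-space position (meters) into a voxel index,
    using floor division so each 25 cm cube maps to one integer triple."""
    return (int(x // size), int(y // size), int(z // size))

# Toy (x, y, z, rssi) samples; two fall inside the same voxel.
readings = [
    (0.10, 1.00, 0.10, -42.0),
    (0.20, 1.05, 0.15, -44.0),
    (2.00, 1.00, 1.00, -67.0),
]

grid = defaultdict(list)
for (x, y, z, rssi) in readings:
    grid[voxel_key(x, y, z)].append(rssi)

# Average the readings per voxel to get one value to render per cell.
averages = {k: sum(v) / len(v) for k, v in grid.items()}
```

In Unity, each voxel key would then drive the color or opacity of a small cube (or a point in a particle field) placed at the cell's center.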

Ability to attend the in-person hackathon Nov. 9 - 11

Both Greg and Jasmine are able to attend the hackathon from Nov. 9-11.
