Inspiration

In field science (the study of our natural surroundings), measurement and analysis are a major bottleneck. Out in the field, time is spent making repetitive spatial measurements (position, distance, size, count, etc.) with specialized tools. Back in the lab, these measurements are then analyzed with computer software. The questions are exciting, but the manual measurement and the delayed analysis turnaround are often tedious, monotonous, and even prohibitive.

What it does

An AR application that turns the spatial mapping capabilities of devices like the Magic Leap into a scientific tool: it helps scientists ask quantitative, spatial questions and get answers back in a HUD-like fashion in real time, a quantitative 'sixth sense'. Devices of this sort will replace many specialized measurement tools in field science and enable a more organic scientific process, one less disjointed by the discrete steps of measure -> record -> analyze.
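Below is a minimal sketch of that core interaction, assuming the spatial map is generated with mesh colliders attached (as Unity meshing setups typically allow); the class name, the space-bar input, and the 10 m ray length are placeholders for illustration, not the project's actual code:

```csharp
using UnityEngine;

// Hypothetical sketch: measure the straight-line distance between two points
// sampled off the world mesh by gazing at them and pressing a key.
public class DistanceMeasurer : MonoBehaviour
{
    private Vector3? firstPoint;      // first sampled endpoint, if any
    public float lastDistanceMeters;  // a HUD text element could display this

    void Update()
    {
        // Placeholder input: space bar in the emulator; a controller trigger on device.
        if (!Input.GetKeyDown(KeyCode.Space)) return;

        // Cast a ray from the headpose (main camera) into the meshed environment.
        var gaze = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        if (!Physics.Raycast(gaze, out RaycastHit hit, 10f)) return;

        if (firstPoint == null)
        {
            firstPoint = hit.point;   // store the first endpoint
        }
        else
        {
            lastDistanceMeters = Vector3.Distance(firstPoint.Value, hit.point);
            Debug.Log($"Distance: {lastDistanceMeters:F3} m");
            firstPoint = null;        // reset for the next measurement
        }
    }
}
```

On device, the same idea would subscribe to a controller trigger instead of a key press and bind lastDistanceMeters to a head-locked text element.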

How I built it

C# -> Unity3D -> Magic Leap Emulator -> Deploy to Device

Challenges I ran into

Because Magic Leap's hardware and software are brand new, there is very little web-searchable content. Instead, to design functionality and troubleshoot, I was limited to primary documentation (much of it hidden 'under the hood') and asking questions on the forums.

Accomplishments that I'm proud of

I have 5 years of research programming experience in data analysis and agent-based simulation, but before this project I had never:
- developed an application
- used Unity (or any other graphics engine)
- written C# (just Python and MATLAB)
- owned or worked with an AR device
So I went from zero application-development experience to a functioning proof of concept in ~3 weeks.

What's next for Look.

Develop a user interface as well as general function packages for:
- observation/data storage (see the sketch after this list)
- analysis/plotting
- object selection
- modular, user-directed tool development
- a user-guided meshing protocol
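As a rough idea of what observation/data storage could look like (a hypothetical sketch; the Observation fields, class names, and CSV layout are placeholders, not part of the current build), each measurement would be captured as a small record and exported for later analysis:

```csharp
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using UnityEngine;

// Hypothetical observation record: a labeled, timestamped measurement plus the
// world-space position where it was taken.
[System.Serializable]
public struct Observation
{
    public string label;        // e.g. "leaf length"
    public float value;         // measured value, in meters
    public Vector3 position;    // location in the mapped space
    public string timestampUtc; // ISO-8601 timestamp
}

public class ObservationLog : MonoBehaviour
{
    private readonly List<Observation> observations = new List<Observation>();

    public void Add(string label, float value, Vector3 position)
    {
        observations.Add(new Observation
        {
            label = label,
            value = value,
            position = position,
            timestampUtc = System.DateTime.UtcNow.ToString("o")
        });
    }

    // Dump everything collected so far to a CSV file that desktop analysis
    // scripts (e.g. Python/pandas) can read later.
    public void ExportCsv(string fileName = "observations.csv")
    {
        string path = Path.Combine(Application.persistentDataPath, fileName);
        using (var writer = new StreamWriter(path))
        {
            writer.WriteLine("label,value,x,y,z,timestamp_utc");
            foreach (var o in observations)
            {
                writer.WriteLine(string.Join(",",
                    o.label,
                    o.value.ToString(CultureInfo.InvariantCulture),
                    o.position.x.ToString(CultureInfo.InvariantCulture),
                    o.position.y.ToString(CultureInfo.InvariantCulture),
                    o.position.z.ToString(CultureInfo.InvariantCulture),
                    o.timestampUtc));
            }
        }
    }
}
```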

Built With

C#, Unity3D, Magic Leap