Inspiration

People run into all kinds of difficulties when trying to navigate giant indoor environments such as airports, conference venues, hotels, and malls. You either have to wait for a representative to guide you or hopelessly look for signs leading to the restroom. Maybe you have a connecting flight and it's your first time at LAX, or you just have to pee at WWDC 2018. Either way, a solution that plots navigation points on the floor itself might be helpful to you!

What it does

It localizes the user with a combination of techniques and then plots a path to their destination, rendering AR markers all along the way. The path is drawn with arrows that point into each turn, so you always know which way to go next.
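To make the path-plotting step concrete, here is a minimal sketch, not our exact code: it assumes the venue has been pre-baked as a Unity NavMesh, that `arrowPrefab` is a placeholder marker prefab, and it leaves out the ARCore/Vuforia anchoring details.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.AI;

// Sketch: drop arrow markers along a NavMesh path from the user's
// localized position to a chosen destination.
public class PathPlotter : MonoBehaviour
{
    public GameObject arrowPrefab;      // placeholder arrow/marker prefab
    public float markerSpacing = 1.0f;  // metres between markers

    readonly List<GameObject> markers = new List<GameObject>();

    public void PlotPath(Vector3 userPosition, Vector3 destination)
    {
        ClearMarkers();

        var path = new NavMeshPath();
        // Assumes the venue geometry has been baked into a NavMesh.
        if (!NavMesh.CalculatePath(userPosition, destination, NavMesh.AllAreas, path))
            return;

        // Walk each straight segment of the path and drop a marker every
        // markerSpacing metres, pointing toward the next corner (the turn).
        for (int i = 0; i < path.corners.Length - 1; i++)
        {
            Vector3 from = path.corners[i];
            Vector3 to = path.corners[i + 1];

            float length = Vector3.Distance(from, to);
            if (length < 0.01f) continue;

            Quaternion facing = Quaternion.LookRotation(to - from);
            for (float d = 0f; d < length; d += markerSpacing)
            {
                Vector3 pos = Vector3.Lerp(from, to, d / length);
                markers.Add(Instantiate(arrowPrefab, pos, facing));
            }
        }
    }

    void ClearMarkers()
    {
        foreach (var m in markers) Destroy(m);
        markers.Clear();
    }
}
```

In the real app the markers also have to be anchored into the AR session so they stay glued to the floor as the user walks.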

How we built it

We built it using Unity, ARCore, Qualcomm Vuforia, and, of course, C#.

Challenges we ran into

Quite a lot, actually. Some of them were:
1) Not knowing anything whatsoever about AR and how to use it
2) Mapping the RIMAC Arena (we used another AR app for this at first but later decided to build something of our own)
3) Aligning the rendered path with the direction the user is facing (see the sketch after this list)
4) Localizing the user in an indoor environment where GPS doesn't help
5) Scaling the virtual environment to match the real world
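For challenge 3, here is a rough sketch of the kind of compass-based alignment we experimented with, not our exact code. It assumes the virtual venue (`venueRoot`, a hypothetical parent of all navigation content) is authored with its forward axis pointing at real-world north, that `arCamera` is the AR camera driven by ARCore/Vuforia, and that location permission has been granted so the compass reports a true heading.

```csharp
using UnityEngine;

// Sketch: rotate the virtual venue so its mapped "north" lines up with the
// real-world heading reported by the phone's compass.
public class HeadingAligner : MonoBehaviour
{
    public Transform venueRoot;   // assumed parent of all virtual navigation content
    public Camera arCamera;       // the AR camera driven by the AR SDK

    void Start()
    {
        // trueHeading needs location services running on most devices.
        Input.location.Start();
        Input.compass.enabled = true;
    }

    public void Align()
    {
        // How far the real device is rotated away from true north right now...
        float deviceHeading = Input.compass.trueHeading;
        // ...versus how far the AR camera is rotated in the virtual scene.
        float cameraYaw = arCamera.transform.eulerAngles.y;

        // Rotate the venue so that real-world north and the venue's mapped
        // north point the same way, which keeps the rendered path aligned
        // with the direction the user is actually facing.
        venueRoot.rotation = Quaternion.Euler(0f, cameraYaw - deviceHeading, 0f);
    }
}
```

The compass is noisy indoors, which is part of why localization (challenge 4) was so hard; in practice this only gives a rough initial alignment.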

Accomplishments that we're proud of

1) When developed further, this scalable system has two parts: one for the host and one for the client. The host can walk around the venue marking points of interest, which can later be retrieved by every client attending the event (a rough data-model sketch follows this list)
2) Here at the RIMAC Arena, we can take you to the restroom using this system if you tell us your table number!
3) As we speak, a "Pacman" game mode is under development. We figured that if we already have a bunch of balls tracked onto walkable surfaces, we might as well have some fun with them too!
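A hypothetical data model for that host/client split could look like the following; the class and field names are illustrative, not our actual schema, and a real deployment would push the JSON to a backend that clients query at the event.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Illustrative data model: the host app saves the points of interest it has
// marked, and client apps load them later.
[Serializable]
public class PointOfInterest
{
    public string name;       // e.g. "Restroom", "Table 42"
    public Vector3 position;  // position in the shared venue coordinate frame
}

[Serializable]
public class VenueMap
{
    public string venueName;
    public List<PointOfInterest> points = new List<PointOfInterest>();

    // JsonUtility keeps the sketch simple and dependency-free.
    public string ToJson()
    {
        return JsonUtility.ToJson(this);
    }

    public static VenueMap FromJson(string json)
    {
        return JsonUtility.FromJson<VenueMap>(json);
    }
}
```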

What we learned

1) Several services that support AR such as ARCore, Vuforia, 8thwall, etc.
2) Using several orientation sensors available on smartphones
3) Integrating and setting up different working environments

What's next for Augmented Reality assisted Hyperlocal Navigation

1) Using IR beacons to support larger environments such as airports by dividing the virtual scene into several smaller ones.
2) Development of a point of interest marker application
3) Shared AR experiences, such as marking the live location of important people or things so you can meet or get to them.
4) Collecting "walking data", i.e., how many people go to which places. This data can be highly useful to malls and other commercial venues for arranging items within a space to increase sales. Airports could benefit too, for example by learning exactly where moving walkways would most improve commute times.
5) Repurposing the system to help blind people navigate their homes by adding audio cues to steps and turns (a rough sketch follows this list).
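For idea 5, here is one way audio cues could be attached to turns; this is a sketch under assumptions, with `turnClip` as a placeholder sound and the turn positions assumed to come from the path plotter above.

```csharp
using UnityEngine;

// Sketch: play a spatialized audio cue when the user gets close to the next
// turn on the path, so steps and turns are announced without a screen.
public class TurnAudioCue : MonoBehaviour
{
    public AudioClip turnClip;          // placeholder "turn here" sound
    public Transform user;              // typically the AR camera transform
    public float triggerDistance = 2f;  // metres before the turn

    Vector3[] turns;    // positions of upcoming turns (path corners)
    int nextTurn;

    public void SetTurns(Vector3[] turnPositions)
    {
        turns = turnPositions;
        nextTurn = 0;
    }

    void Update()
    {
        if (turns == null || nextTurn >= turns.Length) return;

        if (Vector3.Distance(user.position, turns[nextTurn]) < triggerDistance)
        {
            // Play the cue at the turn itself, so the sound appears to come
            // from the direction the user needs to go.
            AudioSource.PlayClipAtPoint(turnClip, turns[nextTurn]);
            nextTurn++;
        }
    }
}
```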


P.S. Obviously, a better UI too!

Built With

Unity, ARCore, Qualcomm Vuforia, C#