Screenshot: menu to add objects and toggle the floor plan
Screenshot: signs displayed in AR
When we visit new places, whether airports, museums, or just unfamiliar buildings around campus, it can be hard to find useful things like water fountains or restrooms. We thought it would be really cool and useful to simply look through your phone's camera and see, in augmented reality, where everything is, helping you find your way around.
What it does
Our app shows maps of building interiors in augmented reality. So far, it features:
- Water fountains
- Storke Tower
- Floor plans of the building
By giving you a life-size map and the ability to see points of interest through walls, our app makes navigating new buildings easy.
Event planners, building managers, and other users can add their own "signs" over key areas right from the app, allowing us to crowdsource map generation.
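As a rough sketch of what adding a sign could look like (the app's actual schema and database paths aren't shown in this write-up, so the field names and `'signs'` path below are assumptions), a client builds a small record and pushes it to the Realtime Database:

```javascript
// Hypothetical sign record for crowdsourced map data; the real app's
// schema may differ. The push itself would use the Firebase v8
// Realtime Database API, shown in the comment below.
function buildSign(type, coords) {
  return {
    type,                        // e.g. "water_fountain" (assumed label)
    latitude: coords.latitude,   // absolute GPS position of the sign
    longitude: coords.longitude,
    altitude: coords.altitude,
    createdAt: Date.now(),       // timestamp for later moderation/sorting
  };
}

// In the app, saving would look roughly like:
//   firebase.database().ref('signs').push(buildSign('water_fountain', fix));
```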
How I built it
We built this app with a Firebase serverless backend and Expo.io (a platform for React Native development) for the frontend. We used React Native's geolocation API and the device's orientation in 3D space to calculate the position of each map relative to your device, given its absolute latitude, longitude, and altitude. Our custom algorithm, arguably the first of its kind, lets us map the real world to the AR world using geolocation alone. We used Firebase's real-time database to store all our map data; its serverless API let us skip the middleman of a server and have our app talk directly to the datastore. Real-time updates and Google scalability came effortlessly with Firebase.
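The core of that mapping can be sketched as follows. This is a simplified illustration, not our exact production code: it assumes the AR world is aligned to true north (as with ARKit's gravity-and-heading alignment) and uses an equirectangular approximation, which is accurate over building-sized distances:

```javascript
// Convert a sign's absolute GPS fix into AR coordinates relative to
// the device, assuming an AR frame of +x east, +y up, -z north.
const EARTH_RADIUS_M = 6371000;

function gpsToAr(device, sign) {
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(sign.latitude - device.latitude);
  const dLon = toRad(sign.longitude - device.longitude);
  const meanLat = toRad((sign.latitude + device.latitude) / 2);
  return {
    x: EARTH_RADIUS_M * dLon * Math.cos(meanLat), // meters east of device
    y: sign.altitude - device.altitude,           // meters above device
    z: -EARTH_RADIUS_M * dLat,                    // -z points north
  };
}
```

A sign 0.0001° of latitude north of the device (about 11 meters) comes out at roughly `z = -11.1`, directly ahead of a north-facing camera.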
Challenges I ran into
Large, highly precise AR environments are practically unheard of, and this project showed us firsthand the challenges that come with using AR to its full potential. Mapping real-world latitude and longitude to the unitless, relative coordinates of an AR framework is a tedious, difficult, and (until now) unsolved problem. The math involved in calculating the spatial coordinates of all the signs relative to our device was definitely the most challenging part. We used an embarrassing amount of paper navigating the trig functions, graphs, and various other calculations involved. The process was made even more difficult by the unpredictable and almost completely undocumented behavior of our 3D renderer. Reconciling the device's, the AR service's, and the real world's axes was definitely the most time-consuming aspect of this project. ARKit is a relatively new framework for iOS, so we had to battle through a lot of bugs to get working software.
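To illustrate the kind of axis reconciliation involved (a hedged sketch; the exact sign conventions vary by device and AR framework, and may need flipping in practice): if the AR session's forward axis points wherever the camera faced at launch rather than true north, a north-aligned offset can be rotated into the session frame using the compass heading captured at session start:

```javascript
// Rotate a north-aligned point about the vertical (y) axis by the
// session-start compass heading, to express it in the AR session's
// frame. The rotation direction shown here is one convention; real
// devices may require the opposite sign.
function rotateToSessionFrame(point, headingDeg) {
  const h = (headingDeg * Math.PI) / 180;
  return {
    x: point.x * Math.cos(h) - point.z * Math.sin(h),
    y: point.y, // vertical axis is unaffected
    z: point.x * Math.sin(h) + point.z * Math.cos(h),
  };
}
```

With this convention, a point one meter north (`{x: 0, z: -1}`) rotated by a 90° heading lands one meter along +x, and distances are preserved.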
And once our algorithms were complete, we ran into another problem. For reasons unknown to us, Google, or StackOverflow, the React Native geolocation services were wildly inaccurate! With less than 12 hours before the deadline, we were facing GPS errors larger than our entire map and no solution in sight. Only through arduous trials of many hacky, stopgap solutions were we able to compensate for enough of the error to make our map work.
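One plausible stopgap of the sort we reached for (the write-up doesn't record the exact fix, so this is an assumed illustration rather than our actual code) is to smooth the incoming fixes with an exponential moving average, so a single wild reading can't shift the whole map at once:

```javascript
// Returns a function that folds each new GPS fix into a running
// exponential moving average. Smaller alpha = heavier smoothing
// (slower to react, but more resistant to outlier readings).
function makeGpsSmoother(alpha = 0.2) {
  let est = null;
  return (fix) => {
    if (est === null) {
      est = { ...fix }; // seed the estimate with the first fix
    } else {
      est.latitude += alpha * (fix.latitude - est.latitude);
      est.longitude += alpha * (fix.longitude - est.longitude);
      est.altitude += alpha * (fix.altitude - est.altitude);
    }
    return { ...est };
  };
}
```

The trade-off is lag: the smoothed position trails the true one while walking, which matters less indoors where movement is slow.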
Accomplishments that I'm proud of
We've done something really, really novel. A lot of hackathon projects, ours included, generally involve pulling in external libraries, learning their API endpoints, and calling those for functionality. In this project, we wrote the algorithms from scratch to produce a cool app that uses augmented reality in a way that's never been done before. The technologies we used are so new that we had to face most challenges on our own, without the help of Google or StackOverflow.
Also, despite using it for the first time, we found Firebase user-friendly and easy to add to our project, so we're happy with how quickly that came together.
What I learned
What's next for PathfindAR
We want to add more "signs" and let users create custom ones instead of choosing from presets. We'd also like to improve our calculations to make the AR more accurate, and to polish the rendering.