We took on Fannie Mae's challenge to develop an app that intuitively matches prospective home buyers with houses that fit their price range and size requirements, using augmented reality.
What it does
Virtual Realestate lets the user explore a map of nearby houses in augmented reality, displayed on a floor, table, or other flat surface. The user can tap a house to view more detailed information, or manipulate the map to explore a different area.
How we built it
We built the app in Swift for iOS, integrating Apple Maps into augmented reality, and used the Zillow API to find and display information about each house.
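To give a sense of how the Zillow data flows into the app, here is a sketch of the decoding step, assuming a simplified listing shape; the real Zillow API response has different field names, and `HouseListing` is a hypothetical type, not our exact code.

```swift
import Foundation

// Hypothetical shape of a listing; the real Zillow API response
// has different fields. This only sketches the Codable approach.
struct HouseListing: Codable {
    let address: String
    let price: Int        // listing price in USD
    let squareFeet: Int
    let latitude: Double
    let longitude: Double
}

// Decode a sample JSON payload the same way an API response would be decoded.
let sampleJSON = """
[{"address": "123 Main St", "price": 250000, "squareFeet": 1400,
  "latitude": 38.89, "longitude": -77.03}]
""".data(using: .utf8)!

let listings = try! JSONDecoder().decode([HouseListing].self, from: sampleJSON)
print(listings[0].address)  // "123 Main St"
```

Each decoded listing carries the coordinates needed to place a marker on the AR map and the details shown when the user taps it.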
Challenges we ran into
Some challenges we ran into were displaying the map on the ground without it stuttering and incorporating the information from the Zillow API into the graphics on the map. We originally tried building on Android, but the Google Maps API wasn't working, so we switched to Swift on iOS instead.
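One way to tame that kind of stutter is to ignore tiny anchor updates so the map node only moves when the detected plane shifts meaningfully. The sketch below illustrates the idea with simplified stand-in types rather than ARKit's actual anchor transforms; names and the threshold value are assumptions, not the project's exact code.

```swift
import Foundation

// Damping sketch: only reposition the AR map when the plane estimate
// moves by more than a small threshold, so per-frame jitter is ignored.
struct Position { var x: Double, y: Double, z: Double }

struct MapPlacement {
    private(set) var position: Position
    let threshold: Double  // meters; moves smaller than this are ignored

    // Returns true if the map was actually repositioned.
    mutating func update(to new: Position) -> Bool {
        let dx = new.x - position.x
        let dy = new.y - position.y
        let dz = new.z - position.z
        let distance = (dx * dx + dy * dy + dz * dz).squareRoot()
        guard distance > threshold else { return false }
        position = new
        return true
    }
}

var placement = MapPlacement(position: Position(x: 0, y: 0, z: 0),
                             threshold: 0.01)
let jitter = placement.update(to: Position(x: 0.002, y: 0, z: 0)) // ignored
let move = placement.update(to: Position(x: 0.5, y: 0, z: 0))     // applied
print(jitter, move)  // false true
```

The same filtering applies whether updates come from plane re-estimation or from the user deliberately dragging the map.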
Accomplishments that we're proud of
We ended up with a successful product that both displays a map in augmented reality on a flat surface and shows detailed information about each house when the user taps it on the map. Customers can now use the app to evaluate houses they are interested in based on the information we provide about each one.
What we learned
We learned how to render an app view inside an augmented reality scene and layer UI elements in real time over both the camera feed and the map. We also used the Google Cloud Platform to sync map locations and anchors across multiple users, so people can collaborate, look at the same map and houses together, and make a purchase decision.
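Syncing a session across users comes down to serializing a small shared state (the anchor's world position plus the visible map region) and pushing it through the cloud. The payload below is a hedged illustration of that idea; the struct and its field names are hypothetical, not the actual schema used with Google Cloud.

```swift
import Foundation

// Hypothetical payload synced through the cloud so multiple users see
// the same AR map: the anchor's world position plus the map region.
// Field names are illustrative only.
struct SharedMapState: Codable, Equatable {
    var anchorX: Float
    var anchorY: Float
    var anchorZ: Float
    var centerLatitude: Double
    var centerLongitude: Double
    var spanMeters: Double
}

let state = SharedMapState(anchorX: 0, anchorY: -0.5, anchorZ: -1,
                           centerLatitude: 38.9, centerLongitude: -77.0,
                           spanMeters: 500)

// Round-trip through JSON, the way the state travels over the network.
let data = try! JSONEncoder().encode(state)
let received = try! JSONDecoder().decode(SharedMapState.self, from: data)
assert(received == state)
```

Because each client re-anchors the map locally, only this small state needs to travel; the heavy map rendering stays on-device.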
What's next for Virtual Realestate
We are interested in doing more advanced analysis on the houses displayed in the app, including highlighting those that fit a given user's preferences and using machine learning to automatically rate how well a house "fits" a user.
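Before training a model, a rule-based baseline could define the "fit" score. Here is a minimal sketch, assuming two preference signals (budget and desired size) with equal weights; the function, field names, and weights are all hypothetical starting points, not a committed design.

```swift
import Foundation

// Rule-based baseline for a "fit" score in [0, 1]: how close a house's
// price and size are to the user's stated preferences.
struct Preferences {
    var maxPrice: Double
    var desiredSquareFeet: Double
}

func fitScore(price: Double, squareFeet: Double, prefs: Preferences) -> Double {
    // Price component: 1.0 at or under budget, decaying as price exceeds it.
    let priceScore = min(1.0, prefs.maxPrice / max(price, 1))
    // Size component: penalize relative deviation from the desired size.
    let sizeScore = max(0.0, 1.0 - abs(squareFeet - prefs.desiredSquareFeet)
                                    / prefs.desiredSquareFeet)
    // Equal weighting as a starting point; a trained model would learn these.
    return 0.5 * priceScore + 0.5 * sizeScore
}

let prefs = Preferences(maxPrice: 300_000, desiredSquareFeet: 1500)
let score = fitScore(price: 250_000, squareFeet: 1500, prefs: prefs)
print(score)  // 1.0 (under budget and exact size match)
```

A learned model could later replace the hand-set weights using which listings a user actually taps and saves as the training signal.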