Inspiration
At last year's Worldwide Developers Conference, Apple introduced a host of innovative frameworks, including CoreML and ARKit, that put traditionally expensive and complex capabilities such as machine learning and augmented reality in the hands of developers like me. I wanted to seize that opportunity at PennApps this year, and Lyft's powerful yet approachable API (and SDK!) struck me as the perfect match for ARKit.
What it does
Wher integrates these technologies with Lyft to make finding and requesting a ride easier, safer, and even more entertaining. One issue with overhead navigation is, quite simply, the lack of a third dimension. A traditional overhead view tends to complicate on-foot navigation more than it helps, and, more importantly, it forces users to bury their faces in their phones. That pulls attention away from the user's surroundings and poses a threat to their safety, especially in busy cities. Wher addresses these concerns by bringing the Lyft experience into augmented reality, letting users truly see the locations of their driver and destination, pay more attention to where they are going, and have a more enjoyable, modern experience in the process.
How I built it
I built Wher using several of Apple's frameworks, including ARKit, MapKit, CoreLocation, and UIKit, which provided the foundation for the app and the "scene" needed to create and display an augmented reality plane. Using the Lyft API, I gathered information about available drivers in the area, including their real-time positions, cost, ETA, and the service they offered. That information populates the scene and feeds a deep link into the Lyft app itself, where the user requests the ride and completes the transaction.
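As a rough sketch of that flow in Swift: the endpoint path, query parameters, and response shape below are assumptions for illustration (modeled on Lyft's public v1 API), not verbatim from my code, and Lyft's real response nests drivers by ride type.

```swift
import Foundation
import CoreLocation
import UIKit

/// Assumed shape of a nearby driver for this sketch; adjust to Lyft's actual schema.
struct NearbyDriver: Decodable {
    let lat: Double
    let lng: Double
}

func fetchNearbyDrivers(around coordinate: CLLocationCoordinate2D,
                        accessToken: String,
                        completion: @escaping ([NearbyDriver]) -> Void) {
    // Assumed endpoint: GET /v1/drivers with lat/lng query parameters.
    var components = URLComponents(string: "https://api.lyft.com/v1/drivers")!
    components.queryItems = [
        URLQueryItem(name: "lat", value: String(coordinate.latitude)),
        URLQueryItem(name: "lng", value: String(coordinate.longitude))
    ]
    var request = URLRequest(url: components.url!)
    request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")

    URLSession.shared.dataTask(with: request) { data, _, _ in
        let drivers = data.flatMap { try? JSONDecoder().decode([NearbyDriver].self, from: $0) } ?? []
        DispatchQueue.main.async { completion(drivers) }
    }.resume()
}

/// Hands the actual request off to the Lyft app via its URL scheme.
/// Lyft's documented deep link uses bracketed keys like pickup[latitude];
/// URLComponents percent-encodes the brackets, which should decode equivalently.
func requestRide(pickup: CLLocationCoordinate2D) {
    var link = URLComponents(string: "lyft://ridetype")!
    link.queryItems = [
        URLQueryItem(name: "id", value: "lyft"),
        URLQueryItem(name: "pickup[latitude]", value: String(pickup.latitude)),
        URLQueryItem(name: "pickup[longitude]", value: String(pickup.longitude))
    ]
    if let url = link.url { UIApplication.shared.open(url) }
}
```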
Challenges I ran into
While both Apple's well-documented frameworks and Lyft's API flattened the project's learning curve, several technical hurdles still had to be overcome. The first was Lyft's API itself: while great in many respects, Lyft had yet to release a branch compatible with Swift 4 and iOS 11 (required for ARKit), so I had to rewrite portions of their LyftURLEncodingScheme and LyftButton classes to continue with the project. Another challenge was finding a way to represent a variance in coordinates and "simulate distance" so the AR experience would feel authentic; like the first challenge, this became manageable with enough thought and math (sketched below). One of the last significant challenges was drawing driver "bubbles" on the AR plane without graphics glitches.
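To give a flavor of the distance math: the latitude/longitude delta between the user and a driver is converted into meters (an equirectangular approximation is plenty accurate at city scale), then scaled down into scene units. A minimal sketch, assuming the session runs with ARWorldTrackingConfiguration.worldAlignment = .gravityAndHeading so the scene axes align with compass directions (+x east, +z south); the scale factor here is illustrative, not my actual value:

```swift
import ARKit
import CoreLocation

/// Approximate meters per degree of latitude (good enough at city scale).
let metersPerDegree = 111_111.0

/// Converts a driver's GPS coordinate into an AR scene position relative to
/// the user. Assumes .gravityAndHeading world alignment: +x east, +z south.
/// `scale` compresses real-world meters so distant drivers stay in view.
func scenePosition(for driver: CLLocationCoordinate2D,
                   relativeTo user: CLLocationCoordinate2D,
                   scale: Double = 0.05) -> SCNVector3 {
    let northMeters = (driver.latitude - user.latitude) * metersPerDegree
    let eastMeters = (driver.longitude - user.longitude) * metersPerDegree
        * cos(user.latitude * .pi / 180) // longitude degrees shrink with latitude

    // ARKit's +z axis points south under .gravityAndHeading, so negate north.
    return SCNVector3(Float(eastMeters * scale),
                      0,
                      Float(-northMeters * scale))
}
```

Each driver bubble then becomes an SCNNode added to the scene at the returned position.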
Accomplishments that I'm proud of
Despite the many challenges this project presented, I am very happy that I persisted and saw it through. Most importantly, I'm proud of just how cool it is to see something so simple represented in AR, and how different it feels from a traditional 2D view. I am also proud to say this is something I can see myself using any time I need to catch a Lyft.
What I learned
With PennApps being my first hackathon, I was unsure what to expect and what exactly I wanted to accomplish. I greatly overestimated how many features I could fit into Wher and was forced to cut back, which taught me a valuable lesson in managing scope and expectations.
What's next for Wher (with Lyft)
In the short term, I plan to add a social aspect that lets friends organize and mark designated meet-up spots for a Lyft, to greatly simplify the process of a night out on the town. In the long term, I hope to be speaking with Lyft!
Built With
- arkit
- core-location
- lyft
- lyft-api
- lyft-sdk
- mapkit
- swift
- uikit