Inspiration

The inspiration for this project came from our own experiences and frustrations using the LA Metro subway system. We found the system as a whole hard to navigate, and it was difficult to take a spontaneous trip to explore a specific station. To validate this hypothesis, we launched a Google survey on social media asking people to rate various aspects of the LA Metro subway system. The results were pretty shocking. For the question "How efficient is the current metro system in terms of getting from point A to point B?", only 1 of 51 respondents gave a 5 out of 5, and the average score was a dismal 2.71. For the follow-up question "How would you rate the metro system overall?", nobody answered 5 out of 5 and the average score ended up at 2.82. All of this motivated us to create a simple, unique solution that helps users both navigate the metro system and understand the value of individual stations.

What it does

Our solution is Metro AR, an AR filter that helps the end user through several small features tied into two overall views: an overall map view and a station map view.

The overall map view is the first thing a user sees: the full LA Metro map laid out with its lines and keys. The user can scroll and zoom to explore the map, and can focus on specific lines by tapping them in the key, toggling each line on and off. A button on the left side (marked with a curved arrow) provides directions. Pressing it opens a popup where the user enters a start station ("from" field) and a destination station ("to" field) and presses submit; a second popup then shows text directions from the start station to the destination. Combined with the line-toggle feature, this helps a user understand the details of a planned trip and how to effectively get "from point A to point B".

Tapping a specific station brings the user to the second view, the station map. As with the overall map, the user can scroll and zoom here. At the bottom of this view is a small knob with four icons that control the state of a popup box: the first icon hides the popup if it is visible, the second shows the minutes and seconds until the next train arrives at the station (a train model also appears on the station map when this timer drops to 0), the third shows the top-ranked attraction in the area, and the fourth (and final) icon shows the top-ranked restaurant in the area. A small sketch of this mapping is shown below.
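To make the popup behavior concrete, here is a minimal, purely illustrative sketch of how the four knob icons could map to the popup content. The station record shape, field names, and example values are assumptions for illustration, not the actual data used in the filter.

```javascript
// Minimal sketch of the station popup logic (illustrative only).
// The record shape and the example values are assumptions, not the
// actual data used in the filter.
const station = {
  name: '7th Street/Metro Center',
  nextTrainInSeconds: 312,              // hypothetical countdown value
  topAttraction: 'Example Attraction',  // hypothetical entry
  topRestaurant: 'Example Restaurant'   // hypothetical entry
};

// Maps the four knob icons (indices 0-3) to what the popup box should show.
function popupContent(selectedIndex, station) {
  switch (selectedIndex) {
    case 0:  // first icon: hide the popup entirely
      return null;
    case 1: {  // second icon: time until the next train (m:ss)
      const m = Math.floor(station.nextTrainInSeconds / 60);
      const s = station.nextTrainInSeconds % 60;
      return `Next train in ${m}:${String(s).padStart(2, '0')}`;
    }
    case 2:  // third icon: top-ranked nearby attraction
      return `Top attraction: ${station.topAttraction}`;
    case 3:  // fourth icon: top-ranked nearby restaurant
      return `Top restaurant: ${station.topRestaurant}`;
  }
}
```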

How we built it

This whole project was built with the assistance of two major tools: Blender and SparkAR. We used Blender to create the project's assets from scratch: the full map of the LA Metro subway system (with all of its lines, stations, and other intricacies), the keys on the sides of the overall map (showing the line colors and map details), and the metro train model that pops up when the next-train timer hits 0:00 on the station map screen.

On the programmatic side, we experimented with several SparkAR features to create the interactions in the project. The native patch editor handles zooming and scrolling the maps as well as toggling the metro lines on and off. Scripting handles the more complex interactions and functionality: the TouchGestures module detects whether a user tapped on different UI components (e.g., the directions box and individual stations); the Reactive module continuously monitors the time elapsed since the filter was opened (from a Time patch) and the current choice on the station screen (from a Native UI picker); the Patches module reads those two values from the patch editor and outputs whether the Native UI picker should be visible; and the NativeUI module lets the user enter start and end locations for the directions feature. A rough sketch of this wiring is shown below. We wanted to use the Networking module to connect to API services such as the LA Metro API and the TripAdvisor API, but that module was deprecated due to security issues. As a workaround, we built a "pseudoAPI" inside the script that acts as the single source of truth for the data behind directions, best attraction, best food, and time until the next train.
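To make this more concrete, here is a simplified, hypothetical sketch of how these modules can be wired together in a SparkAR script. The scene object, patch, and texture names are placeholders rather than the identifiers from our project, and the actual wiring in our filter differs.

```javascript
// Simplified sketch of the SparkAR wiring described above. Scene object,
// patch, and texture names ('station_arcadia', 'showStationView',
// 'showPicker', 'icon_*') are placeholders, not our actual identifiers.
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');
const Patches = require('Patches');
const NativeUI = require('NativeUI');
const Textures = require('Textures');
const Diagnostics = require('Diagnostics');

(async function () {
  // Detect taps on an individual station object on the overall map.
  const station = await Scene.root.findFirst('station_arcadia');
  TouchGestures.onTap(station).subscribe(() => {
    // Tell the patch editor to switch to the station map view.
    Patches.inputs.setBoolean('showStationView', true);
  });

  // Configure the Native UI picker (the knob with the four icons).
  const icons = await Promise.all([
    Textures.findFirst('icon_hide'),
    Textures.findFirst('icon_train'),
    Textures.findFirst('icon_attraction'),
    Textures.findFirst('icon_restaurant'),
  ]);
  NativeUI.picker.configure({
    selectedIndex: 0,
    items: icons.map(tex => ({image_texture: tex})),
  });

  // Let the patch editor drive whether the picker is visible.
  const showPicker = await Patches.outputs.getBoolean('showPicker');
  showPicker.monitor({fireOnInitialValue: true}).subscribe(event => {
    NativeUI.picker.visible = event.newValue;
  });

  // React to the user's choice on the station screen (indices 0-3).
  NativeUI.picker.selectedIndex.monitor().subscribe(event => {
    Diagnostics.log('Picker choice: ' + event.newValue);
  });
})();
```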

Challenges we ran into

We ran into a lot of challenges because we were beginners at both SparkAR and Blender. At the start of this competition, both of us had zero experience with AR development in SparkAR and zero experience with asset development in Blender. Even so, we have always admired advances in augmented reality and 3D asset development from afar; when the SparkAR beta was announced a while back we both signed up, but we never found the time to really dive in until this hackathon.

Beyond the natural hurdles that come with learning new software, SparkAR posed some unique challenges. For starters, SparkAR uses a reactive (not linear) style of JavaScript, where data flows as signal streams rather than constants. Wrapping our minds around just the basics of that proved tricky; a small example is shown below. Even past the basics we ran into quite a few hurdles, and interestingly, most of them were overcome with the help of the native documentation rather than forums like StackExchange.

One problem we had to creatively overcome was the deprecation of the Networking module. One of our big value propositions was supposed to be access to real-time data (such as the time until the next train), and without Networking we weren't able to hit the required API endpoints. To get around this, we "hardcoded" a pseudoAPI inside the script, which gave us the ability to easily add and manipulate data as well as simulate a JSON response.

Another big challenge was deploying the project to Spark AR Hub. We tried a multitude of different Facebook accounts and browsers, but we were still met with an "Unknown Error" message on the final screen. After a lot of repetition and persistence, we finally got the project to pass through and submit. Even though the project is submitted and the link is now available, the filter itself occasionally crashes in the Facebook app (possibly due to differences between phones). If you have trouble loading and running the demo link, feel free to check out this project in the linked GitHub repository and download it from there.
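As a small illustration of that reactive style (a hypothetical sketch, not code from our filter), a countdown is not a variable you read once; it is a signal you derive, watch, and subscribe to:

```javascript
// Simplified illustration of SparkAR's reactive scripting style (hypothetical,
// not code from our filter). Values are signals (streams), not constants.
const Time = require('Time');
const Reactive = require('Reactive');
const Diagnostics = require('Diagnostics');

// Time.ms is a ScalarSignal: a continuously updating stream of milliseconds
// since the effect started, not a number you read once.
const secondsSinceOpen = Time.ms.div(1000);

// A hypothetical five-minute countdown to the next train, derived from it.
const nextTrainInSeconds = Reactive.val(300).sub(secondsSinceOpen);

// You don't "print the value"; you watch the signal...
Diagnostics.watch('next train (s)', nextTrainInSeconds);

// ...or subscribe to an event derived from it and react when it fires.
nextTrainInSeconds.le(0).onOn().subscribe(() => {
  Diagnostics.log('Timer hit 0:00, time to show the train model.');
});
```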

Accomplishments that we're proud of

This section heavily overlaps with the previous one, because our biggest challenge forced us to learn SparkAR and Blender from scratch in order to succeed. Learning SparkAR was an interesting experience because, in terms of script debugging, it proved different from learning most other programming languages: there weren't many side resources (e.g., StackExchange, YouTube) to lean on outside of the native documentation. One example was editing and reading from the editable text feature. We had trouble figuring out how to actually set the editable text field and then read from it, and after a bit of trial and error and searching online, we found only a single page of the SparkAR documentation that was helpful; a sketch of that pattern is below. Ultimately though, this hackathon was great for both of us because it pushed us to learn an AR skillset that we have looked at and admired for quite some time.
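For reference, this is roughly the pattern for working with a text object's contents from script: a minimal sketch based on our reading of the SparkAR documentation, where the object name 'fromStation' is a placeholder rather than our actual scene name.

```javascript
// Minimal sketch of writing to and reading from a 2D text object in script.
// The object name 'fromStation' is a placeholder, not our actual scene name.
const Scene = require('Scene');
const Diagnostics = require('Diagnostics');

(async function () {
  const fromField = await Scene.root.findFirst('fromStation');

  // Writing: assign a plain string (or a StringSignal) to the text property.
  fromField.text = 'Enter start station';

  // Reading: text is a StringSignal, so monitor it for changes rather than
  // reading it once like an ordinary variable.
  fromField.text.monitor().subscribe(event => {
    Diagnostics.log('User entered: ' + event.newValue);
  });
})();
```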

What we learned

This section also overlaps with both of the previous ones. We are both really grateful for the learning opportunity we gained by entering this hackathon. We can now take the ability to create 3D assets and AR experiences and apply it to other projects we want to pursue. After this hackathon, we both agreed that we want to do a lot more SparkAR projects, casually for fun and competitively in the upcoming SparkAR hackathons.

What's next for Metro AR

Metro AR is a cool personal project, but there is still a lot of work to do before it is ready to be deployed in the real world. First, we need to connect all of the metro stations to both the script and appropriate station map assets; right now, only Arcadia and 7th Street/Metro Center hold relevant data and link to a corresponding station map. Since the framework is already in place (via the internal pseudoAPI), adding stations should be fairly easy. Second, we would like to build out the directions feature in a more robust and intuitive manner. Right now, text directions are displayed whenever a user enters a start and end location, based on our very limited internal API. We want a user to be able to find directions between any two stations, with the returned route highlighted on the map and easy to follow; one possible approach is sketched below. Finally, we would like to integrate APIs (LA Metro, TripAdvisor) if the Networking module gets restored, so that we can show actual real-time information for every station (e.g., time until the next train, best local attraction, best local restaurant). If we can complete all of those tasks, the natural next step would be to reach out to the Los Angeles County Metropolitan Transportation Authority and propose a partnership: we can offer them a really cool project built on real gathered data and clean homemade assets, and they can offer us a unique stamp of approval and a marketing platform.
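For the "directions between any two stations" goal, one possible approach (not what the current pseudoAPI does) would be to model the network as a graph and run a breadth-first search over it. The adjacency list below is a tiny, illustrative subset of the real network.

```javascript
// Possible approach for station-to-station directions: model the metro as a
// graph and run a breadth-first search. This is a sketch, not the current
// pseudoAPI logic; the adjacency list is a tiny illustrative subset.
const metroGraph = {
  '7th Street/Metro Center': ['Pershing Square', 'Pico'],
  'Pershing Square': ['7th Street/Metro Center', 'Civic Center/Grand Park'],
  'Pico': ['7th Street/Metro Center', 'Grand/LATTC'],
  'Civic Center/Grand Park': ['Pershing Square'],
  'Grand/LATTC': ['Pico'],
};

// Returns the list of stations from `start` to `end`, or null if unreachable.
function findRoute(graph, start, end) {
  const queue = [[start]];
  const visited = new Set([start]);
  while (queue.length > 0) {
    const path = queue.shift();
    const current = path[path.length - 1];
    if (current === end) return path;
    for (const next of graph[current] || []) {
      if (!visited.has(next)) {
        visited.add(next);
        queue.push([...path, next]);
      }
    }
  }
  return null;
}

// Example: findRoute(metroGraph, 'Grand/LATTC', 'Civic Center/Grand Park')
// -> ['Grand/LATTC', 'Pico', '7th Street/Metro Center', 'Pershing Square',
//     'Civic Center/Grand Park']
```

The returned station list could then drive both the text directions and the on-map route highlighting.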

Built With

Blender, SparkAR, JavaScript