The CSU Fresno campus is home to many wonderful trees. The existing tree walk route was decades old: many trees had been relocated or replaced, and the information was only available as a PDF. Students redesigned the tree walk route to focus on drought-tolerant, native, and indigenous trees (still a work in progress). The Tree Walk Guide project submitted here aims to create interactive tools that support the redesigned walk in the most modern way possible.

What it does

The Tree Walk Guide is an augmented-reality mobile application. With the help of the Geospatial API, visitors can locate the target trees along the route and easily access information about the walk stops. Even more information is available on a companion website, which leverages the 2D Google Maps API and contains extra data, links, and 360° photos of the stops.

The guide application also offers gamification: the user unlocks Play Games Services achievements when a walk-stop pin is clicked. On top of that, anyone can gather extra points by virtually watering the surrounding trees at the stops: tapping the shower icon identifies the trees in the current camera snapshot, and tapping any of them triggers a virtual watering along with extra experience points.

I hope that the information cards on one hand and the gamification on the other can capture the attention of all types of users and demographics. Accessibility is a top priority for the project, so both the companion website and the mobile application are bilingual (English and Spanish). The augmented-reality app can also narrate messages and information aloud if the feature is turned on.

How we built it

  • The augmented reality application's most important pillars are ARCore and the Geospatial API. They allow the tree walk route stops to be placed into real-world space by GPS coordinates. Each stop also has a geo-fenced rectangular area, marked by four posts, where the user can perform the virtual watering.
  • The Semantics API helps identify potential trees the player can earn experience points for by virtually watering them. This is an EAP feature and currently only available on Android, so the project is Android-only for now.
  • The gamification is based on the Google Play Games Services API.
  • The application dynamically downloads the current walk route from the companion website, so a new app release isn't needed when the route changes: the website metadata can be modified and the mobile app picks it up. The companion website is technically a Jekyll blog (Jekyll is a Ruby-based static site generator), and the metadata is human-readable: the non-localizable portion is in YAML, and the localizable texts are in Markdown front matter (YAML style) or Markdown / light HTML.
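
As an illustration of the geo-fenced watering area, here is a minimal, hypothetical sketch of the containment test, assuming the four post positions and the user's position have already been projected into local east/north meters (the class and method names are my own, not the project's):

```java
// Hypothetical sketch: decide whether the user stands inside a stop's
// watering area, assuming the four posts and the user position are given
// as local east/north coordinates in meters.
final class WateringFence {

    // Posts listed in order around the fence as {east, north} pairs.
    // Standard ray-casting point-in-polygon test: cast a ray east from
    // (e, n) and count how many fence edges it crosses.
    static boolean contains(double[][] posts, double e, double n) {
        boolean inside = false;
        for (int i = 0, j = posts.length - 1; i < posts.length; j = i++) {
            double ei = posts[i][0], ni = posts[i][1];
            double ej = posts[j][0], nj = posts[j][1];
            if ((ni > n) != (nj > n)
                    && e < (ej - ei) * (n - ni) / (nj - ni) + ei) {
                inside = !inside;  // each crossing toggles inside/outside
            }
        }
        return inside;
    }
}
```

In a real session the east/north offsets would come from the device's geospatial pose relative to the stop, but the containment logic itself is independent of ARCore.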

Challenges we ran into

  • The ARCore engine was not able to handle creating all of the anchors at once (even the ones that are not visible): 24 map pins plus 24 × 4 posts would mean 120 Earth anchors. Therefore I refactored the code to keep only the anchors of the current target stop and the main anchor of the next stop alive at a time.
  • I had trouble developing a shader for the Semantics API that would keep only the "tree" semantic pixels as a stencil and blend the camera texture so that the tree areas are lightly tinted with a color. I completely redesigned the watering workflow instead, and will revisit the shader.
  • Since the walk is specific to the CSU Fresno campus, I needed to develop a local debug mode.
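
The anchor-budget refactor above can be sketched roughly like this. `Anchor` is a stand-in for ARCore's anchor type (only `detach()` matters here), and the class name is hypothetical:

```java
// Illustrative sketch of the anchor-budget idea: instead of keeping Earth
// anchors for all 24 stops (and their 4 posts each) alive, only the current
// stop's anchors plus the next stop's main anchor exist at any time.
import java.util.ArrayList;
import java.util.List;

final class AnchorBudget {
    interface Anchor { void detach(); }  // stand-in for ARCore's Anchor

    private final List<Anchor> live = new ArrayList<>();

    // Called whenever the target stop changes: free the old anchors,
    // then register the new stop's anchors and the next stop's main pin.
    void retarget(List<Anchor> currentStopAnchors, Anchor nextStopMainAnchor) {
        for (Anchor a : live) {
            a.detach();  // release tracking resources for stale stops
        }
        live.clear();
        live.addAll(currentStopAnchors);
        live.add(nextStopMainAnchor);
    }

    int liveCount() { return live.size(); }
}
```

With this scheme the live anchor count stays at six per stop (one pin plus four posts, plus the next stop's pin) instead of 120.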

Accomplishments that we're proud of

  • Despite how hard it was to

What we learned

  • The Geospatial API and Semantics API make features possible that were hard (image semantic analysis) or impossible (Geospatial Terrain anchors) before
  • First-year experience students will use the website and the app later this semester (a few have already contributed issues to the website repository). I'm very excited about the feedback I'll get and the future of the project
  • There's still a ton of work ahead of us to enhance, develop, and refactor the source

What's next for Tree Walk Guide

  • Integrate fun facts about the walk route stops (the students are currently assembling the data); this will require a data format / metadata change
  • Enhance the features I could not fully tackle within the minimum viable product, such as the Semantics shader or UX / usability issues
  • Add new languages, for example Hmong, which is spoken by a prominent minority in the valley (Latinos are the largest minority, though)
  • Gather feedback from the students and the professors to cater to their exact needs



posted an update

"How scalable is the application? Can it be used in other regions, or can it be used by more than one type of audience?": Since the whole experience is driven by the easily modifiable Jekyll blog data (markup format) and a yaml file it'll be easy to update the route as the students will have modifications. For the same reason it can be repurposed to any other location's tree walk (given that the location has street view). Actually sustainability students wondered if the my technology would be applicable to state or national parks such as the Kings Canyon / Giant Sequoia National Park, or Yosemite


posted an update

Please check out the updated demo, which contains Semantics API-aided tree watering. My original vision was a semantics overlay (taking only the tree-area semantics) that highlights the trees; the user would look around and tap the tree areas (identified by the Semantics API) to earn the extra experience points. I overhauled this, partially due to performance: now a dialog opens and displays the tree watering as a "tree scan," so no live taps on the AR scene are needed. The delay of the scan and the dialog pose a natural rate limit against overuse of the watering. I also adjusted the augmented objects to better reflect the companion website: the tree object is now a red pine, and the posts are no longer down arrows but thinner, simple poles.
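
The "tree scan" decision can be sketched as follows. This is a hedged, simplified stand-in: the Semantics API really returns a per-pixel label image, which is modeled here as a plain `int` array, and the label value and threshold are assumptions of mine:

```java
// Simplified sketch of the "tree scan": given a per-pixel semantic label
// buffer for the camera snapshot, the scan succeeds when enough of the
// frame is labeled as tree. The scan delay plus this check act as a
// natural rate limiter on the watering feature.
final class TreeScan {
    static final int TREE = 6;              // placeholder label value, not the API's
    static final double MIN_FRACTION = 0.05; // assumed threshold: 5% of pixels

    static boolean foundTree(int[] semanticPixels) {
        int treeCount = 0;
        for (int label : semanticPixels) {
            if (label == TREE) treeCount++;  // tally tree-labeled pixels
        }
        return treeCount >= semanticPixels.length * MIN_FRACTION;
    }
}
```

A real implementation would read the label buffer from the Semantics API's semantic image for the current frame; only the thresholding idea is shown here.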


posted an update

I needed to redesign the Semantics API-related tree-watering feature. I'm releasing an update to the Play Store (or you can download the release APK from here if you prefer). Things to know: the "local debug" feature will only be visible to turn on in the settings if you have Developer Mode enabled on your Android device; this is to avoid interfering with regular usage. Besides that, I'll post a short video snippet about the Semantics API feature.


posted an update

Note that occlusion is intentionally turned off: in the campus environment there are many buildings, the distances are large, and buildings could easily mask out the next targets.


posted an update

I'll gather student feedback so the UX can be improved. Maybe I won't even apply the geo-fencing for the watering; instead, it could be free everywhere around campus. The students will also plan a modified route for kids and families, extra stops, and more.
