Hawaiʻi’s nature is beautiful and powerful. Unfortunately, almost one third of native Hawaiian plants are endangered or threatened to some degree. Motivated by the effort to protect native Hawaiian plants, we chose to take on the Plant Identification challenge proposed by KUPU and the Department of Land & Natural Resources (DLNR).

What we built

We built a mobile application that empowers citizen-scientists and official researchers to help protect the nature of the islands for the entire community. Users take part in this effort by capturing photos of plants and uploading them to our server. Our app then returns the top three genus predictions to the user. The user can select which of the three results they believe the plant to be; enter a common, scientific, or Hawaiian plant name; or select "I don't know" if they're unsure. The State of Hawaiʻi and the DLNR will have access to the uploaded images and corresponding metadata, so even if users are unsure about the classification, an expert can still classify the image later. With the location information we collect (when permitted by the user), the State of Hawaiʻi and the DLNR are better equipped to locate and protect native plants, and possibly to eradicate invasive ones.

The app also educates the user about which plants are native, non-native, invasive, endangered, threatened, a candidate for listing, a species of concern, or not listed by the state. In addition, users are provided with the genus, species, common name, Hawaiian name, scientific name, description, story, and uses for the plant, along with photos of the plant that they or other users have taken. The story is a fascinating addition: through it, users can learn more about Native Hawaiian culture and the backstory of each plant.

How we built it

Behind the scenes, the app is powered by machine learning. Our team uses modern deep learning techniques and large open-source datasets to achieve highly accurate classification of a wide range of flora found throughout the Hawaiian Islands.

To develop a neural network capable of classifying plants in images, we first had to collect data. We gathered images from multiple sources, including, but not limited to, Wikimedia Commons, Starr Environmental, Flickr, Instagram, and Google Images, filtered by license. We grouped images by plant genus, since Matt Keir informed us that even professionals may have difficulty identifying species when plants are not flowering. We used rvest, Selenium, and a multitude of Python libraries to scrape images. Our raw dataset contained many non-plant images, which we filtered manually to obtain a relatively clean final dataset.
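The license filtering and genus grouping above can be sketched in a few lines of Python. Note that the record fields (`genus`, `url`, `license`) and the set of allowed licenses are illustrative assumptions, not our scraper's actual schema:

```python
# Sketch of the post-scrape cleanup step: keep only records with a
# permissive license, then bucket them per genus for training.
from collections import defaultdict

# Assumed set of acceptable licenses; the real filter followed each
# source site's license metadata.
ALLOWED_LICENSES = {"cc0", "cc-by", "cc-by-sa", "public-domain"}

def group_by_genus(records):
    """Filter scraped image records by license and group them by genus."""
    buckets = defaultdict(list)
    for rec in records:
        if rec.get("license", "").lower() in ALLOWED_LICENSES:
            buckets[rec["genus"]].append(rec["url"])
    return dict(buckets)

records = [
    {"genus": "Hibiscus", "url": "a.jpg", "license": "CC-BY"},
    {"genus": "Hibiscus", "url": "b.jpg", "license": "all-rights-reserved"},
    {"genus": "Metrosideros", "url": "c.jpg", "license": "CC0"},
]
print(group_by_genus(records))
# {'Hibiscus': ['a.jpg'], 'Metrosideros': ['c.jpg']}
```

The non-plant images mentioned above would still survive this step, which is why a manual pass was needed afterward.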

After collecting images and building a baseline neural network model, our team leveraged modern deep learning techniques such as data augmentation and transfer learning to improve the accuracy of our models. We investigated multiple state-of-the-art lightweight neural network architectures, including MobileNet, MobileNet V2, and NASNet Mobile, and found that MobileNet gave the best top-3 accuracy. Our final model achieves a respectable 73.76% top-1 accuracy, 88.74% top-3 accuracy, and 92.51% top-5 accuracy.
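The top-k figures count a prediction as correct whenever the true genus appears among the model's k highest-scoring classes. A minimal pure-Python sketch of that metric (equivalent in spirit to Keras' `top_k_categorical_accuracy`, with made-up scores for illustration):

```python
# Compute top-k accuracy: a sample counts as a hit if its true label is
# among the k classes with the highest predicted scores.
def top_k_accuracy(scores, labels, k):
    """scores: per-sample lists of class scores; labels: true class indices."""
    hits = 0
    for row, label in zip(scores, labels):
        # Indices of the k highest-scoring classes for this sample.
        top_k = sorted(range(len(row)), key=lambda i: row[i], reverse=True)[:k]
        hits += label in top_k
    return hits / len(labels)

# Toy example: 3 samples, 3 classes (not our real model outputs).
scores = [[0.1, 0.7, 0.2], [0.5, 0.3, 0.2], [0.2, 0.2, 0.6]]
labels = [2, 0, 1]
print(top_k_accuracy(scores, labels, 1))  # only the second sample hits at top-1
print(top_k_accuracy(scores, labels, 2))
```

This is why top-3 accuracy (88.74%) is the number most relevant to the app's user experience: the user is shown three candidate genera and picks among them.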

The mobile application itself is built with Vue.js and the Quasar Framework on the frontend, powered by Node.js on the backend, where TensorFlow.js runs the machine learning model that had been converted into the TensorFlow.js Layers format from the original Python Keras model. User images and GPS coordinates are uploaded to an Amazon Web Services (AWS) S3 bucket and then recorded in an AWS Relational Database Service (RDS) PostgreSQL instance, while the Node.js server is deployed to an AWS EC2 instance. The database was seeded with plant information from Bishop Museum.
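To illustrate the upload path, here is a hedged sketch of the per-upload bookkeeping: each image is stored under a unique S3 object key, and that key plus the GPS metadata is written as a row in PostgreSQL. The key layout and column names are assumptions for illustration, not our actual schema:

```python
# Build the S3 object key and the database row recorded for one upload.
import uuid
import datetime

def make_upload_record(user_id, lat, lon):
    """Return a unique S3 key and the row inserted into an uploads table."""
    key = f"uploads/{user_id}/{uuid.uuid4().hex}.jpg"  # unique per image
    row = {
        "s3_key": key,
        "user_id": user_id,
        "latitude": lat,    # None when the user denies location access
        "longitude": lon,
        "uploaded_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    return key, row

key, row = make_upload_record("user-42", 21.3069, -157.8583)
print(key, row["latitude"])
```

Keeping the image in S3 and only its key in RDS keeps the database small while still letting the DLNR join images to their locations later.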

Challenges we ran into

According to the DLNR, Hawaiʻi has “approximately 1,400 vascular plant taxa [...] native to the State of Hawaiʻi, and nearly 90 percent of these are found nowhere else in the world.” Because of this, for the native and invasive plant species we were interested in, it was difficult to acquire many images per plant, let alone a relatively balanced number of images across plants. Early on, we decided to work with the images we were able to acquire, narrowing down to 42 plant species on which to train the machine learning model. In hindsight, we could have further increased our reported accuracy by only including plants with more than ~50 images.

As for app development, the main challenge was that much of our functionality relied heavily on the native device camera, which runs neither in the browser nor in the iOS simulator. We had to iterate on the main feature by deploying to a physical iOS device any time we wanted to test the Plant Identification Results page, a slow process with a build time of around 2 minutes. We also ran into a few issues installing tfjs-node directly on the EC2 server, which we only got stable in the final hour! Lastly, we found that uploading raw images was extremely slow. We fixed this by resizing images before uploading, which led to much faster upload times with no detrimental effect on accuracy.
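The resize-before-upload fix can be sketched with Pillow. The 1024 px maximum edge and JPEG quality below are assumptions for illustration, not the app's actual settings:

```python
# Downscale an image so its longest side fits within a size cap before
# uploading, preserving aspect ratio.
from io import BytesIO
from PIL import Image

MAX_EDGE = 1024  # assumed cap; small enough to upload fast, large enough to classify

def downscale_for_upload(data: bytes, max_edge: int = MAX_EDGE) -> bytes:
    """Resize image bytes so the longest side is at most max_edge pixels."""
    img = Image.open(BytesIO(data))
    img.thumbnail((max_edge, max_edge))  # in-place, keeps aspect ratio
    out = BytesIO()
    img.save(out, format="JPEG", quality=85)
    return out.getvalue()

# Demo with a synthetic 4000x3000 image standing in for a phone photo.
raw = BytesIO()
Image.new("RGB", (4000, 3000)).save(raw, format="JPEG")
small = downscale_for_upload(raw.getvalue())
w, h = Image.open(BytesIO(small)).size
print(w, h)  # longest side is now at most 1024
```

Since the classifier's input resolution is far below a phone camera's native resolution anyway, this downscaling costs nothing in accuracy.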

Accomplishments that we’re proud of

It cannot be overstated how much work went into this project. From the app framework to neural network construction, it was an unprecedented undertaking for a small group on a tight deadline. Ours is not the only plant identification app on the market; however, all the apps that we tested performed poorly on native Hawaiian plants. Thus, one of our main goals was obtaining good accuracy on native Hawaiian plants. We are happy to say that the app achieves 73.76% top-1 accuracy, 88.74% top-3 accuracy, and 92.51% top-5 accuracy on our dataset, while Pl@ntNet did not correctly identify any native Hawaiʻi plant genus that we tested it on. In addition to the latest machine learning techniques, our team leveraged cutting-edge reactive development tools and a highly scalable infrastructure to help future-proof deployment across systems. Building the client and server with the latest tools, while implementing a unique database and prediction system, was only possible because of our team’s hard work and dedication to the mission of this challenge.

What we learned

We learned that, even with a minimal feature set, building and deploying a camera-based application powered by machine learning takes a lot of trial and error. Nevertheless, we are proud to have a finished product that we can use out in the field or during a hike on our beloved islands of Hawaiʻi.

Future enhancements

While our main goals have been met, we believe there is always room for expansion. Specifically, user accounts would be a useful feature, as a user's history could aid in building the database and improving future predictions. We would also like to add more plants to the database, including plants not mentioned in the hackathon challenge. One of the problems with acquiring more data, however, is filtering out non-plant images; in the future, we could again use machine learning to filter these out before ingesting images into our database. The app currently requires an Internet connection to upload to the plant database and make predictions, but we would like to explore running predictions fully within the app, without Internet. This would enable further use in the field, where an Internet connection is not always reliable. We would also like to look into adding more Native Hawaiian-related information and stories connected to the plants.

Other possible features include: a map showing where all images were taken; the ability for verified officials or researchers to add new plants and identify unknown images; information on when each plant flowers and whether it is currently flowering; hiking information and the plants found along each hike; location-based alerts (“Hibiscus nearby!”); and a leaderboard and recognition system to encourage users to keep locating and photographing plants, perhaps awarding more points for more endangered plants. The leaderboard and recognition system could be sponsored by local companies willing to donate weekly or monthly prizes for a good cause: helping to protect the natural environment of Hawaiʻi.


We would like to thank the sponsors and organizers of the Hawaiʻi Annual Code Challenge, the HACC challenge hosts, and the State of Hawaiʻi for making this hackathon possible. We are also thankful to Matt Keir and Nate Gyotoku for engaging us with insightful conversation and providing data sources. Furthermore, we are entirely grateful to Beth Kuch and Jason Sewell for their unending support via email and the Slack channels.
