Inspiration

The purpose of Mea Kanu is to use technology to inspire the youth to mālama (care for) our endangered native species.

What it does

Our application tackles one of the most difficult challenges in computing: the identification of natural images. While challenging, this ubiquitous problem has practical applications across many disciplines. For example, accurate identification can help close gaps in botanical taxonomy, identify new species, and educate tourists and locals alike.

For this project we present a solution: a mobile application specific to plants here in the islands. We have assembled a Convolutional Neural Network with stacked residual blocks and a preprocessing stage for multiscale analysis. The result is a highly discriminative, state-of-the-art deep learning approach to fine-grained categorization.
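
As a rough illustration, the residual blocks we stack look something like the following. This is a minimal PyTorch sketch under assumed channel counts and normalization choices, not our exact layer configuration.

```python
import torch.nn as nn

class ResidualBlock(nn.Module):
    """A basic residual block: two 3x3 convolutions plus a skip connection."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x  # the skip connection carries the input forward
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + identity)  # add the input back in
```

Stacking blocks like this lets the network grow deep enough for fine-grained distinctions without the vanishing-gradient problems of a plain deep stack.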

How we built it

The architecture of the application follows a client-server model. A Python server accepts POST requests from our application. Upon receiving an image payload, the server saves the image to disk, then preprocesses it by breaking it up into 224 × 224 pixel boxes so the image is analyzed multiple times. The neural network uses these segments to determine which plant is most accurately represented by the picture in question. The server then sends back a JSON string containing the ID number of the predicted plant along with the probabilities for each plant.
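
A minimal sketch of that request flow, assuming Flask and Pillow; the endpoint name, upload path, class count, and the predict() stand-in are placeholders rather than our exact code:

```python
import io
import os

import numpy as np
from flask import Flask, request, jsonify
from PIL import Image

app = Flask(__name__)
CROP = 224       # side length of each square crop fed to the network
N_CLASSES = 100  # the app currently recognizes 100 plant species

def predict(crop):
    """Stand-in for the CNN forward pass; returns per-class probabilities."""
    return np.full(N_CLASSES, 1.0 / N_CLASSES)  # replace with real inference

@app.route("/classify", methods=["POST"])
def classify():
    # Save the incoming image payload, as described above.
    img = Image.open(io.BytesIO(request.files["image"].read())).convert("RGB")
    os.makedirs("uploads", exist_ok=True)
    img.save("uploads/latest.jpg")

    # Break the image into 224 x 224 boxes and analyze each one.
    w, h = img.size
    probs = []
    for top in range(0, max(h - CROP, 0) + 1, CROP):
        for left in range(0, max(w - CROP, 0) + 1, CROP):
            crop = np.asarray(img.crop((left, top, left + CROP, top + CROP)))
            probs.append(predict(crop))

    # Combine the per-crop predictions and report the most likely plant.
    avg = np.mean(probs, axis=0)
    return jsonify({"plant_id": int(np.argmax(avg)),
                    "probabilities": avg.tolist()})

if __name__ == "__main__":
    app.run(port=5000)
```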

The front end of the app is written in NativeScript so the same codebase can be built as either an iOS or an Android application. We chose to ship exclusively on Android, as the Apple App Store's review and development cycle was too long for the scope of this project.
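
During development, an endpoint like the one sketched above can be exercised without the app at all; this hypothetical Python test client assumes the URL and form field name from that sketch:

```python
import requests

# Hypothetical test client; the real client is the NativeScript app.
with open("sample_plant.jpg", "rb") as f:
    resp = requests.post("http://localhost:5000/classify",
                         files={"image": ("sample_plant.jpg", f, "image/jpeg")})
resp.raise_for_status()
result = resp.json()
print("Predicted plant ID:", result["plant_id"])
```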

Challenges we ran into

Each group member faced different challenges, as the project spanned many different disciplines within computer science. For the architecture of the application, several infrastructures were proposed. We first discussed hosting the code on Amazon Web Services using S3 buckets, API Gateway, and Lambda. That idea was eventually scrapped because the neural network was very large and rather costly to host over a period of time. We then adopted an HTTP client-server model with success.

Another major problem, one many groups faced, was the work-school-life-HACC balance, which was not easy to maintain. Several members of our group work full time and have children, while the other half are full-time students. Balancing this with building a technically heavy app proved to be the most difficult part of the process.

Accomplishments that we're proud of

We're excited about the outcome of our application. A fully functional machine learning phone application is not something any of us thought we could pull off at the beginning of the HACC; it seemed an unattainable goal. Being able to accomplish it is huge for all of us. We are also happy to have something tangible on the Google Play Store. For many of us this is our first published application, and having a product we can show off to friends and family is an accomplishment we can all be proud of.

What we learned

From a technical standpoint we learned a lot of different technologies. Those of us working on the server side learned a great deal about the different parts of Amazon Web Services, despite scrapping that direction. We also learned how to use Python to host a server and receive and respond to HTTP requests. The programmers working on the front end learned Vue and NativeScript and sharpened their JavaScript skill set. Those of us who worked on the machine learning side learned how to use computer vision, segmentation, and fine-grained categorization to better classify images and work through the challenges of natural image recognition.

From a less technical standpoint, this project really pushed us on time management, project management, and work delegation. The aforementioned work-life-school-HACC balance pushed many of us to our limits in terms of what we could contribute, and sacrifices were made, especially when it came to sleep. Overall we learned a great deal, both technically and about how we work as group members and on technical projects as a whole.

What's next for Mea Kanu

We have a lot of plans for the future of Mea Kanu. First and foremost, we would like to expand our application to cover as many species found in Hawaii as possible. The application currently recognizes only 100 plant species around the island, a small fraction of the total biodiversity found in the islands. We have already made efforts in this regard: one of our group members is in talks with the Lyon Arboretum to expand our application to focus primarily on plants found within their botanical garden.

The future of the architecture for this application is to host it on the cloud to allow faster access for users anywhere in the world. We hope to move the app and server code onto Amazon Web Services. We also hope to publish the application on the Apple App Store so that we can share this educational product with an even larger audience.
