Inspiration

Insects represent an estimated 95% of the world's animal species and are among the strongest indicators of regional biodiversity, yet they are rarely, if ever, considered in studies of a region's environmental health. We wanted to use the democratizing power of mobile technology to crowdsource insect sightings in a region, both to measure its biodiversity and to raise awareness of it.

What it does

The app has three components.

  1. Users can take a photo of an insect, or choose a recent photo from their gallery. The insect is identified and shown to the user, and the sighting is stored in a database.

  2. Users can see the most frequently seen insects in their area.

  3. Users can see endangered species in their region.

How we built it

We built the front end with Flutter and integrated various plugins to fit the needs of each part of our app.

We found a large dataset containing 492,239 images of different insect species, filtered it down to the 1,045 species most common in the region, and trained a classifier via transfer learning from EfficientNetB0 in TensorFlow. The model was loaded via TensorFlow Lite and integrated directly with Flutter.
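The filtering step boils down to counting how many images each species has and keeping only the most frequent ones. A minimal Python sketch of that idea (the label format and helper names are our own illustration, not the actual pipeline):

```python
from collections import Counter

def most_common_species(labels, k=1045):
    """Return the k species with the most images.

    `labels` is a list of species names, one entry per dataset image
    (a hypothetical stand-in for the real dataset's label column).
    """
    counts = Counter(labels)
    return [species for species, _ in counts.most_common(k)]

def filter_dataset(samples, keep):
    """Keep only (image_path, species) pairs whose species made the cut."""
    keep = set(keep)
    return [(path, species) for path, species in samples if species in keep]
```

The filtered pairs can then be fed to any standard transfer-learning loop on top of a frozen EfficientNetB0 base.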

We also leveraged Flutter's geolocator package and segmented GPS coordinates into local grid squares to determine regions of biodiversity.
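The segmentation idea can be sketched in a few lines: quantize latitude and longitude into a fixed-size grid, then aggregate sightings by grid cell. This is an illustrative Python sketch (our app does the equivalent in Dart, and the 0.05-degree cell size is an assumed value, roughly 5 km at the equator):

```python
import math

def grid_cell(lat, lon, cell_deg=0.05):
    """Map a GPS coordinate to a grid-square key.

    Sightings that fall into the same cell share a key, so "insects in
    your area" becomes a simple lookup by that key. math.floor handles
    negative coordinates correctly (it always rounds toward -infinity).
    """
    return (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
```

Nearby sightings map to the same key, so counting the most frequently seen insects in an area is just a group-by on the cell key.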

The rest of the UI was built with heavy assistance from Flutter Gallery.

Challenges we ran into

Dart, the language behind Flutter, recently became type-safe and null-safe, but many packages and much of the documentation have not caught up. This led to frequent conflicts between different package and Flutter versions in our code. Additionally, because we were not familiar with Dart or Flutter, we ran into many issues working with the front end.

Integrating the machine learning model was difficult as well: we had to resize and preprocess images in Dart and Flutter, which are not designed for ML preprocessing, before handing them to the TensorFlow Lite model. As a result, we needed to do heavy bytestream processing on the images.
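The core of that bytestream work is repacking raw pixel bytes into the normalized float32 tensor the model expects. A minimal Python sketch of the repacking step (our app does this in Dart; the 224x224 input matches EfficientNetB0's default, but the exact normalization our model uses is an assumption here):

```python
import struct

def rgb_bytes_to_float32(pixels, width=224, height=224):
    """Repack raw RGB bytes into a float32 buffer for a TFLite model.

    `pixels` is a bytes object of length width*height*3 (R, G, B per
    pixel). Each byte is scaled from [0, 255] to [0.0, 1.0] and packed
    as little-endian float32, matching a [1, height, width, 3] tensor.
    """
    assert len(pixels) == width * height * 3, "unexpected buffer size"
    floats = [b / 255.0 for b in pixels]
    return struct.pack("<%df" % len(floats), *floats)
```

The resulting buffer is 4x the size of the input (one float32 per byte), which is why doing this per frame in Dart required care.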

Accomplishments that we're proud of

None of our team members had any front-end experience, and one had only taken an introductory CS course and had never built anything before, yet we managed to get a working app running with ML integration.

What we learned

We learned an entirely new language, Dart, and an entirely new framework, Flutter, alongside how to integrate machine learning models onto mobile platforms.

What's next for ShutterBug

We collected a larger overall dataset of over 490,000 images, but it proved too difficult and time-consuming to train within the 24-hour constraint. With more time, we could train on the entire dataset for a stronger insect identification model. We also initially wanted to implement a biodiversity health indicator based on insect counts; that proved too much to integrate into our application in time, but we have the algorithm worked out and would like to implement it. We could see this being helpful in connecting local communities with the biodiversity around them.
