Gallery: the main screen; an email form to suggest new plants; the screen shown after a user submits an image; the warning screen; two parts of the information screen; and sample analyses for Lamb's Quarters, Miner's Lettuce, Morels, Rose Hips, Cattails, Dandelions, Amaranth, and Elderberries.
I was walking around in my Mom’s garden when I realized that a lot of nature goes to waste simply because people don’t know what is and isn’t edible. An app that identifies specific types of plants could help with that: it could encourage others to explore the outdoors and try collecting their own food, and it could be used to teach kids more about nature, encouraging them to think about protecting the environment.
What it does
A disclaimer appears each time you open the app, making sure you understand that the information is still incomplete and may not be fully reliable. Afterwards, you’re taken to the main screen, where you can take a picture by tapping a button. If camera access is not enabled, tapping the camera button will prompt you to enable it.
Tapping the button opens your camera app so you can take a picture. Once you confirm it, the image is shown on your phone and normally*, the app would determine what type of plant you photographed; you would then get a short message and a tip to tap the analysis icon in the menu. This opens a scrollable view with the name of the plant and some fun trivia, useful for children who want to go out and harvest their own plants or for adults who want to cook what they found.
* This feature went unimplemented due to complications with the Google AutoML Vision library. I expected to use TensorFlow Lite for my model, but once the model compiled, I couldn't download it for whatever reason. I do have the model made; I just didn't have enough time to adapt to the sudden change. As a result, some images in the gallery contain samples of what the model would do, as proof that this was the original concept and that it came very close to making it into the final design.
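The intended classification step can be sketched in plain Java. Once loaded with TensorFlow Lite, the model would return a confidence score per label, and the app would show the top label only if it clears a threshold. The scores, threshold, and class name below are illustrative assumptions, not the app's actual code or model output.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PlantClassifier {
    // Minimum confidence before showing a plant name to the user
    // (illustrative cutoff; the real model's threshold may differ).
    static final double CONFIDENCE_THRESHOLD = 0.90;

    // Given per-label confidence scores from the model, return the
    // highest-scoring label, or null if nothing is confident enough.
    static String bestLabel(Map<String, Double> scores) {
        String best = null;
        double bestScore = 0.0;
        for (Map.Entry<String, Double> e : scores.entrySet()) {
            if (e.getValue() > bestScore) {
                best = e.getKey();
                bestScore = e.getValue();
            }
        }
        return bestScore >= CONFIDENCE_THRESHOLD ? best : null;
    }

    public static void main(String[] args) {
        // Hypothetical scores for one photo.
        Map<String, Double> scores = new LinkedHashMap<>();
        scores.put("Lamb's Quarters", 0.93);
        scores.put("Miner's Lettuce", 0.04);
        scores.put("Rose Hips", 0.02);
        scores.put("Dead Nettles", 0.01);
        System.out.println(bestLabel(scores)); // prints Lamb's Quarters
    }
}
```

Returning null when no label is confident enough lets the app fall back to a "plant not recognized" message instead of guessing.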
So far, it can identify with at least 90% accuracy:
- Lamb’s Quarters
- Miner’s Lettuce
- Rose Hips
- Dead Nettles
There is also an email button if you want to suggest your own plant: tap it, enter a plant name, and you'll be taken to an email form.
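On Android, a button like this is typically implemented by launching an Intent with a mailto: URI. The snippet below sketches just the URI construction in plain Java; the recipient address and subject format are made-up placeholders, not the app's real values.

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class SuggestPlantEmail {
    // Build a mailto: URI whose subject carries the suggested plant name.
    // The recipient address is a placeholder, not the app's real inbox.
    static String mailtoUri(String plantName) {
        try {
            String subject = URLEncoder.encode("Plant suggestion: " + plantName, "UTF-8");
            return "mailto:suggestions@example.com?subject=" + subject;
        } catch (UnsupportedEncodingException e) {
            throw new IllegalStateException(e); // UTF-8 is always supported
        }
    }

    public static void main(String[] args) {
        System.out.println(mailtoUri("Chickweed"));
        // prints mailto:suggestions@example.com?subject=Plant+suggestion%3A+Chickweed
    }
}
```

In the app itself, this string would be passed to something like `startActivity(new Intent(Intent.ACTION_SENDTO, Uri.parse(uri)))` so the user's email client opens with the form pre-filled.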
How I built it
I used Google AutoML Vision to create my machine learning model and Android Studio to build the app.
Challenges I ran into
Finding all the necessary images and doing the research on my plants took a lot longer than I expected. The bugs with AutoML also caught me off guard, forcing me to adapt my presentation. If you look at my code (specifically PlantComparer), you'll notice I was trying to use the TensorFlow Lite Task library in this app.
Accomplishments that I’m proud of
I’m proud of finding all the images needed to train and test my machine learning model, and of doing the research necessary to give users an idea of what each plant offers.
What I learned
I learned how machine learning works, along with what bounties nature offers that we ignore simply because we’re rarely encouraged to go outside and collect our own food.
What's next for Foratify
Once I add more plants and fix the glaring bugs, I’m thinking about publishing it to the Google Play Store.
https://docs.google.com/document/d/13KIkg5v0Y6ZzS5Skg7qCQ2nbEfkK-UQmCy2jfpwttt8/edit (For images)
https://docs.google.com/document/d/1z7_d72GvxqhweOJ5EiGAALVpU4bNF9NmSboRQHamSCY/edit#heading=h.mmditquxut5s (For my research)