Inspiration
We were inspired to take on this challenge both for personal reasons and to support HudsonAlpha's initiatives to create a more robust and sustainable agriculture system. One of our team members grew up on his family's farm and has witnessed firsthand the hard work and resilience that go into farming, as well as the devastating effects that pests, disease, and natural disasters can have.
Prior to the competition, we researched HudsonAlpha's contribution to sustainable agriculture and were further inspired after reading about HudsonAlpha's support of AgTech startups. We noticed that multiple AgTech Accelerator participants were addressing food waste and believe that our product fits with that theme while still being differentiated in its use cases.
What it does
Whether you are an agriculturalist or gardener worried about a possible disease outbreak in your crop, or a government entity wanting to ensure a healthy harvest, our app is for you! Farmers can snap a picture or upload an image of a leaf they suspect is diseased and get real-time classification and analysis of their plant's health. This information is then geocoded and uploaded to our dataset, which powers visualizations and time-series analysis of disease spread.
Additionally, we have added a forum feature to engage our members in community discussion, where they can ask questions and get answers from peers and experts.
How we built it
YOLO is a real-time object detection model. For our project, we used YOLOv11, which has 2.3 million parameters, and trained it on a custom dataset containing images of plants exhibiting various bacterial infections and mineral deficiencies. We fine-tuned key hyperparameters, such as learning rate, weight decay, and regularization strength, and experimented with several optimizers to improve performance. Below are the results we achieved.
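The fine-tuning workflow above can be sketched with the Ultralytics API. The dataset config path and the specific hyperparameter values below are illustrative placeholders, not the exact settings we shipped:

```python
# Sketch of our YOLOv11 fine-tuning setup. The dataset path and the
# hyperparameter values below are illustrative, not our final settings.
TRAIN_ARGS = {
    "data": "plant_disease.yaml",  # hypothetical dataset config (images + labels)
    "epochs": 100,
    "imgsz": 640,
    "lr0": 1e-3,            # initial learning rate
    "weight_decay": 5e-4,   # L2 regularization strength
    "optimizer": "AdamW",   # we also compared SGD and Adam
}

def main() -> None:
    # Third-party import kept local so TRAIN_ARGS can be inspected
    # without Ultralytics installed.
    from ultralytics import YOLO

    model = YOLO("yolo11n.pt")   # pretrained YOLOv11 checkpoint
    model.train(**TRAIN_ARGS)    # fine-tune on the custom dataset
    metrics = model.val()        # precision/recall curves, mAP, etc.
    print(metrics)

if __name__ == "__main__":
    main()
```

Keeping the training arguments in one dictionary made it easy to sweep learning rate, weight decay, and optimizer choice between runs.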
We built the front end as a portable Unity app, integrating several plug-ins:
- Voltstro's Unity web browser to seamlessly run Tableau for data visualization
- Mob Sakai's UI Soft Mask for smooth and natural-looking UI masking
- TriInspector for enhanced Unity Inspector customization and better component management
- Yasirkula's Unity Simple Browser for lightweight in-app web browsing and external content access
REMBG was used to remove backgrounds from leaf images prior to analysis.
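This preprocessing step can be sketched with the `rembg` package. The helper names and the supported-extension check are our own illustrative additions:

```python
from pathlib import Path

# Upload formats we accept before preprocessing (illustrative choice).
SUPPORTED = {".jpg", ".jpeg", ".png"}

def is_supported(filename: str) -> bool:
    """Quick check before handing an upload to the background remover."""
    return Path(filename).suffix.lower() in SUPPORTED

def strip_background(image_bytes: bytes) -> bytes:
    """Return the leaf isolated on a transparent background.

    Third-party import kept local: `rembg` pulls in an ONNX runtime.
    """
    from rembg import remove
    return remove(image_bytes)
```

Removing the background before inference keeps the model focused on the leaf rather than soil, hands, or other field clutter.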
Tableau was used to generate geographic visualizations from the data collected by the app, allowing users to analyze and interpret spatial trends effectively. This was embedded into our app's map feature.
Challenges we ran into
Previously, we had the capability to train a machine learning model, save its trained parameters, and evaluate it on new data. However, we had not deployed a fully integrated solution with the Unity engine. The most significant challenge we encountered was integrating the trained model with the user interface design.
Accomplishments that we're proud of
- Achieving a precision of 0.95 on our model's precision-confidence curve
- Creating a dynamic front end that utilizes Unity's 3D rendering capabilities as well as its UI engine
- Completing most of our work on time and not having to rush to finish
- Implementing a balanced approach to tackling the prompt that utilized each team member's strengths
What we learned
- Integrating a web engine inside Unity to render Tableau data in real time
- Creating buffers in Tableau to simulate a radius and measuring distances between points on a map
- Using WebSockets to seamlessly connect the backend we developed with the frontend
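The WebSocket bridge between our backend and the Unity front end can be sketched with the third-party `websockets` package. The payload field names and the canned result are illustrative assumptions, not our exact protocol:

```python
import asyncio
import json

def encode_result(label: str, confidence: float, lat: float, lon: float) -> str:
    """Serialize a classification result into the JSON payload the Unity
    front end reads (field names are our own illustrative choice)."""
    return json.dumps({
        "label": label,
        "confidence": round(confidence, 3),
        "lat": lat,
        "lon": lon,
    })

async def serve(host: str = "localhost", port: int = 8765) -> None:
    # Third-party import kept local: `websockets` is an extra dependency.
    import websockets

    async def handler(ws):
        async for _message in ws:
            # In the real app the received image bytes would be run through
            # the YOLO model; here we send back a canned result.
            await ws.send(encode_result("bacterial_spot", 0.95, 34.73, -86.59))

    async with websockets.serve(handler, host, port):
        await asyncio.Future()  # run until cancelled

if __name__ == "__main__":
    asyncio.run(serve())
```

Pushing results over a persistent socket lets the Unity client update its map view as soon as a classification finishes, without polling.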
What's next for AgriVision
After conducting further testing and ensuring that we can transmit image data from our app to Tableau, an essential step before collecting data through the app, we would like to start gathering feedback from our targeted end users. Additionally, we aim to enhance our reporting by developing richer summary metrics on the data. We would also like to incorporate a fine-tuned LLM to generate more in-depth and actionable feedback for users.
If we were to get funding, we would love to explore using drones to survey land and capture images of crops showing signs of microbial infection. This would let us collect data much more quickly, which would also benefit our secondary target customers: county extension offices and state and federal agencies that specialize in crop disease management and prevention.