Inspiration
As a student pursuing my studies in Ila Orangun, a rural village in Osun State, Nigeria, I'm constantly surrounded by farmlands and farmers. I see firsthand how many of them struggle with unexplained crop damage: leaves dry up or change color, fruits rot before harvest, and most farmers don't know why. They simply watch their crops die, relying on hearsay and word-of-mouth advice, which results in massive losses and wasted effort. It struck me how much preventable damage happens simply because they lack accessible information or early-diagnosis tools.

Statistics show that Nigeria loses tens of thousands of hectares of farmland each year to pests and diseases, with smallholder farmers, who make up over 80% of the agricultural workforce, bearing the brunt. Many farmers mistake disease symptoms for normal plant changes, which leads to delayed or ineffective responses and further crop loss. This challenge affects both subsistence farmers trying to feed their families and commercial farmers relying on their harvest for a livelihood; it is a real problem impacting food security and economic stability in rural communities.

When I brought this up with my teammates, we realized we all shared the same vision: to use our tech skills to solve real, everyday problems around us. We wanted to build something that could actually work in local, low-resource settings. That's how AgroScan was born: a lightweight, AI-powered tool that identifies plant diseases from leaf images, works offline or in low-bandwidth areas, and gives feedback in a format any farmer can understand.
Our mission was clear: bring practical machine learning into the hands of underserved farmers, one leaf at a time.
What it does
AgroScan detects plant diseases from images of crop leaves using a deep learning model and displays the predicted disease along with a confidence score. The system is lightweight: it currently works through WhatsApp, can be embedded into edge devices like the Raspberry Pi, and delivers real-time alerts via WhatsApp or SMS.
How we built it
We developed a convolutional neural network (CNN) trained on over 12,000 local crop leaf images spanning 13 disease classes. The architecture combines Conv2D, MaxPooling, and Dropout layers, and achieved approximately 98% accuracy after 50 epochs. Emphasizing local data helped ensure the model's relevance to regional agricultural conditions. To validate performance under real-world conditions, we intend to conduct pilot testing that measures response time and resource usage.
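The write-up names the layer types but not the exact configuration, so the Keras sketch below is illustrative only: the filter counts, input size, dense width, and dropout rate are assumptions, not our production model.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 13  # disease classes in our dataset


def build_model(input_shape=(128, 128, 3)):
    """Small CNN sketch: Conv2D -> MaxPooling blocks plus Dropout.

    Filter counts and input size here are illustrative assumptions.
    """
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Rescaling(1.0 / 255),            # normalize pixel values to [0, 1]
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),                    # regularization against overfitting
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])


model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Dropout after the dense layer is what keeps a model of this size from memorizing a 12,000-image dataset over 50 epochs.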
Our backend is built with FastAPI, while Streamlit provides a clean and user-friendly interface for demo and testing. We initiated WhatsApp integration using Twilio Sandbox, and the backend was deployed on Railway to make it publicly accessible.
Challenges we ran into
- TensorFlow Compatibility & Deployment Bottlenecks: Our initial deployment on Render defaulted to Python 3.13, which TensorFlow does not yet support, so the build failed.
Mitigation Strategy: We pivoted to Railway for backend hosting on Python 3.12 and were able to generate a public API URL for end-to-end integration.
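One common way to pin the interpreter on Heroku-convention builders (which Railway's Nixpacks builder follows) is a `runtime.txt` file at the project root; the exact mechanism depends on the builder, so treat this as an assumed example rather than Railway-specific documentation:

```text
python-3.12
```

Pinning the version in the repo keeps the host from silently upgrading to a Python release that TensorFlow cannot run on.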
- Iterative Testing Limitations: Due to resource constraints, live user testing was limited. Nevertheless, we ran internal simulations and tracked usability bugs.
Iterative Testing Plan: As we expand, we'll collect feedback from actual farmers using the system. This data will guide model fine-tuning, improve UI/UX, and reduce fallback cases. We aim to track metrics like classification accuracy, response latency, and user satisfaction over time.
Accomplishments We're Proud Of
High Accuracy with Limited Resources: We trained a custom image classification model on 13 plant disease classes using over 12,000 images. Despite working with limited compute power, we achieved around 98% accuracy after 50 training epochs.
Working End-to-End System: During local testing and demo sessions, both our backend (FastAPI) and frontend (Streamlit) apps worked smoothly. Users could upload plant images and get real-time disease predictions along with confidence scores.
Twilio Integration for Messaging: We integrated Twilio’s SMS and WhatsApp sandbox tools. SMS notifications work as expected on registered numbers, and WhatsApp messaging also works: a farmer can send or upload a leaf image on WhatsApp and receive the diagnosis in reply.
What we learned
We deepened our understanding of deploying lightweight machine learning models in resource-constrained environments, particularly in rural Africa.
We gained hands-on experience deploying a FastAPI backend and Streamlit UI, ensuring smooth integration with our model pipeline.
We discovered the importance of providing confidence scores alongside predictions to help users make better-informed decisions.
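As a concrete illustration of why confidence scores matter, here is a sketch of the kind of thresholding logic we have in mind; the threshold value, function name, and fallback wording are assumptions, not the shipped behavior.

```python
def format_diagnosis(disease: str, confidence: float,
                     threshold: float = 0.70) -> str:
    """Turn a model prediction into a farmer-facing message.

    Below the (assumed) 70% threshold we fall back to a cautious reply
    instead of stating a possibly wrong diagnosis.
    """
    if confidence >= threshold:
        return f"Likely diagnosis: {disease} ({confidence:.0%} confidence)."
    return ("We are not confident enough to name a disease from this photo. "
            "Please retake the picture in good light, close to the leaf.")


print(format_diagnosis("Cassava mosaic disease", 0.93))
print(format_diagnosis("Cassava mosaic disease", 0.41))
```

Surfacing the score (or refusing to answer below a cutoff) is what lets a farmer judge whether to act immediately or seek a second opinion.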
We explored WhatsApp integration via Twilio and learned how to simulate chatbot behavior that could scale in real-world farming use cases.
We realized how limited our exposure to deployment was before this challenge. Most of our previous work ended at model training, but through this process we self-taught critical MLOps concepts using YouTube, official documentation, and AI tools.
Overall, we saw firsthand how working within constraints can drive meaningful innovation and learning.
What's next for AgroScan
**Integrate LLM-powered treatment recommendations**
We're planning to connect AgroScan’s disease predictions with a multilingual treatment engine powered by open-source LLMs like DeepSeek, Mistral, or Flan-T5. This will enable personalized, crop-specific advice in major African languages.
**Deploy on solar-powered Raspberry Pi edge devices**
To bring AgroScan closer to the farm, we plan to install it on solar-powered Raspberry Pi devices equipped with camera modules. These offline-first edge devices will detect diseases directly in the field and notify farmers in real time.
**Launch an offline-first mobile app**
We're building a lightweight Android app that can function offline and sync data once connectivity is available, making it ideal for farmers in low-bandwidth rural areas.
**Expand dataset and retrain the model**
We aim to gather localized plant image data, retrain AgroScan to recognize more diseases across different crops, and improve its overall accuracy in African contexts.