Inspiration

It started with a WhatsApp message.

My relative who farms wheat in UP sent a photo of his crop to our family group chat asking, "What's wrong with my plants?" Nobody knew. He ended up losing almost half that field because, by the time he found someone who could help, it was too late.

That stuck with me. Here we are building AI tools for people who already have everything, while a farmer with a smartphone can't get a simple answer to "what's wrong with my crop?"

That's where CropSense AI started — not from a business idea or a hackathon brief, but from one unanswered WhatsApp message.


How the Idea Grew

At first I thought — this must already exist. Surely someone built this. But what I found were expensive enterprise tools, research papers nobody could use, or apps that required stable internet and a decent phone.

The farmers I was thinking about use basic Android phones. Sometimes 2G. They speak Hindi or Marathi, not English. They don't have time to learn a complicated app. They just need to know: what is wrong with my crop and what do I buy to fix it?

So I decided to build the simplest possible version of that.


Building It

I'll be honest: I had never built a full-stack app before this hackathon. I knew some React, but Next.js was new to me. Supabase was new. Connecting an AI vision API was definitely new.

I started by just getting one thing to work: upload an image, get a diagnosis back. That took longer than I expected, because finding a free vision AI API that actually works is surprisingly hard. Gemini kept giving quota errors. I finally got Groq working with Llama 4 Scout, and that was the moment the whole thing felt real.
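
For context, here's roughly what that first working call looked like. This is a sketch of the request body for Groq's OpenAI-compatible chat completions endpoint; treat the model ID and prompt wording as assumptions from the time of writing (Groq has renamed models before), not a guaranteed API surface:

```typescript
// Sketch of the request body sent to Groq's OpenAI-compatible
// chat completions endpoint. Model name and prompt text are
// illustrative; verify the current model ID before using.

interface DiagnosisRequest {
  model: string;
  messages: {
    role: "system" | "user";
    content:
      | string
      | (
          | { type: "text"; text: string }
          | { type: "image_url"; image_url: { url: string } }
        )[];
  }[];
  temperature: number;
}

const SYSTEM_PROMPT = [
  "You are an agricultural expert helping Indian smallholder farmers.",
  "Identify the disease in the crop photo, then give a concrete remedy:",
  "product name, dosage per litre, where to buy it (e.g. Krishi Seva Kendra),",
  "approximate price in INR, and how often to apply.",
  "Reply with plain JSON only, no markdown.",
].join(" ");

function buildDiagnosisRequest(imageBase64: string): DiagnosisRequest {
  return {
    model: "meta-llama/llama-4-scout-17b-16e-instruct", // assumed current name
    messages: [
      { role: "system", content: SYSTEM_PROMPT },
      {
        role: "user",
        content: [
          {
            type: "text",
            text: "What is wrong with this crop and what should I buy to fix it?",
          },
          {
            type: "image_url",
            image_url: { url: `data:image/jpeg;base64,${imageBase64}` },
          },
        ],
      },
    ],
    temperature: 0.2, // low temperature keeps diagnoses consistent
  };
}
```

The payload is then POSTed to Groq's chat completions endpoint with an `Authorization: Bearer` header, in the standard OpenAI request shape.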

From there I kept asking — what does a farmer actually need? Not "disease detected." That's useless. They need: what product, how much, where to buy, how often.

So I kept refining the AI prompt until it gave answers like "Spray Mancozeb 75% WP at 2.5g per litre, available at your Krishi Seva Kendra for around ₹120." That's the kind of answer that actually helps someone.

Then I added a Hindi toggle because most farmers I know are more comfortable in Hindi. Then text-to-speech, because some of them struggle with reading. Every feature came from thinking about one real person trying to use this.


Challenges

The technical bugs were honestly the smaller problem. The bigger challenge was constantly asking myself — would a farmer in Bihar actually use this? Is this too complicated? Is the text too small on a cheap phone?

On the technical side:

  • Groq's model name changed and broke everything at 2am
  • Next.js routing got confusing when I accidentally put the page file inside the API folder
  • Hindi text-to-speech kept cutting off mid-sentence (Chrome has a long-standing bug with long utterances; I fixed it with a pause/resume trick)
  • Getting the AI to return clean JSON without markdown formatting took way more prompt engineering than expected
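
The speech cutoff fix deserves a sketch, since it bit me for hours. Chrome's `speechSynthesis` silently goes quiet on long utterances; the usual workaround is to split the text into short sentence chunks and periodically nudge the engine with `pause()`/`resume()`. The chunker below is the pure part (splitting on the Hindi danda "।" as well as Western punctuation); the browser usage is a commented sketch, since it can't run outside a browser:

```typescript
// Split text into short chunks at sentence boundaries so each
// SpeechSynthesisUtterance stays under Chrome's silent-cutoff length.
// Handles both the Hindi danda (।) and Western sentence punctuation.
function chunkForSpeech(text: string, maxLen = 160): string[] {
  const sentences = text.match(/[^।.!?]+[।.!?]?/g) ?? [];
  const chunks: string[] = [];
  let current = "";
  for (const raw of sentences) {
    const sentence = raw.trim();
    if (!sentence) continue;
    if (current && (current + " " + sentence).length > maxLen) {
      chunks.push(current);
      current = sentence;
    } else {
      current = current ? current + " " + sentence : sentence;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}

// Browser-only sketch (not runnable in Node):
// function speak(text: string, lang = "hi-IN") {
//   for (const chunk of chunkForSpeech(text)) {
//     const u = new SpeechSynthesisUtterance(chunk);
//     u.lang = lang;
//     speechSynthesis.speak(u);
//   }
//   // The pause/resume trick: nudge Chrome so it doesn't go silent mid-queue.
//   const keepAlive = setInterval(() => {
//     if (!speechSynthesis.speaking) return clearInterval(keepAlive);
//     speechSynthesis.pause();
//     speechSynthesis.resume();
//   }, 10000);
// }
```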
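
On the clean-JSON point: even with "reply with JSON only" in the prompt, the model would sometimes wrap its answer in markdown code fences or add a sentence of prose around it. Better prompting reduced this, but a defensive cleanup on the response is cheap insurance. A minimal sketch of that idea (my actual cleanup may differ):

```typescript
// Defensively extract a JSON object from a model response that may be
// wrapped in markdown code fences or surrounded by chatty prose.
function extractJson(raw: string): unknown {
  // Strip leading/trailing markdown code fences if present.
  let text = raw
    .trim()
    .replace(/^```(?:json)?\s*/i, "")
    .replace(/\s*```$/, "");
  // Fall back to the first {...} span if there's surrounding prose.
  const first = text.indexOf("{");
  const last = text.lastIndexOf("}");
  if (first !== -1 && last > first) {
    text = text.slice(first, last + 1);
  }
  return JSON.parse(text); // still throws if the model returned no valid JSON
}
```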

What I Learned

I learned Next.js, Supabase, and API integration during this hackathon. But more than that, I learned that the best products come from real problems. Not from looking at what's trending or what sounds impressive — from one WhatsApp message from a farmer who needed help.

I also learned that making something simple is harder than making something complex. Every time I wanted to add a cool feature I had to ask — does this help the farmer or does this just look good in a demo?


What's Next

I want to actually get this in front of farmers. Not as a finished product — as a prototype to learn from. What do they actually find confusing? What diseases am I missing? What languages do they need?

The technical roadmap includes training a custom model on the PlantVillage dataset so it works offline, adding a WhatsApp bot for feature phones, and expanding to more Indian languages.

But honestly the next step is just finding ten farmers willing to try it and listening to what they say.
