Project Story
Inspiration
LivestockAI was inspired by a real operational problem that affects me and more than 45 million other livestock farmers across Nigeria and Africa. Livestock farmers often have to make urgent decisions with incomplete information, delayed expert support, and unreliable internet access. A disease outbreak, heat stress event, or feeding problem can turn into major losses before the farmer fully understands what is happening.
I did not want to build a generic AI chatbot for agriculture. I wanted to build something closer to a livestock operating system: a product that helps farmers monitor farm performance, manage daily operations, detect issues earlier, and make better decisions using AI that is grounded in the real state of the farm.
That is why I built LivestockAI to be offline-first, batch-centric, and centered on real farm workflows. Amazon Nova was especially compelling because it gave me an inexpensive, multimodal frontier-model foundation for reasoning, retrieval, image understanding, and voice-ready interaction.
What it does
LivestockAI is an offline-first AI operating system for livestock farms.
It helps farmers and operators manage farms, batches, mortality, feed, health checks, tasks, sales, expenses, and field operations in one place. On top of that operational layer, it adds AI systems that can monitor farm conditions, anticipate problems before they escalate, learn and adapt, explain trends, analyze livestock images, simulate what-if decisions, and support hands-free interaction.
Its core AI experiences include:
- Farm Sentinel for anomaly detection, monitoring, and next-step guidance
- a multimodal assistant for chat, planning transparency, and grounded answers
- vision workflows for image- and video-based health reasoning
- Farm Optimizer for what-if scenario analysis
- voice-ready interaction powered by Amazon Nova Sonic
The result is a system that does not just answer questions. It helps farmers operate.
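To make the what-if idea concrete, here is a minimal sketch of the kind of scenario calculation Farm Optimizer performs. The field names and numbers are illustrative assumptions, not the real model: a scenario delta is applied to a baseline batch and the projected margins are compared.

```typescript
// Hypothetical sketch of a Farm Optimizer what-if calculation.
// All field names and economics are illustrative assumptions.

interface BatchEconomics {
  birds: number;
  feedKgPerBird: number;
  feedPricePerKg: number;
  revenuePerBird: number;
}

// Simple margin projection: revenue minus total feed cost.
function projectMargin(b: BatchEconomics): number {
  const feedCost = b.birds * b.feedKgPerBird * b.feedPricePerKg;
  const revenue = b.birds * b.revenuePerBird;
  return revenue - feedCost;
}

// What-if: overlay a scenario delta on the baseline and compare.
function whatIf(base: BatchEconomics, delta: Partial<BatchEconomics>) {
  const scenario = { ...base, ...delta };
  return { baseline: projectMargin(base), scenario: projectMargin(scenario) };
}
```

For example, `whatIf(base, { feedPricePerKg: 0.6 })` answers "what happens to this batch's margin if feed price rises?" without touching the real records.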
How we built it
I built LivestockAI as a full-stack TypeScript platform using TanStack Start, React 19, PostgreSQL on Neon, and Kysely for type-safe data access. It is designed for Cloudflare Workers and built with an offline-first architecture using IndexedDB and sync-on-reconnect behavior so key workflows remain usable in the field.
For AI, I built a unified runtime on Amazon Nova through Amazon Bedrock. I use Nova 2 Lite for reasoning and agent workflows, Nova 2 Multimodal Embeddings for memory and retrieval, and Nova 2 Sonic for voice-ready experiences. Instead of treating AI as a separate demo layer, I integrated it directly into farm workflows like Sentinel investigations, assistant conversations, image analysis, and optimization.
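Grounding a Nova call in real farm state can be sketched as a pure request builder. The `FarmContext` shape and the exact model ID are illustrative assumptions; the request shape follows the Bedrock Converse API (`modelId`, `system`, `messages`), which the AWS SDK's `ConverseCommand` accepts.

```typescript
// Sketch: build a Converse-style request grounded in actual farm state.
// FarmContext fields and the model ID are assumptions for illustration.

interface FarmContext {
  farmName: string;
  batchId: string;
  species: string;
  headCount: number;
  mortalityLast7Days: number;
}

interface ConverseInput {
  modelId: string;
  system: { text: string }[];
  messages: { role: "user" | "assistant"; content: { text: string }[] }[];
}

function buildGroundedRequest(ctx: FarmContext, question: string): ConverseInput {
  // The system prompt pins the model to the current state of the farm,
  // so answers are grounded rather than generic.
  const system =
    `You are a livestock operations assistant for ${ctx.farmName}. ` +
    `Current batch ${ctx.batchId}: ${ctx.headCount} ${ctx.species}, ` +
    `${ctx.mortalityLast7Days} deaths in the last 7 days. ` +
    `Base every answer on this state and say so when data is missing.`;

  return {
    modelId: "amazon.nova-lite-v1:0", // assumed ID; check availability in your region
    system: [{ text: system }],
    messages: [{ role: "user", content: [{ text: question }] }],
  };
}
```

The resulting object would then be sent with `BedrockRuntimeClient` and `ConverseCommand` from `@aws-sdk/client-bedrock-runtime`; keeping the builder pure makes it easy to test the grounding without a network call.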
I also built governance into the runtime itself: sensitive write actions can require approval, AI usage can be monitored through admin runtime views, credit usage is tracked, and kill switches and other operational controls are available. That mattered to me because AI for real farm operations has to be controllable, not just impressive.
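The approval-aware gating can be sketched as a small classifier that runs before any AI tool call executes. The names here are illustrative, not the production API: reads execute freely, sensitive writes are parked for human approval, and the kill switch or an exhausted credit budget blocks everything.

```typescript
// Sketch of approval-aware action gating (illustrative names).
// Every AI tool call is classified before execution.

type ActionKind = "read" | "write" | "sensitive-write";
type Decision = "execute" | "needs-approval" | "blocked";

interface RuntimeControls {
  killSwitch: boolean;      // global stop for all AI actions
  creditsRemaining: number; // simple credit budget
}

function gateAction(kind: ActionKind, controls: RuntimeControls): Decision {
  if (controls.killSwitch) return "blocked";       // operator override wins
  if (controls.creditsRemaining <= 0) return "blocked";
  if (kind === "sensitive-write") return "needs-approval";
  return "execute";
}
```

Putting the gate in the runtime, rather than in each feature, means every new AI capability inherits the same controls by default.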
Challenges we ran into
One of the biggest challenges was making the AI powerful without making it reckless. In a livestock setting, bad advice or uncontrolled actions can have real cost. I had to design approval-aware actions, grounded responses, and operational controls so the system could be trusted.
Another major challenge was offline-first architecture. Most AI products assume stable connectivity, but many real farm environments do not have that. I had to think carefully about local persistence, sync behavior, and how to make key workflows usable even when the network is weak or unavailable.
I also had to avoid building a disconnected “AI feature demo.” The harder and more valuable problem was integrating AI into real farm operations so it works with batches, farm records, tasks, history, and admin controls instead of floating above them.
Accomplishments that we're proud of
I am proud that LivestockAI is a real product surface, not just a prompt wrapper. It combines operational farm software with an AI runtime that is grounded, multimodal, and built for real field conditions.
I am especially proud of the product shape itself: Sentinel for monitoring, assistant planning transparency, approval-based actions, image reasoning, what-if optimization, offline workflows, and admin governance for runtime health, credit monitoring, and kill switches.
I am also proud that the system reflects the realities of the users it is meant to serve. It is designed for the field, for operational pressure, and for environments where reliability matters as much as intelligence.
What we learned
I learned that useful agricultural AI is not mainly about model output quality in isolation. It is about context, trust, workflow fit, and operational control.
I also learned that frontier models become much more valuable when they are connected to real system state. Amazon Nova became most useful in LivestockAI when it was reasoning over actual farm context, multimodal inputs, memory, and governed tools, not when it was treated like a standalone chatbot.
Most of all, I learned that offline-first design changes everything. If you want to build for real farmers, connectivity cannot be an afterthought.
What's next for LivestockAI
Next, I want to deepen LivestockAI as a daily intelligence layer for livestock operations.
That includes improving live Sentinel investigations, expanding Optimizer recommendations, strengthening multimodal health workflows, and making voice interaction more natural for field use. I also want to expand multilingual support, deepen extension-worker collaboration, and continue improving governance so the AI remains safe and practical in production.
My long-term vision is to make LivestockAI a trusted operating system for livestock farms: practical AI for real decisions, real farms, and real field conditions.
Built With
- amazon-web-services
- bedrock
- kysely
- langchain
- nova
- pgvector
- postgresql
- react
- strands
- tanstack
- workers