Bengaluru’s stray and companion-animal emergencies don’t fail because people don’t care — they fail because coordination is slow: someone finds an injured animal at 2 a.m., but they’re unsure how serious it is, who is actually on duty, and what to message so NGOs and vets can act. We built ResQ Bengaluru as a single-agent “rescue OS”: image-first triage, structured urgency, nearest real-world resources, and a WhatsApp-ready handoff — not another generic chatbot.
What we built

ResQ Bengaluru is a hackathon MVP that treats emergencies like incidents:
- The witness uploads a photo (drag-and-drop or camera).
- Vision-assisted assessment describes only what is visible (posture, visible wounds or blood cues, roadside context) without pretending to be a vet.
- Severity and urgency come from merging the model's read with simple rescue logic, so "bleeding / road / alone puppy" patterns don't get under-ranked.
- Elasticsearch does what it is best at: semantic, geo-aware retrieval over a small but realistic Bengaluru dataset of shelters, emergency vets, NGO helplines, and short emergency playbooks (first response, CUPA handoff checklist).
- AWS Bedrock handles multimodal reasoning and summarisation, while retrieval stays in Elastic. That keeps the architecture honest for judges: vector search isn't "AI magic," it's indexed, queryable infrastructure.
- The UI outputs the three things responders actually need: steps, ranked contacts, and a paste-ready WhatsApp alert plus a structured report block for desk intake.

Why the stack is intentional
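Concretely, "nearest meaningful help" combines kNN over a dense_vector field with a geo_distance filter in a single Elasticsearch request. The sketch below builds such a request body; the index name, field names, and parameters are illustrative assumptions, not the project's actual schema.

```typescript
// Hypothetical shape of the retrieval request: semantic kNN restricted to
// resources within driving range of the scene. "bengaluru-resources",
// "embedding", and "location" are assumed names for illustration.
type GeoPoint = { lat: number; lon: number };

function buildRescueSearch(queryVector: number[], near: GeoPoint, radiusKm = 15) {
  return {
    index: "bengaluru-resources",
    knn: {
      field: "embedding",
      query_vector: queryVector,
      k: 5,
      num_candidates: 50,
      // Only consider semantic matches a responder can actually reach.
      filter: {
        geo_distance: { distance: `${radiusKm}km`, location: near },
      },
    },
    _source: ["name", "type", "phone", "location"],
  };
}
```

With the official `@elastic/elasticsearch` client, a body like this would be passed to `client.search(...)`; the geo filter is applied during the kNN phase, so results are both semantically relevant and nearby.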
- Elastic Cloud + dense_vector kNN + geo filtering → "nearest meaningful help," not keyword roulette.
- Embeddings via Jina, with a local deterministic fallback → demo resilience when API credits disappear mid-event.
- Bedrock → one place for vision and text structuring, so we didn't splinter across five LLM vendors.
- Next.js API routes → the fastest path from upload to inference to search on a hackathon clock.

What we learned (the real story)

The breakthrough wasn't "more AI." It was splitting the problem: ask the vision model for observable facts, then ask a text model to turn those facts into structured JSON. Demanding perfect JSON straight from pixels broke constantly; separating the concerns made the demo stable and the narrative trustworthy. Geo was the second lesson: browser location is flaky on desktops and behind VPNs, so we designed for GPS when it works, manual pins when it doesn't, and semantic retrieval either way.
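The "facts first, structure second" split ends in a deliberately boring merge step: deterministic rules take the text model's severity score and only escalate it when red-flag facts are present. The fact names and thresholds below are illustrative, not the project's actual rule set.

```typescript
// Sketch of the rule-based urgency merge. Observable facts extracted by the
// vision stage can raise, but never lower, the model's base severity, so
// "bleeding / road / alone puppy" patterns don't get under-ranked.
type Facts = {
  bleeding?: boolean;      // visible blood cues
  onRoad?: boolean;        // animal in or beside traffic
  aloneJuvenile?: boolean; // puppy/kitten with no adult animal nearby
};

function mergeUrgency(modelSeverity: number, facts: Facts): number {
  let severity = modelSeverity; // 1 (low) … 5 (critical), from the text model
  if (facts.bleeding) severity = Math.max(severity, 4);
  if (facts.onRoad) severity = Math.max(severity, 4);
  if (facts.aloneJuvenile) severity = Math.max(severity, 3);
  return Math.min(severity, 5);
}
```

Because the rules are plain code rather than a prompt, the escalation behaviour is testable and explainable on stage.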
Where it goes next

Live NGO connectors, verified directory sync, responder dashboards, and multilingual alerts are next, but the core thesis stays: make the first 10 minutes after someone finds an animal as fast and clear as a good incident-response tool.
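The local deterministic embedding fallback mentioned in the stack notes can be as simple as hashing tokens into a fixed-size normalised vector, so the demo degrades to bag-of-words retrieval instead of crashing when the hosted embedding API is unavailable. The dimension and hashing scheme here are illustrative assumptions.

```typescript
// Deterministic fallback embedder: same text always yields the same vector,
// with no network dependency. Not semantically rich, but keeps kNN queries
// working when API credits run out mid-demo.
function fallbackEmbed(text: string, dim = 256): number[] {
  const vec = new Array(dim).fill(0);
  const tokens = text.toLowerCase().split(/\s+/).filter(Boolean);
  for (const tok of tokens) {
    // FNV-1a style hash of each token selects a bucket.
    let h = 2166136261;
    for (let i = 0; i < tok.length; i++) {
      h ^= tok.charCodeAt(i);
      h = Math.imul(h, 16777619);
    }
    vec[Math.abs(h) % dim] += 1;
  }
  // L2-normalise so cosine similarity behaves like real embeddings.
  const norm = Math.hypot(...vec) || 1;
  return vec.map((v) => v / norm);
}
```

Swapping this in behind the same embedding interface means the Elasticsearch side never needs to know which embedder produced the vector.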
Built With
- agent
- amazon
- api
- apis
- backend
- bedrock
- builder
- cloud
- dense
- ec2
- elasticsearch
- embeddings
- frontend
- generation
- infrastructure
- kibana
- ml
- multimodal
- node.js
- rest
- retrieval
- retrieval-augmented
- routes
- search
- semantic
- services
- vector
- vision
- web