Inspiration

LA has some of the most complex real estate in the country — layered zoning codes, fire and earthquake hazard overlays, Transit-Oriented Communities density bonuses, historic preservation districts. The data to understand all of it is fully public. It just lives across a dozen different city and county portals with no unified access layer.

An acquisitions associate today opens ZIMAS, NavigateLA, LAHD, LADBS, and six more tabs, copy-pastes numbers into Excel, and writes a memo. That workflow takes 2–3 days. We wanted to build the tool we wish existed: type an address, get a decision-grade brief instantly.

What We Built

  • Click any point on the map (or type an address) — parcel boundaries snap into view and live city data is fetched in parallel: zoning, fire/fault/liquefaction/landslide hazards, permits, violations, 311 complaints, crime, census income, and fair market rent
  • Gemini synthesizes a structured brief — AI interpretation, incentive programs, environmental thresholds, TOC density bonus scenarios
  • Persona toggle — switch between Renter (habitability, neighborhood safety, area activity) and Buyer / Investor (FAR upside, development scenarios, incentive programs) and the panel adapts
  • Map layer toggles, a chat overlay with full parcel context, report export, and Auth0 login with MongoDB-cached analyses
  • An MCP server that connects AI coding agents (Devin, Claude, Cursor, Windsurf) to LA's public data — with provenance tracking on every result and a verify_claim tool that cross-checks agent claims against independent upstream data
  • A Fetch.ai Agentverse agent discoverable from ASI:One — type any LA address and get back a full development brief
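The parallel fetch behind that first click can be sketched as follows. This is a minimal illustration, not the real pipeline: `Fetcher` and the default timeout are stand-ins for the actual per-source fetchers.

```typescript
// Sketch of the fan-out pattern: each source races its own timeout, and
// any failure becomes a null gap in the brief instead of a crash.
type Fetcher = () => Promise<unknown>;

async function withTimeout<T>(p: Promise<T>, ms: number): Promise<T | null> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<null>((resolve) => {
    timer = setTimeout(() => resolve(null), ms);
  });
  try {
    return await Promise.race([p, timeout]);
  } catch {
    return null; // network error or bad response: degrade, don't throw
  } finally {
    clearTimeout(timer);
  }
}

async function fetchAll(
  fetchers: Record<string, Fetcher>,
  perSourceMs = 8000,
): Promise<Record<string, unknown>> {
  const entries = await Promise.all(
    Object.entries(fetchers).map(
      async ([name, f]) => [name, await withTimeout(f(), perSourceMs)] as const,
    ),
  );
  return Object.fromEntries(entries);
}
```

Because every source resolves to either data or `null`, a single slow or broken portal never blocks the brief.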

How We Built It

  • Frontend: Next.js 16, React 19, deck.gl + react-map-gl, shadcn/ui + Tailwind v4
  • Agent layer: Vercel AI SDK 6, Gemini 2.5 Pro for structured parcel analysis, Gemini 3.1 Flash Lite for the chat layer
  • Data pipeline: all fetchers run in parallel with per-source timeouts — failures degrade gracefully, and partial data still produces a useful brief
  • Infrastructure: MongoDB Atlas for the analysis cache, Auth0 for login, Vercel for deployment, pnpm workspaces + Turbo monorepo
  • MCP server: @modelcontextprotocol/sdk, content-addressed cache, signed provenance envelopes, JSONL session journals, stdio + HTTP transports
  • Agentverse agent: uagents Chat Protocol with Agentverse mailbox — a thin Python proxy over the existing TypeScript backend

Challenges

Parcel matching in a dense city. LA County's identify endpoint returns every parcel within tolerance — often including sub-parcels at the same address. For a downtown high-rise, the naive first-match logic chose a surface parking lot instead of the actual building. We built a ranking algorithm that restricts to parcels matching the searched street number, then prefers the most-improved parcel by building area.
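A minimal sketch of that ranking heuristic (field names like `situsNumber` and `improvementSqft` are illustrative, not the actual assessor schema):

```typescript
// Restrict to the searched street number, then pick the parcel with the
// most building improvement. Field names are assumed for illustration.
interface ParcelCandidate {
  ain: string;             // Assessor Identification Number
  situsNumber: string;     // street number on record
  improvementSqft: number; // building area from the assessor roll
}

function pickBestParcel(
  candidates: ParcelCandidate[],
  searchedStreetNumber: string,
): ParcelCandidate | undefined {
  // 1. Keep parcels whose situs street number matches the search;
  //    fall back to all candidates if nothing matches.
  const matching = candidates.filter(
    (p) => p.situsNumber === searchedStreetNumber,
  );
  const pool = matching.length > 0 ? matching : candidates;
  // 2. Prefer the most-improved parcel by building area, so a surface
  //    parking lot never outranks the actual high-rise.
  return pool.reduce<ParcelCandidate | undefined>(
    (best, p) => (!best || p.improvementSqft > best.improvementSqft ? p : best),
    undefined,
  );
}
```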

Zoning string parsing. LA zoning codes come with conditional prefixes ([Q], [T]) and overlay suffixes (-CDO, -CPIO, -CUGU). Height district suffixes carry entirely different FAR multipliers under LAMC §12.21.1 — a C2-1 and C2-4D look similar but have dramatically different development ceilings. All of this had to be stripped and parsed correctly before FAR lookup.
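A simplified sketch of that normalization step; the real parser handles more LAMC edge cases than this, and the structure below is assumed for illustration:

```typescript
// "[Q]C2-1VL-CDO" splits into conditions ["[Q]"], zone "C2",
// height district "1VL", overlays ["CDO"].
interface ParsedZone {
  conditions: string[];   // [Q] / [T] conditional classifications
  zone: string;           // base zone class, e.g. "C2", "R3"
  heightDistrict: string; // e.g. "1", "1VL", "4D"; drives FAR under §12.21.1
  overlays: string[];     // e.g. ["CDO", "CPIO", "CUGU"]
}

function parseZoning(raw: string): ParsedZone | null {
  const conditions: string[] = [];
  // Strip leading [Q]/[T] prefixes; they are conditions, not the zone class.
  const rest = raw.replace(/^(\[(?:Q|T)\])+/, (m) => {
    conditions.push(...(m.match(/\[(?:Q|T)\]/g) ?? []));
    return "";
  });
  const parts = rest.split("-");
  if (parts.length < 2) return null; // no height district, so no FAR lookup
  return {
    conditions,
    zone: parts[0],
    heightDistrict: parts[1],
    overlays: parts.slice(2),
  };
}
```

Only after this split can the height district be mapped to its FAR multiplier, which is what makes C2-1 and C2-4D diverge.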

Three deliverables in 36 hours. Web app, MCP server, and Agentverse agent are independent surfaces. The right call was a strict shared-backend architecture — the Python agent is a thin proxy, not a reimplementation; the MCP server is focused on verification rather than duplicating the web app's fetchers.

Streaming + cache consistency. The MongoDB cache write happens in the AI SDK onFinish callback after the agent stream completes. Coordinating this with Auth0 user scoping and lazy connection handling required careful async ordering across the streaming boundary.
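The lazy-connection and user-scoped-key pattern can be sketched with an in-memory stand-in for the Atlas collection (all names here are illustrative, not the real implementation):

```typescript
// In-memory stand-in for the MongoDB collection so the pattern is runnable.
class FakeCollection {
  private store = new Map<string, string>();
  async upsert(key: string, value: string) { this.store.set(key, value); }
  async get(key: string) { return this.store.get(key); }
}

let connectCalls = 0;
let dbPromise: Promise<FakeCollection> | null = null;

async function connect(): Promise<FakeCollection> {
  connectCalls++; // the real version opens a MongoClient here
  return new FakeCollection();
}

function getDb(): Promise<FakeCollection> {
  // Lazy, shared connection: concurrent callers reuse the in-flight
  // promise, so the request handler and onFinish never race to connect.
  dbPromise ??= connect();
  return dbPromise;
}

// Invoked from the stream's onFinish, i.e. only after the full brief exists.
async function cacheAnalysis(userId: string, parcelId: string, brief: string) {
  const db = await getDb();
  // Key is scoped to the Auth0 user id so briefs never leak across accounts.
  await db.upsert(`${userId}:${parcelId}`, brief);
}
```

The key property is that the cache write sits entirely on the far side of the streaming boundary: nothing is persisted until the complete brief has been generated.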

What We Learned

  • The data gap in LA real estate isn't availability — everything we use is public and free. The value is synthesis: pulling it together, normalizing edge cases, and presenting it in a way that fits a real professional's workflow
  • Provenance isn't a nice-to-have for AI agents in professional domains — it's a precondition for trust. Most MCP servers make hallucination easier by giving agents more tools to confidently state wrong things; verify_claim flips that
  • Respecting the existing TypeScript codebase kept the Python Agentverse agent small, correct, and finished on time
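The core of the verify_claim idea reduces to a comparison step like the sketch below. This is illustrative only (the tolerance and result shape are made up); the actual tool also re-fetches the upstream source independently and attaches a signed provenance envelope.

```typescript
// Compare an agent-claimed value against an independently fetched
// upstream value. Numeric fields use a relative tolerance, since
// assessor figures drift slightly between datasets.
interface Verdict {
  verified: boolean;
  claimed: string | number;
  upstream: string | number;
}

function verifyClaim(
  claimed: string | number,
  upstream: string | number,
  relTolerance = 0.01,
): Verdict {
  let verified: boolean;
  if (typeof claimed === "number" && typeof upstream === "number") {
    verified =
      Math.abs(claimed - upstream) <=
      relTolerance * Math.max(1, Math.abs(upstream));
  } else {
    // String fields (zoning codes, overlay names) match case-insensitively.
    verified =
      String(claimed).trim().toLowerCase() ===
      String(upstream).trim().toLowerCase();
  }
  return { verified, claimed, upstream };
}
```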

What's Next

  • Multi-city expansion
  • Fetch.ai Payment Protocol for paid API access from ASI:One
  • Comparable sales layer — current market $/sqft, not just assessor values
  • Mobile layout for field use during site walks

Built With

  • agentverse
  • asi:one
  • fetch.ai
  • figma
  • hud
  • la-county-assessor
  • la-geohub
  • ladbs
  • lahd
  • lapd
  • mcp
  • model-context-protocol
  • mongodb-atlas
  • myla311
  • uagents
  • us-census
  • vultr
  • zimas