Inspiration
In Greek mythology, Argus Panoptes was the "all-seeing" giant with a hundred eyes. In today’s hyper-competitive business landscape, intelligence is scattered across thousands of data points—from subtle pricing tweaks to specialized hiring surges. We wanted to build a modern "all-seeing" eye that helps businesses cut through the noise. Instead of manually tracking dozens of tabs, Argus automates the observation process, giving leaders a holistic, real-time view of their market landscape.
What it does
Argus is a real-time competitive intelligence platform. You type in a competitor's name — Argus dispatches three specialized AI agents simultaneously:
- Job Postings Agent — scans their careers page to infer where they're investing (new ML hires = AI push, 10 open sales roles = aggressive expansion)
- Pricing Monitor — diffs their current pricing page against a cached snapshot to catch tier changes, feature moves, or quiet price hikes
- Changelog & Social Agent — pulls product releases, RSS feeds, GitHub activity, and Forum's attention index to measure whether their cultural mindshare is rising or falling
An orchestrator then synthesizes all three into a single executive briefing: risk level, key
signals ranked by strength, and recommended actions — streamed live to your screen as each agent
reports in.
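The snapshot-diffing step behind the Pricing Monitor can be sketched with Python's standard-library difflib; the snapshot strings and function name here are illustrative, not the project's actual code:

```python
import difflib

def diff_pricing(cached: str, current: str) -> list[str]:
    # Unified diff of the cached snapshot against the live page text;
    # any surviving +/- lines are candidate pricing changes.
    return [
        line
        for line in difflib.unified_diff(
            cached.splitlines(), current.splitlines(), lineterm=""
        )
        if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))
    ]

# A quiet $10 hike on the Pro tier shows up as one removed and one added line:
changes = diff_pricing("Pro: $29/mo\nTeam: $99/mo", "Pro: $39/mo\nTeam: $99/mo")
```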
How we built it
Backend: Python + FastAPI with three async agents running in parallel via asyncio.gather. Each
agent follows a strict AgentResult contract (Pydantic models), so the orchestrator can reason
across all three outputs uniformly. Claude Sonnet powers both the individual agent analysis and the
final synthesis. Server-Sent Events (SSE) stream status updates to the frontend in real time.
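The parallel-agent pattern described above looks roughly like the sketch below. It uses a dataclass as a dependency-free stand-in for the project's Pydantic `AgentResult` models, and the field names and stub agents are illustrative:

```python
import asyncio
from dataclasses import dataclass, field

# Stand-in for the shared AgentResult contract (Pydantic in the real project;
# field names here are illustrative, not the actual schema).
@dataclass
class AgentResult:
    agent: str
    strength: str                       # "high" / "medium" / "low"
    signals: list[str] = field(default_factory=list)

async def jobs_agent(company: str) -> AgentResult:
    # The real agent scrapes the careers page and calls the LLM.
    return AgentResult("jobs", "high", [f"{company}: 3 new ML roles"])

async def pricing_agent(company: str) -> AgentResult:
    return AgentResult("pricing", "medium", ["Pro tier up $10/mo"])

async def social_agent(company: str) -> AgentResult:
    return AgentResult("social", "low", ["Attention index up 12%"])

async def run_agents(company: str) -> list[AgentResult]:
    # All three agents run concurrently; one slow scrape never blocks the rest,
    # and gather preserves the order the coroutines were passed in.
    return list(await asyncio.gather(
        jobs_agent(company), pricing_agent(company), social_agent(company)
    ))

results = asyncio.run(run_agents("AcmeCorp"))
```

Because every agent returns the same shape, the orchestrator can iterate over the results without caring which agent produced which.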
Frontend: React 18 + Vite with a custom useStreamingBriefing hook that consumes the SSE stream. UI built with shadcn/ui components and Tailwind CSS — dark mode by default, signal-strength badges (high / medium / low) for every finding.
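The status updates the `useStreamingBriefing` hook consumes follow the standard SSE wire format; a minimal server-side formatter (the event name and payload fields are hypothetical):

```python
import json

def sse_event(event: str, data: dict) -> str:
    # Server-Sent Events wire format: an "event:" line names the message type,
    # "data:" carries the JSON payload, and a blank line terminates the event.
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"

msg = sse_event("agent_status", {"agent": "pricing", "status": "done"})
```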
Forum integration: The social agent pulls Forum's attention index — the first regulated exchange to trade on cultural attention — to quantify whether a competitor's mindshare is trending up or down, not just what they shipped.
Challenges we ran into
Social Data Scaling: We originally intended for the Changelog & Social Agent to scrape Reddit to gauge real-time sentiment. However, we quickly ran into recently implemented scraping limits and API restrictions that threatened the tool's reliability.
The Pivot: We solved this by pivoting to a more robust monitoring strategy, utilizing Google RSS feeds and the GitHub releases API to track high-signal updates without the volatility of social media scraping.
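The GitHub half of that pivot can be sketched against the public releases endpoint using only the standard library; the repo name and summary format below are illustrative:

```python
import json
from urllib.request import Request, urlopen

def latest_release(owner: str, repo: str) -> dict:
    # GitHub's public "latest release" endpoint; unauthenticated requests are
    # rate-limited, which is fine for low-frequency polling.
    url = f"https://api.github.com/repos/{owner}/{repo}/releases/latest"
    req = Request(url, headers={"Accept": "application/vnd.github+json"})
    with urlopen(req) as resp:
        return json.load(resp)

def summarize(release: dict) -> str:
    # Turn a release payload into a one-line signal for the agent.
    return f"{release['tag_name']}: {release.get('name') or 'untitled'}"

# Offline example using the shape the API returns:
sample = {"tag_name": "v2.1.0", "name": "Streaming agents"}
summary = summarize(sample)
```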
Dynamic Web Structures: Pricing pages and career boards are highly dynamic, requiring us to build robust parsing logic to handle various HTML structures without breaking the pipeline.
Orchestration Logic: Ensuring the Synthesizer produced a coherent briefing when one or more agents returned sparse data required significant prompt engineering.
Team Coordination: Agreeing on and enforcing a shared data contract across four people building in parallel.
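One pattern that helped with sparse agent data was labeling empty sections explicitly in the synthesis prompt so the model hedges instead of inventing signals. A hypothetical sketch (the prompt wording and field names are illustrative, not our actual prompts):

```python
def build_briefing_prompt(results: list[dict]) -> str:
    # Each agent gets its own section; agents that returned nothing are
    # flagged so the synthesizer doesn't hallucinate signals for them.
    sections = []
    for r in results:
        if r["signals"]:
            body = "\n".join(f"- {s}" for s in r["signals"])
        else:
            body = "(agent returned no data; do not speculate about this area)"
        sections.append(f"## {r['agent']}\n{body}")
    header = "Write an executive briefing from these agent reports:\n\n"
    return header + "\n\n".join(sections)

prompt = build_briefing_prompt([
    {"agent": "jobs", "signals": ["3 new ML roles"]},
    {"agent": "pricing", "signals": []},
])
```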
Accomplishments that we're proud of
- Three agents running in true parallel with live streaming results — the "agents lighting up one by one" effect is genuinely satisfying to watch
- A clean, enforced data contract that let four people build independently without merge conflicts
- The Forum integration turning an abstract "attention index" number into an intelligible competitive signal ("mindshare up 40% — likely tied to their Series B")
- Fallback data so thorough that the demo works even if every live scraper fails simultaneously
What we learned
This project taught us the intricacies of Multi-Agent Orchestration. We learned that giving AI agents specialized, narrow "roles" results in much higher accuracy than asking a single general-purpose agent to "find everything." We also gained deep experience in reconciling real-time web data with structured LLM outputs.
What's next for Argus
The next phase for Argus involves moving from observation to prediction. We plan to implement Historical Trend Mapping—visualizing how hiring surges from months ago correlate with pricing changes today—and Automated Alerts that flag high-strength signals, such as a competitor hiring their first "Head of AI," the moment it happens.
- Deeper Forum integration — use attention index trends to weight signal strength dynamically (a pricing change matters more when attention is already spiking)
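That dynamic weighting could start as simply as the following, where the scaling rule is entirely hypothetical:

```python
def weighted_strength(base: float, attention_delta: float) -> float:
    # Hypothetical rule: amplify a signal when the competitor's attention
    # index is already trending up; never dampen it below its base score.
    return base * (1.0 + max(attention_delta, 0.0))

# A pricing change (base 0.6) landing during a 40% attention spike:
score = weighted_strength(0.6, 0.40)
```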