Inspiration
Tracking my own portfolio and staying on top of markets is a weirdly fragmented experience: price + fundamentals in one place, news somewhere else, and SEC filings in a totally different workflow. The most important information (filings + real catalysts) is also the least convenient to digest. We wanted a single, fast way to go from “ticker” → “what’s happening, why it matters, and how do I explore it in Tableau / Slack.”
What it does
Finny is a stock intelligence layer that turns a ticker into an easy-to-digest snapshot you can use instantly in Tableau or Slack.
Given a ticker, it returns:
- Market snapshot (price, change, key metrics)
- Fundamentals (core company stats)
- Latest news (normalized + deduped)
- Recent SEC filings (10-K / 10-Q / 8-K metadata)
- Filing summaries + key-change diffs (so you don’t read 200 pages)
- Events timeline combining filings, earnings, and news spikes
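The unified snapshot above can be sketched as a typed response shape. This is a rough illustration only: the field names are hypothetical, and stdlib dataclasses stand in for the Pydantic models the service actually uses, so the sketch runs standalone.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class MarketSnapshot:
    price: float
    change_pct: float

@dataclass
class NewsArticle:
    title: str
    url: str

@dataclass
class StockResponse:
    ticker: str
    snapshot: MarketSnapshot
    news: list = field(default_factory=list)

# Build one response and serialize it to a plain dict (BI-tool friendly).
resp = StockResponse(
    ticker="AAPL",
    snapshot=MarketSnapshot(price=189.5, change_pct=1.2),
    news=[NewsArticle("Apple earnings", "https://example.com/a")],
)
payload = asdict(resp)
```

Serializing to flat, predictable dicts is what makes the same payload reusable across Tableau, Slack, and the API.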
On top of the API, we integrated:
- Tableau Pulse API to surface proactive metric updates/insights
- Slack commands so users can query tickers and get answers fast
- Tableau Cloud + Tableau agent to let users ask questions in natural language and get guided insights backed by the unified dataset
How we built it
- Built a FastAPI service with a single `GET /stock/{ticker}` endpoint that orchestrates all sources and returns a unified `StockResponse`.
- Pulled market snapshot + fundamentals using `yfinance`.
- Fetched Google Finance news (via Browser Use), then normalized + deduped articles by title/URL similarity.
- Queried SEC endpoints to map tickers → CIKs, fetch recent filings (10-K/10-Q/8-K), and assemble filing metadata with rate limiting + graceful fallbacks.
- Parsed filings with sec-edgar-toolkit when possible; otherwise fell back to raw HTML/iXBRL extraction, cleaned/truncated sections, and sent selected parts to the agent for:
  - concise summaries
  - “what changed” diffs vs prior filings
- Built an events timeline by combining earnings dates, 8-Ks, and news spikes.
- Returned everything as typed Pydantic models with env-driven config so it’s easy to plug into BI tools.
- Wired up Tableau Pulse API + Tableau Cloud so insights can be pushed and explored, and added Slack commands for quick access.
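The fan-out step above can be sketched with plain `asyncio`: the blocking source goes to a worker thread while the async sources run concurrently. The fetcher bodies here are stand-ins (the real ones call yfinance, news, and SEC endpoints), and the names are illustrative.

```python
import asyncio
import time

def fetch_snapshot_sync(ticker: str) -> dict:
    # Stand-in for the blocking yfinance call.
    time.sleep(0.05)
    return {"ticker": ticker, "price": 189.5}

async def fetch_news(ticker: str) -> list:
    # Stand-in for an async httpx request.
    await asyncio.sleep(0.05)
    return [{"title": f"{ticker} headline", "url": "https://example.com/a"}]

async def fetch_filings(ticker: str) -> list:
    # Stand-in for the SEC metadata fetch.
    await asyncio.sleep(0.05)
    return [{"form": "10-Q"}]

async def get_stock(ticker: str) -> dict:
    # Run the sync source in a thread so it never blocks the event loop,
    # and gather all sources concurrently.
    snapshot, news, filings = await asyncio.gather(
        asyncio.to_thread(fetch_snapshot_sync, ticker),
        fetch_news(ticker),
        fetch_filings(ticker),
    )
    return {"snapshot": snapshot, "news": news, "filings": filings}

result = asyncio.run(get_stock("AAPL"))
```

In the actual service this coroutine would sit behind the FastAPI route handler; the pattern is the same either way.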
Challenges we ran into
- Sync + async orchestration: mixing `yfinance` (sync) with `httpx` (async) required careful boundaries to avoid blocking the API.
- SEC rate limits + inconsistency: throttling + retries were needed, and some tickers/filings required graceful handling when metadata was missing.
- Filing format variance: filings come as HTML, iXBRL, and inconsistent item headers—so we implemented multiple extraction paths and fallbacks.
- LLM reliability: model output isn’t always strict JSON, so we added robust JSON extraction + safe defaults to prevent pipeline breakage.
- Context limits: long filings forced us to select/truncate sections intelligently without losing key items.
- News noise: articles can be duplicated or slightly rewritten across sources, so normalization + dedupe logic was essential.
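The title/URL dedupe step can be approximated like this: normalize titles and URLs, then treat near-identical pairs as duplicates. This is a minimal sketch under assumed thresholds, not the project's exact logic; the helper names are made up.

```python
import difflib
import re
from urllib.parse import urlsplit

def norm_title(title: str) -> str:
    # Lowercase, strip punctuation, collapse whitespace.
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", title.lower())).strip()

def norm_url(url: str) -> str:
    # Drop scheme, query strings, and fragments: tracking params
    # make the "same" article look unique.
    parts = urlsplit(url)
    return f"{parts.netloc}{parts.path}".rstrip("/")

def dedupe(articles: list, threshold: float = 0.9) -> list:
    kept = []
    for art in articles:
        is_dup = False
        for prev in kept:
            if norm_url(art["url"]) == norm_url(prev["url"]):
                is_dup = True
                break
            ratio = difflib.SequenceMatcher(
                None, norm_title(art["title"]), norm_title(prev["title"])
            ).ratio()
            if ratio >= threshold:
                is_dup = True
                break
        if not is_dup:
            kept.append(art)
    return kept

articles = [
    {"title": "Apple beats earnings estimates", "url": "https://a.com/x?utm=1"},
    {"title": "Apple Beats Earnings Estimates!", "url": "https://b.com/y"},
    {"title": "Fed holds rates steady", "url": "https://a.com/z"},
]
unique = dedupe(articles)
```

The second article survives neither check: its URL differs but the normalized title is identical to the first, so only two articles remain.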
Accomplishments that we’re proud of
- Shipped a single endpoint that fuses market data, news, filings, summaries, and timelines into one clean response that’s ready for Tableau.
- Built a resilient SEC pipeline with toolkit parsing + multiple fallbacks that works across real-world filing formats.
- Added Gemini-powered summaries and key-change diffs to make dense filings actually usable in dashboards.
- Produced clean, normalized, deduped news so insights don’t get buried in spam.
- Integrated Tableau Pulse + Slack + Tableau Cloud agent, making the same intelligence accessible in multiple “work surfaces.”
What we learned
- SEC data is inconsistent enough that one parser is never enough—layered extraction is mandatory.
- LLMs need strict schemas + defensive parsing if you want production-grade reliability.
- Mixing async I/O with sync libraries requires clear boundaries to avoid performance traps.
- “Useful for Tableau” usually means normalized tables and stable fields, not raw blobs of JSON.
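The “strict schemas + defensive parsing” lesson boils down to never trusting model output to be bare JSON. A minimal sketch of that idea, assuming a hypothetical `extract_json` helper (not the project's actual code):

```python
import json
import re
from typing import Optional

def extract_json(text: str, default: Optional[dict] = None) -> dict:
    """Pull the first JSON object out of possibly-chatty model output."""
    default = default or {}
    # Fast path: the whole response is already valid JSON.
    try:
        obj = json.loads(text)
        return obj if isinstance(obj, dict) else default
    except json.JSONDecodeError:
        pass
    # Fallback: grab the outermost {...} span (handles markdown fences,
    # preambles like "Sure! Here is the summary:") and retry.
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match:
        try:
            obj = json.loads(match.group(0))
            return obj if isinstance(obj, dict) else default
        except json.JSONDecodeError:
            pass
    # Safe default keeps the pipeline alive instead of raising.
    return default

raw = 'Sure! Here is the summary:\n```json\n{"summary": "Revenue grew 8%", "risk": "low"}\n```'
parsed = extract_json(raw, default={"summary": "", "risk": "unknown"})
fallback = extract_json("no json here", default={"summary": "", "risk": "unknown"})
```

Returning a typed default instead of raising is what keeps one flaky model response from failing the whole `/stock/{ticker}` request.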
What’s next for Finny
- Add caching + background refresh so repeated ticker requests are instant.
- Expand filing coverage (e.g., 6-K, 13F) and add more event types to the timeline.
- Improve numeric accuracy by pulling standardized XBRL facts and exposing them as first-class metrics.
- Add tests around SEC parsing + LLM JSON extraction to make iteration safer.
- Deeper Tableau experiences: richer Pulse insights, guided “what changed?” narratives, and more agent-driven exploration.
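The caching item could start as simply as an in-memory TTL cache keyed by ticker; a sketch under that assumption (class name and TTL are illustrative, and a real deployment would likely want Redis or similar for background refresh):

```python
import time

class TTLCache:
    """Tiny in-memory cache: entries expire after ttl_seconds."""

    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if time.monotonic() - stored_at > self.ttl:
            # Expired: evict and report a miss.
            del self._store[key]
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic())

cache = TTLCache(ttl_seconds=0.1)
cache.set("AAPL", {"price": 189.5})
fresh = cache.get("AAPL")   # hit, within TTL
time.sleep(0.15)
stale = cache.get("AAPL")   # miss, entry expired
```

A short TTL keeps quotes fresh while still making repeated ticker lookups effectively free.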