Inspiration
Quarry grew out of a simple frustration: it’s scary to share high-value datasets when buyers can’t preview them safely. We wanted a marketplace where providers stay in control, buyers see exactly what they’re paying for, and every claim about quality or freshness can be verified on-chain.
What it does
Quarry is a schema-only data marketplace paired with an AI “data agent.” Providers upload CSV/JSON/SQL files; the backend converts them to Parquet, analyzes quality, and publishes human-readable column semantics without exposing raw rows. Buyers attach those datasets in the UI, ask questions in natural language, and the agent plans SQL queries. Before any query runs, Quarry quotes the per-row SOL cost using the x402 micropayment flow. After a wallet payment clears, the backend executes the query in DuckDB, streams the purchased slice, and issues on-chain usage receipts. Every dataset’s QA report is pinned to IPFS and turned into Solana attestations so buyers can verify freshness, PII risk, and publisher credibility.
How we built it
The backend pairs FastAPI with DuckDB to ingest CSV/JSON/SQL files and convert them to Parquet; a Next.js front end lets you browse “schema-only” previews. An AI agent uses OpenAI tools to plan queries, but any row-level access pauses behind an x402 Solana micropayment flow. Successful payments trigger DuckDB queries, stream the rows, and mint usage-receipt attestations, all tied together through IPFS reports and Solana Attestation Service (SAS) proofs.
Challenges we ran into
- Coordinating the payment loop (OpenAI tool call → x402 quote → Solana transfer → data delivery) without race conditions.
- Keeping DuckDB performant while appending new Parquet chunks on the fly.
- Balancing UX: the agent needs to stream conversational responses but also render payment cards and CSV exports seamlessly.
- Running reputation jobs asynchronously so uploads stay fast while QA + attestation work happens in the background.
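The first challenge above boils down to a small state machine: each quote must advance strictly quoted → paid → delivered, and the paid→delivered transition must be atomic so a duplicate payment callback can never trigger a second delivery. A minimal sketch (states and names are ours, not Quarry's actual implementation):

```python
import threading
from enum import Enum, auto


class State(Enum):
    QUOTED = auto()
    PAID = auto()
    DELIVERED = auto()


class PaymentLoop:
    """Race-free quote -> pay -> deliver loop guarded by one lock."""

    def __init__(self) -> None:
        self._lock = threading.Lock()
        self._states: dict[str, State] = {}

    def quote(self, query_id: str) -> None:
        with self._lock:
            self._states.setdefault(query_id, State.QUOTED)

    def mark_paid(self, query_id: str) -> bool:
        """Return True only for the first valid payment notification."""
        with self._lock:
            if self._states.get(query_id) is State.QUOTED:
                self._states[query_id] = State.PAID
                return True
            return False  # unknown quote, duplicate, or already delivered

    def deliver(self, query_id: str) -> bool:
        """Stream the slice at most once per paid quote."""
        with self._lock:
            if self._states.get(query_id) is State.PAID:
                self._states[query_id] = State.DELIVERED
                return True
            return False


loop = PaymentLoop()
loop.quote("q1")
first = loop.mark_paid("q1")
dup = loop.mark_paid("q1")  # duplicate webhook: ignored
sent = loop.deliver("q1")
```

Making every transition idempotent is what lets the OpenAI tool call, the x402 quote, and the Solana transfer arrive in any order without double-charging or double-delivering.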
Accomplishments that we're proud of
- Delivering schema-only previews so providers never leak raw rows yet buyers still understand the value.
- Wiring the x402 Solana payment loop end-to-end: AI agent pauses, quotes a per-row cost, collects SOL, and only then streams the purchased slice.
- Shipping verifiable trust: automated QA reports are pinned to IPFS, Solana attestations prove quality and freshness, and reviews require on-chain usage receipts.
- Building a wallet-native UX where Solflare signers can pay, attach datasets, and export CSV slices without leaving the app.
What we learned
- UX guardrails (price previews, wallet prompts, CSV export) make crypto payments feel approachable.
- DuckDB + Parquet is a sweet spot for fast, appendable analytics without managing heavy infra.
- On-chain attestations stop being “extra work” once you treat them as first-class product features.
- Coordinating AI streaming responses with payment gating requires obsessive state management on both frontend and backend.
What's next for Quarry
- Browser-based Parquet readers so small previews can render client-side with zero backend round trips.
- Tiered publisher verification (KYC, jurisdiction badges) tied to SAS so enterprise buyers know who they’re dealing with.
- Multi-slice baskets where one payment unlocks several coordinated queries across different datasets.
- Auto-refreshing reputation scores that rerun QA when publishers push new Parquet versions.
Built With
- ipfs
- python
- solana
- typescript
- x402
