Inspiration
ChronoGuide AI started from a simple pain point: watch collecting is exciting, but the research process is fragmented and intimidating. New collectors often bounce between forums, resale platforms, YouTube reviews, and spreadsheets just to answer basic questions like “What fits my style?” or “Is this a fair price?”
We wanted to build a single, intelligent companion that combines collector taste, market context, and portfolio thinking in one product.
What it does
ChronoGuide AI is an AI-powered watch collector platform that helps users discover, evaluate, and manage timepieces.
It currently supports:
- Personalized watch discovery based on budget, style, movement, and collector persona
- AI-generated recommendations with clear reasoning plus pros and cons
- Semantic watch search over a local watch dataset using vector similarity
- Alternatives engine that groups substitutes by price tiers
- Price evaluation that classifies deals (for example, fair or overpriced)
- Portfolio-style collection tracking with estimated value, purchase price, and return visibility
- Dashboard summaries like total valuation, brand distribution, and style concentration
- Authenticated user flows and structured logging for observability and debugging
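The price evaluation above can be sketched as a simple ratio-based classifier. This is an illustrative sketch only: the function name, thresholds, and labels are hypothetical, not the product's actual valuation bands.

```python
def classify_deal(asking_price: float, market_estimate: float) -> str:
    """Bucket a listing by its ratio to an estimated market value.

    Thresholds here are illustrative, not ChronoGuide AI's real bands.
    """
    ratio = asking_price / market_estimate
    if ratio < 0.9:
        return "good deal"
    if ratio <= 1.1:
        return "fair"
    return "overpriced"
```

In practice the market estimate would come from the curated dataset rather than a single number.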
How we built it
We built ChronoGuide AI as a full-stack application with a clear separation of concerns:
Frontend
- Next.js (App Router), React, TypeScript, Tailwind CSS
- Distinct pages for advisor, collection, dashboard, settings, login/register
- API-driven UI with loading/error states and portfolio metrics
Backend
- FastAPI with modular routes, services, agents, and middleware
- SQLAlchemy models for persistence and user-scoped collection data
- JWT-based auth and dependency-based request protection
- Structured request and AI interaction logging
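The project uses JWT-based auth via FastAPI dependencies. As a stdlib-only sketch of the underlying sign-and-verify idea (the secret, payload, and helper names are illustrative, and a real deployment would use a proper JWT library with expiry claims):

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # hypothetical; a real app loads this from config

def sign_token(payload: dict) -> str:
    """Encode a payload and append an HMAC-SHA256 signature."""
    body = base64.urlsafe_b64encode(json.dumps(payload).encode())
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest().encode()
    return (body + b"." + sig).decode()

def verify_token(token: str):
    """Return the payload if the signature checks out, else None."""
    body, sig = token.encode().rsplit(b".", 1)
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(sig, expected):
        return None
    return json.loads(base64.urlsafe_b64decode(body))
```

In the actual backend, a FastAPI dependency would run this kind of verification and reject requests whose token fails it, which is what keeps collection data user-scoped.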
AI layer
- LLM integration through provider abstraction
- Persona-conditioned recommendation agent using structured output
- Dataset retrieval + candidate filtering before LLM reasoning
- Embeddings/vector search pipeline for semantic retrieval
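A minimal sketch of the semantic retrieval step, assuming watch embeddings are precomputed (the function names and tiny catalog are illustrative; the real pipeline's embedding model and storage are not shown here):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, catalog, k=3):
    """Rank (watch_id, embedding) pairs by similarity to the query."""
    scored = [(wid, cosine(query_vec, vec)) for wid, vec in catalog]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:k]
```

Retrieval like this narrows the candidate set before the LLM reasons over it, which is what keeps recommendations grounded in available inventory.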
Quality and reliability
- Pytest coverage across auth, health, collection, advisor/search, pricing, settings, and vector logic
- Type checking and linting in both frontend/backend workflows
Challenges we ran into
Balancing deterministic retrieval with generative reasoning:
We needed recommendations to feel intelligent without hallucinating outside available inventory.
Prompt and output consistency:
Persona voice is useful, but we still had to enforce predictable structure for UI rendering.
Data modeling and valuation logic:
Mapping real-world watch attributes and price bands into clean, testable logic took iteration.
Auth + async backend behavior:
Ensuring protected endpoints behaved correctly under async tests and user isolation scenarios.
Frontend-backend contract stability:
As endpoints evolved, keeping payload formats and response expectations aligned required discipline.
Observability:
AI features are hard to debug without logs, so robust request/AI logging became essential early.
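A minimal stdlib sketch of structured logging in this spirit, emitting one JSON line per event (the logger name and context fields are illustrative; the project's actual logging setup may differ):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON line for easy parsing."""
    def format(self, record):
        payload = {"level": record.levelname, "event": record.getMessage()}
        payload.update(getattr(record, "ctx", {}))  # per-call context, if attached
        return json.dumps(payload)

logger = logging.getLogger("chronoguide")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Attach AI-call context via `extra`; "ctx" is a custom, illustrative key.
logger.info("ai_recommendation", extra={"ctx": {"persona": "vintage", "latency_ms": 412}})
```

Logging the persona, model inputs, and latency per AI call is what makes demos and failures reproducible after the fact.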
Accomplishments that we're proud of
- Built an end-to-end AI product, not just a prompt demo
- Delivered a working advisor experience with persona-aware recommendations
- Implemented practical collector tools: discovery, alternatives, valuation, and portfolio tracking
- Shipped structured logging that makes demos and debugging much more trustworthy
- Added meaningful backend tests across core capabilities
- Created a strong visual identity and product personality instead of a generic dashboard clone
What we learned
- The best AI UX comes from grounding first, generation second
- Structured outputs are critical when AI responses feed production UI
- Personas are most useful when they change recommendation strategy, not just tone
- Logging and testability are force multipliers for AI systems
- Product value increases when “insight” and “action” live in the same workflow
- In domain-focused apps, curated data quality often matters more than model size alone
What's next for ChronoGuide AI
- Real-time market data integration from resale and retail sources
- Historical price charts and predictive valuation signals
- Smarter portfolio analytics (risk, diversification score, concentration alerts)
- Watchlist alerts for price drops and availability events
- Multi-model AI routing with cost/performance optimization
- Community features: shared collections, curator lists, and social proof signals
- Stronger onboarding personalization and collector profile memory
- Production hardening: background jobs, caching, monitoring, and deployment scaling
Built With
- digitalocean
- fastapi
- javascript
- python
- sqlalchemy
- supabase
- typescript