AI-Powered Dynamic Ticket Pricing

Inspiration

As a ticket reseller, I constantly struggled with the age-old question: "Should I hold, drop my price, or sell now?" The secondary ticket market is incredibly volatile: prices swing wildly based on time to event, seat tier, and market saturation. Missing the optimal selling window could mean hundreds of dollars lost per ticket.

I wanted to build a tool that combines real-time market intelligence with AI-powered decision-making to take the guesswork out of ticket arbitrage.

What it does

AI-Powered Dynamic Ticket Pricing is a real-time recommendation engine that:

  • 🔍 Scrapes live market floor prices from StubHub using automated data collection
  • 📊 Streams data through Confluent Kafka for real-time processing
  • 🧠 Analyzes pricing strategy using Google's Gemini AI (gemini-2.5-flash)
  • 💡 Recommends actions (HOLD/ADJUST/LIQUIDATE) based on:
    • Seat tier elasticity (VIP vs. Upper Bowl)
    • Market phase (Hype/Dip/Burn periods)
    • Days until event
    • Current vs. market floor pricing
  • 📈 Visualizes trends with delta tracking in a Streamlit dashboard

The AI acts as your personal arbitrage advisor, telling you exactly when to hold for max profit or when to liquidate before prices crash.
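
To make that concrete, here is the kind of structured recommendation the advisor emits. The field names below are illustrative only, not the project's actual schema:

```python
# Hypothetical recommendation payload; field names are illustrative only.
recommendation = {
    "action": "ADJUST",          # one of HOLD / ADJUST / LIQUIDATE
    "suggested_price": 142.00,   # proposed new listing price (USD)
    "current_price": 165.00,     # your listing today
    "market_floor": 138.50,      # cheapest comparable listing scraped
    "tier": "Upper Bowl",        # seat tier drives elasticity assumptions
    "market_phase": "Dip",       # Hype / Dip / Burn
    "days_to_event": 12,
    "rationale": "Floor sits 16% below your listing during a Dip phase; "
                 "trim the price now rather than ride it into the Burn window.",
}
```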

How we built it

Tech Stack:

  • Frontend: Streamlit for the interactive dashboard
  • Backend: Python with three core modules
    • scraper.py - Web scraper with anti-ban logic and tier categorization
    • analyzer.py - Gemini AI integration with custom system instructions
    • app.py - Real-time dashboard with Kafka consumer
  • Data Pipeline: Confluent Kafka for real-time message streaming
  • AI Model: Google Vertex AI (Gemini 2.5 Flash) with thinking tokens for strategic reasoning
  • Infrastructure: Google Cloud Platform for AI services
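
As a rough sketch of how scraper.py feeds the pipeline: the topic name, credentials, and the get_floor_prices() helper below are placeholders, not the actual code.

```python
import json
import random
import time

from confluent_kafka import Producer

# SASL/SSL settings for Confluent Cloud; credentials are placeholders.
producer = Producer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

def get_floor_prices() -> list[dict]:
    """Stand-in for the StubHub scraping logic in scraper.py."""
    return [{"tier": "Upper Bowl", "floor_price": 138.50}]

while True:
    for listing in get_floor_prices():
        # Key by tier so each tier's updates stay ordered within a partition.
        producer.produce("ticket-prices", key=listing["tier"],
                         value=json.dumps(listing))
    producer.flush()
    # Randomized 10-60 minute delay to dodge rate limiting (decision #1 below).
    time.sleep(random.uniform(600, 3600))
```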

Key Engineering Decisions:

  1. Stateless scraper with randomized delays (10-60 min) to avoid rate limiting
  2. Kafka streaming to decouple data collection from analysis
  3. Session state management in Streamlit for delta tracking (see the sketch after this list)
  4. Max output tokens set to 2048 to handle Gemini's thinking tokens (~1000 tokens) plus response
  5. Safety settings disabled to prevent AI response truncation
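
A minimal sketch of the delta tracking from decision #3, assuming a last_prices dict kept in session state (the names are illustrative):

```python
import streamlit as st

# Remember the previous floor price per tier across Streamlit reruns.
if "last_prices" not in st.session_state:
    st.session_state.last_prices = {}

def show_price(tier: str, floor_price: float) -> None:
    previous = st.session_state.last_prices.get(tier)
    delta = None if previous is None else round(floor_price - previous, 2)
    # st.metric renders the delta as a green/red arrow automatically.
    st.metric(label=f"{tier} floor", value=f"${floor_price:.2f}", delta=delta)
    st.session_state.last_prices[tier] = floor_price
```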

Challenges we ran into

  1. Gemini Response Truncation: Initially, AI responses were getting cut off mid-sentence. Discovered that Gemini 2.5's "thinking tokens" (internal reasoning) consumed most of the token budget. Solution: Increased max_output_tokens from 200 → 2048.
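
With the google-genai SDK, the fix looks roughly like this; treat it as a sketch, not the project's exact client code:

```python
from google import genai
from google.genai import types

client = genai.Client(vertexai=True, project="<GCP_PROJECT>", location="us-central1")

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Current listing $165, market floor $138.50, 12 days out. Hold or sell?",
    config=types.GenerateContentConfig(
        # Thinking tokens (~1000) plus the visible answer must both fit,
        # hence 2048 rather than the original 200.
        max_output_tokens=2048,
        # One relaxed safety category shown here (decision #5 above);
        # the real config would repeat this for each category.
        safety_settings=[types.SafetySetting(
            category=types.HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT,
            threshold=types.HarmBlockThreshold.BLOCK_NONE,
        )],
    ),
)
print(response.text)
```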

  2. JSON Parsing Errors: Gemini wrapped responses in markdown code blocks inconsistently. Built a robust parser to handle ```json fences, plain JSON, and other markdown-wrapped formats.
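
A sketch of that kind of parser; the regexes here are one plausible approach, not the project's exact code:

```python
import json
import re

def parse_gemini_json(raw: str) -> dict:
    """Extract JSON from a Gemini reply that may or may not be fenced."""
    text = raw.strip()
    # Strip a ```json ... ``` (or bare ``` ... ```) wrapper if present.
    fenced = re.match(r"^```(?:json)?\s*(.*?)\s*```$", text, re.DOTALL)
    if fenced:
        text = fenced.group(1)
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        # Fall back to the first {...} block anywhere in the reply.
        match = re.search(r"\{.*\}", text, re.DOTALL)
        if match:
            return json.loads(match.group(0))
        raise
```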

  3. Python Version Compatibility: Hit deprecation warnings with Python 3.9. Upgraded to Python 3.13 to future-proof the project.

  4. Kafka Consumer Timeout: The app exceeded Kafka's 5-minute poll interval when idle. This was acceptable for our use case as reconnection is automatic.
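
For context, the relevant knob on the consumer side; the topic and group names are placeholders, and the same SASL settings as the producer would apply:

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",
    "group.id": "pricing-dashboard",
    "auto.offset.reset": "latest",
    # librdkafka's default is 300000 ms (5 min); an idle dashboard can blow
    # past it, get evicted from the group, and rejoin on the next poll.
    "max.poll.interval.ms": 300000,
})
consumer.subscribe(["ticket-prices"])
msg = consumer.poll(timeout=1.0)
```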

  5. StubHub Cookie Expiration: Authentication cookies expire after ~2 hours, requiring manual refresh. This limits deployment options for a production system.

Accomplishments that we're proud of

✅ Built a functional AI advisor that actually provides actionable, data-driven recommendations

✅ Real-time data pipeline - Scraper → Kafka → Streamlit → Gemini all working seamlessly

✅ Smart tier-based logic - VIP tickets get different strategies than Upper Bowl seats

✅ Clean UI/UX - Color-coded recommendation cards (green/yellow/red) with clear delta tracking

✅ Production-ready infrastructure - Using enterprise tools (Confluent Cloud, Vertex AI, GCP)

✅ Ethical implementation - Added legal disclaimers and educational-only framing

What we learned

Technical:

  • How to work with Gemini's thinking tokens and generation configs
  • Kafka streaming architecture for real-time data pipelines
  • Streamlit's session state management for stateful apps
  • Web scraping best practices (rate limiting, anti-ban patterns)
  • API authentication patterns (GCP ADC, Confluent SASL)

Business:

  • Market dynamics of ticket resale (elasticity, FOMO pricing, liquidation timing)
  • How AI can democratize knowledge that was previously expert-only
  • The challenges of building tools in legally gray areas

What's next for AI-Powered Dynamic Ticket Pricing

Short-term improvements:

  1. Historical tracking - Store price history in a database to show price trends over time
  2. Multi-event support - Track multiple events simultaneously
  3. Alerts/notifications - Push notifications when action is recommended
  4. Backtesting - Test strategies against historical data to validate AI recommendations

Long-term vision:

  1. Official API integration - Partner with ticket platforms for legitimate data access
  2. Mobile app - iOS/Android app for on-the-go price monitoring
  3. Predictive analytics - Train ML models on historical data to predict optimal sell windows
  4. Portfolio optimization - Manage multiple ticket holdings with diversification strategies
  5. Market maker - Automated buying/selling based on AI recommendations (with proper legal framework)

Ethical path forward:

  • Transition from web scraping to official API partnerships
  • Focus on empowering individual resellers vs. large-scale bots
  • Add educational content about market dynamics and pricing strategies

Built With

  • python
  • streamlit
  • confluent-kafka
  • gemini
  • google-cloud-platform