AI-Powered Dynamic Ticket Pricing
Inspiration
As a ticket reseller, I constantly struggled with the age-old question: "Should I hold, drop my price, or sell now?" The secondary ticket market is incredibly volatile: prices swing wildly based on time to event, seat tier, and market saturation. Missing the optimal selling window could mean hundreds of dollars lost per ticket.
I wanted to build a tool that combines real-time market intelligence with AI-powered decision-making to take the guesswork out of ticket arbitrage.
What it does
AI-Powered Dynamic Ticket Pricing is a real-time recommendation engine that:
- Scrapes live market floor prices from StubHub using automated data collection
- Streams data through Confluent Kafka for real-time processing
- Analyzes pricing strategy using Google's Gemini AI (gemini-2.5-flash)
- Recommends actions (HOLD/ADJUST/LIQUIDATE) based on:
- Seat tier elasticity (VIP vs. Upper Bowl)
- Market phase (Hype/Dip/Burn periods)
- Days until event
- Current vs. market floor pricing
- Visualizes trends with delta tracking in a Streamlit dashboard
The AI acts as your personal arbitrage advisor, telling you exactly when to hold for max profit or when to liquidate before prices crash.
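The HOLD/ADJUST/LIQUIDATE decision described above can be sketched as a simple heuristic. This is an illustrative approximation only: the threshold values, the `TIER_ELASTICITY` table, and the `recommend` function name are assumptions for this sketch, not the logic encoded in the actual Gemini prompt.

```python
# Hypothetical decision heuristic illustrating the HOLD/ADJUST/LIQUIDATE
# recommendation logic; thresholds and tier multipliers are illustrative
# assumptions, not the values used by the real Gemini-driven analyzer.
TIER_ELASTICITY = {"VIP": 0.5, "Lower Bowl": 1.0, "Upper Bowl": 1.5}

def recommend(my_price: float, floor_price: float, tier: str, days_to_event: int) -> str:
    """Return HOLD, ADJUST, or LIQUIDATE from price delta and time pressure."""
    elasticity = TIER_ELASTICITY.get(tier, 1.0)
    # Relative premium over the current market floor.
    premium = (my_price - floor_price) / floor_price
    if days_to_event <= 2:
        # "Burn" phase: prices typically crash right before the event.
        return "LIQUIDATE" if premium > 0 else "HOLD"
    if premium > 0.15 * elasticity:
        # Overpriced for this tier's demand sensitivity.
        return "ADJUST"
    return "HOLD"

print(recommend(my_price=250, floor_price=200, tier="Upper Bowl", days_to_event=10))  # ADJUST
```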
How we built it
Tech Stack:
- Frontend: Streamlit for the interactive dashboard
- Backend: Python with three core modules
  - scraper.py - Web scraper with anti-ban logic and tier categorization
  - analyzer.py - Gemini AI integration with custom system instructions
  - app.py - Real-time dashboard with Kafka consumer
- Data Pipeline: Confluent Kafka for real-time message streaming
- AI Model: Google Vertex AI (Gemini 2.5 Flash) with thinking tokens for strategic reasoning
- Infrastructure: Google Cloud Platform for AI services
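To make the pipeline concrete, here is a sketch of what one scraped listing might look like as it travels from the scraper, through a Kafka topic, to the dashboard consumer. The field names (`event_id`, `tier`, `floor_price`, `scraped_at`) are illustrative assumptions, not the project's actual message schema.

```python
import json
from datetime import datetime, timezone

# Hypothetical shape of one scraped listing as it flows through the
# Scraper -> Kafka -> Streamlit pipeline; field names are assumptions
# for illustration, not the project's real schema.
listing = {
    "event_id": "concert-2025-001",
    "tier": "Upper Bowl",
    "floor_price": 182.50,  # lowest ask observed on StubHub
    "scraped_at": datetime.now(timezone.utc).isoformat(),
}

# The scraper publishes this as a JSON-encoded Kafka message value;
# the dashboard's consumer decodes it back into a dict.
encoded = json.dumps(listing).encode("utf-8")
decoded = json.loads(encoded.decode("utf-8"))
assert decoded["floor_price"] == 182.50
```

Keeping the wire format as plain JSON keeps the scraper and dashboard fully decoupled: either side can be restarted or replaced independently.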
Key Engineering Decisions:
- Stateless scraper with randomized delays (10-60 min) to avoid rate limiting
- Kafka streaming to decouple data collection from analysis
- Session state management in Streamlit for delta tracking
- Max output tokens set to 2048 to handle Gemini's thinking tokens (~1000 tokens) plus response
- Safety settings disabled to prevent AI response truncation
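The first decision above, randomized 10-60 minute delays between scrapes, can be sketched as follows. The helper names (`next_delay_seconds`, `scrape_loop`) are hypothetical, for illustration only.

```python
import random
import time

# Sketch of the stateless scraper's anti-ban pacing: a uniformly random
# delay between 10 and 60 minutes between scrapes, so requests never
# arrive on a predictable schedule. Helper names are hypothetical.
def next_delay_seconds(min_minutes: int = 10, max_minutes: int = 60) -> float:
    return random.uniform(min_minutes * 60, max_minutes * 60)

def scrape_loop(scrape_once, max_iterations: int = 1) -> None:
    """Run scrape_once, then sleep a randomized interval, repeatedly."""
    for _ in range(max_iterations):
        scrape_once()
        time.sleep(next_delay_seconds())
```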
Challenges we ran into
Gemini Response Truncation: Initially, AI responses were getting cut off mid-sentence. We discovered that Gemini 2.5's "thinking tokens" (internal reasoning) consumed most of the token budget. Solution: increased max_output_tokens from 200 to 2048.
JSON Parsing Errors: Gemini wrapped responses in markdown code blocks inconsistently. We built a robust parser to handle ```json fences, plain JSON, and other markdown-wrapped formats.
Python Version Compatibility: We hit deprecation warnings with Python 3.9, so we upgraded to Python 3.13 to future-proof the project.
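A parser of the kind described above can be sketched in a few lines: try to strip a markdown fence (with or without a `json` language tag) before handing the remainder to `json.loads`. The function name `parse_model_json` is a hypothetical stand-in for the project's actual helper.

```python
import json
import re

# Sketch of a robust parser for model output that may arrive as plain
# JSON, a ```json fenced block, or a bare ``` fenced block.
# parse_model_json is a hypothetical name for illustration.
def parse_model_json(text: str) -> dict:
    text = text.strip()
    # Strip a surrounding markdown code fence if one is present.
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", text, re.DOTALL)
    if match:
        text = match.group(1)
    return json.loads(text)
```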
Kafka Consumer Timeout: The app exceeded Kafka's 5-minute poll interval when idle. This was acceptable for our use case as reconnection is automatic.
StubHub Cookie Expiration: Authentication cookies expire after ~2 hours, requiring manual refresh. This limits deployment options for a production system.
Accomplishments that we're proud of
- Built a functional AI advisor that actually provides actionable, data-driven recommendations
- Real-time data pipeline - Scraper → Kafka → Streamlit → Gemini all working seamlessly
- Smart tier-based logic - VIP tickets get different strategies than Upper Bowl seats
- Clean UI/UX - Color-coded recommendation cards (green/yellow/red) with clear delta tracking
- Production-ready infrastructure - Using enterprise tools (Confluent Cloud, Vertex AI, GCP)
- Ethical implementation - Added legal disclaimers and educational-only framing
What we learned
Technical:
- How to work with Gemini's thinking tokens and generation configs
- Kafka streaming architecture for real-time data pipelines
- Streamlit's session state management for stateful apps
- Web scraping best practices (rate limiting, anti-ban patterns)
- API authentication patterns (GCP ADC, Confluent SASL)
Business:
- Market dynamics of ticket resale (elasticity, FOMO pricing, liquidation timing)
- How AI can democratize knowledge that was previously expert-only
- The challenges of building tools in legally gray areas
What's next for AI-Powered Dynamic Ticket Pricing
Short-term improvements:
- Historical tracking - Store price history in a database to show price trends over time
- Multi-event support - Track multiple events simultaneously
- Alerts/notifications - Push notifications when action is recommended
- Backtesting - Test strategies against historical data to validate AI recommendations
Long-term vision:
- Official API integration - Partner with ticket platforms for legitimate data access
- Mobile app - iOS/Android app for on-the-go price monitoring
- Predictive analytics - Train ML models on historical data to predict optimal sell windows
- Portfolio optimization - Manage multiple ticket holdings with diversification strategies
- Market maker - Automated buying/selling based on AI recommendations (with proper legal framework)
Ethical path forward:
- Transition from web scraping to official API partnerships
- Focus on empowering individual resellers vs. large-scale bots
- Add educational content about market dynamics and pricing strategies
Built With
- confluent
- gemini
- google-cloud
- python
- streamlit
- vertexai