Inspiration

I was inspired by the potential of leveraging data to provide actionable insights into competitive VALORANT play. Understanding player statistics, team dynamics, and tournament results is key for coaches, analysts, and team managers in making strategic decisions. Initially, I aimed to build a system that could process the rich datasets provided by Riot Games, offering in-depth performance analytics. When challenges arose with data processing, I pivoted to an alternative approach, using web scraping and Amazon Bedrock to create a more grounded Knowledge Base. This new direction allowed me to continue delivering valuable insights while overcoming the limitations of the initial data sources.

What it does

Valorant Scout is an intelligent agent that helps users access detailed data about VALORANT players, teams, and tournaments through a natural language interface. By integrating a Knowledge Base built with Amazon Bedrock and data scraped from Valorant Esports Coverage, it lets users ask questions such as:

  • "Which player has the best K/D ratio in the latest VCT tournament?"
  • "How did Team X perform in their recent matches?"
  • "Compare Player A and Player B's performance metrics."
  • "Who is the best player in North America right now?"
  • "Which team has the best defense rate?"

Valorant Scout provides detailed responses using the scraped statistics, offering insights like Average Combat Score (ACS), win rates, clutch success, and more. The platform is especially useful for team managers looking for scouting reports, performance comparisons, and strategic evaluations based on the latest data.

How we built it

Initial Approach - Data Processing and Extraction:

  • My original plan involved using data provided by Riot Games, focusing on game files with detailed player and team statistics. I stored these large JSON files in S3 and aimed to process them using AWS Glue and Lambda functions.

  • I faced challenges with the large file sizes (over 120MB each) and the complexities of streaming and processing the data. Memory limitations and timeout issues in AWS Lambda, combined with the project's time constraints, made it difficult to extract the data in a timely manner.
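The memory problem comes from loading an entire 120MB+ JSON file at once inside Lambda. A minimal sketch of the incremental approach, assuming the events can be consumed as newline-delimited JSON (the event schema and field names here are illustrative, not Riot's actual format):

```python
import json
from typing import Iterable


def aggregate_kills(event_lines: Iterable[str]) -> dict:
    """Stream JSON events one line at a time and tally kills per player.

    Because only one event is in memory at any moment, peak memory stays
    constant no matter how large the source file is.
    (The "type"/"killerId" schema is an illustrative assumption.)
    """
    kills: dict = {}
    for line in event_lines:
        line = line.strip()
        if not line:
            continue
        event = json.loads(line)
        if event.get("type") == "kill":
            killer = event["killerId"]
            kills[killer] = kills.get(killer, 0) + 1
    return kills


# With S3 the same function can consume the object body lazily, e.g.:
#   body = s3.get_object(Bucket="my-bucket", Key="game.jsonl")["Body"]
#   stats = aggregate_kills(line.decode("utf-8") for line in body.iter_lines())
```

Since `boto3`'s `StreamingBody.iter_lines()` yields the object incrementally, this pattern sidesteps the Lambda memory ceiling that the in-memory approach hit.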

Pivot to Knowledge Base Approach:

  • To overcome these challenges, I shifted focus to building a Knowledge Base using Amazon Bedrock, populated with data scraped from Valorant Esports Coverage—a leading source for VALORANT player and team statistics.

  • Using web scraping techniques, I gathered key metrics like ACS, K/D ratios, win/loss records, and tournament details from Valorant Esports Coverage. This data was structured into a format that the Bedrock Agent could interpret effectively.
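Bedrock Knowledge Bases ingest plain-text documents from an S3 data source, so each scraped player record needs to become one self-describing document the retriever can match on. A sketch of that structuring step (the field names are examples of the scraped metrics, not a fixed schema):

```python
def stats_to_kb_document(player: dict) -> str:
    """Render one player's scraped stats as a plain-text document.

    Each player becomes a single document; spelling out the metric names
    ("Average Combat Score (ACS)", "K/D ratio") helps the retriever match
    natural-language questions against the right passage.
    """
    lines = [
        f"Player: {player['name']} ({player['team']})",
        f"Region: {player['region']}",
        f"Average Combat Score (ACS): {player['acs']}",
        f"K/D ratio: {player['kd']}",
        f"Win rate: {player['win_rate']}%",
    ]
    return "\n".join(lines)
```

Each rendered document is then uploaded to the Knowledge Base's S3 data source and a sync is triggered so Bedrock re-indexes the new content.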

Web Application & Deployment:

  • I developed a user-friendly interface using Python/Flask, allowing users to interact with the Bedrock-powered agent through a simple web interface.

  • The application was deployed using Amazon Elastic Container Service (ECS), ensuring a scalable and resilient environment for user queries and responses.
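Behind the web form, a Flask route hands the user's question to the Bedrock agent. A minimal sketch of that call, assuming the standard `bedrock-agent-runtime` `invoke_agent` API (which returns the answer as an event stream of text chunks):

```python
def ask_agent(agent_id: str, alias_id: str, session_id: str, question: str) -> str:
    """Send a question to a Bedrock agent and collect its streamed reply."""
    import boto3  # AWS SDK; requires credentials with Bedrock agent access

    client = boto3.client("bedrock-agent-runtime")
    response = client.invoke_agent(
        agentId=agent_id,
        agentAliasId=alias_id,
        sessionId=session_id,  # reusing the id keeps conversation context
        inputText=question,
    )
    return collect_completion(response["completion"])


def collect_completion(events) -> str:
    """Concatenate the text bytes from a stream of agent events.

    Text arrives in 'chunk' events; other event types (traces etc.)
    are skipped.
    """
    parts = []
    for event in events:
        chunk = event.get("chunk")
        if chunk and "bytes" in chunk:
            parts.append(chunk["bytes"].decode("utf-8"))
    return "".join(parts)
```

A Flask view then reduces to something like `return ask_agent(AGENT_ID, ALIAS_ID, session["id"], request.form["question"])`, with the IDs taken from the deployed agent's configuration.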

Knowledge Base Integration:

  • I designed the Bedrock Agent to interpret user queries and reference the Knowledge Base for relevant stats, allowing it to answer complex questions with precision.

  • This integration enabled the agent to handle various scouting and analysis-related queries, providing insights that would be critical for team managers and analysts.
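Grounding the agent's answers means packing the most relevant Knowledge Base passages into its context. A small helper sketching that step, assuming items shaped like Bedrock's Retrieve response (`{"content": {"text": ...}, "score": ...}`); the ranking-and-truncation logic itself is illustrative:

```python
def build_context(retrieval_results: list, max_passages: int = 3) -> str:
    """Format the top-scoring knowledge-base passages as numbered context.

    Sorting by relevance score and capping the passage count keeps the
    context focused on the stats most likely to answer the question.
    """
    ranked = sorted(retrieval_results, key=lambda r: r.get("score", 0.0), reverse=True)
    snippets = [
        f"[{i}] {item['content']['text']}"
        for i, item in enumerate(ranked[:max_passages], start=1)
    ]
    return "\n".join(snippets)
```

With a managed Bedrock agent this selection happens automatically, but the same shape applies when calling the `retrieve` API directly.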

Challenges we ran into

  • Data Processing and Memory Constraints: My initial plan involved processing large JSON game data files from Riot Games using AWS Lambda and Glue. However, the file sizes (ranging from 120MB to 200MB) caused memory limitations and frequent timeouts. Managing the streaming and parsing of such large datasets in a serverless environment proved to be more complex than anticipated.

  • Switching to the Knowledge Base Approach: Pivoting to the Knowledge Base approach using Amazon Bedrock required a quick shift in strategy. While it allowed me to work around the data size limitations, it introduced new challenges in scraping data from Valorant Esports Coverage and ensuring the Knowledge Base was structured correctly for the Bedrock Agent to interpret.

  • Data Quality and Consistency: Ensuring consistency and accuracy in the scraped data was critical, as the insights provided by the Bedrock Agent depended heavily on this information. I faced challenges in handling missing data, varying formats, and ensuring that statistics were up-to-date, especially for ongoing tournaments.

  • Deployment and Scalability: Setting up the Python/Flask application and deploying it on Amazon Elastic Container Service (ECS) required careful consideration of scalability and resource management. Managing dependencies, configuring the containerized environment, and ensuring a smooth user experience took more time and effort than expected.
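For reference, the containerized setup boils down to a small image definition that ECS can run; a sketch along these lines (file names, port, and the `app:app` entry point are assumptions about the project layout):

```dockerfile
# Illustrative container setup for the Flask app
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Gunicorn serves the Flask app object defined in app.py;
# the ECS task definition maps this container port.
EXPOSE 8000
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "app:app"]
```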

  • Balancing Time Constraints: With time running short, I had to make tough decisions, like abandoning my initial approach and focusing on what I could realistically deliver within the hackathon timeframe. Prioritizing which features and functionalities would provide the most value under time pressure was a constant challenge.

  • Solo Development: Being the sole developer on this project meant that I had to manage every aspect of the solution—from data processing and scraping to building the web interface and deploying it on AWS ECS. While this allowed for a cohesive vision and consistent execution, it also meant there was no room for error, and I had to switch quickly between the roles of developer, data engineer, and DevOps engineer. The lack of a team also limited my ability to brainstorm solutions or divide tasks during critical stages of the project.

Accomplishments that we're proud of

  • Adapting to challenges and finding an alternative solution that still meets the project's goals, despite the setbacks with the initial data processing approach.

  • Successfully creating a Knowledge Base with Amazon Bedrock that could respond to nuanced queries about player and team performance using data scraped from Valorant Esports Coverage.

  • Building and deploying a Python/Flask application on AWS ECS, enabling a smooth and user-friendly experience for interacting with the Bedrock Agent.

  • Demonstrating the power of combining web scraping with a knowledge-based AI system to deliver valuable insights, even when direct data access is limited.

  • Building an End-to-End Solution Alone: Despite being a solo developer, I was able to pivot strategies when needed, design the architecture, and bring the project to completion. This required learning new tools on the fly, managing complex AWS services, and problem-solving in real time—all under the tight deadlines of a hackathon. Turning what could have been a disadvantage into a demonstration of versatility and determination is something I am particularly proud of.

What we learned

  • Adapting to Data Challenges: I learned that sometimes the best solution is a pivot. The challenges of processing large game data files taught me to think creatively and explore alternative approaches like scraping data from Valorant Esports Coverage.

  • Leveraging Amazon Bedrock: Integrating Amazon Bedrock with a Knowledge Base provided valuable insights into the capabilities of generative AI for responding to specific, data-driven queries.

  • Building for Scalability: The process of deploying my application on AWS ECS taught me how to design for scalability and reliability, ensuring that the application can handle user queries efficiently.

  • Data Structuring for AI Models: Understanding how to structure and present data so that the AI model can easily interpret and reference it was key to the success of the Bedrock-based solution.

What's next for Valorant Scout

  • Deepening the Knowledge Base: Expanding the Knowledge Base to include more historical data and advanced metrics, allowing the agent to provide even richer insights.

  • Real-Time Data Integration: Adding real-time data scraping from Valorant Esports Coverage to ensure that the agent always has the latest information about ongoing tournaments and recent matches.

  • Advanced Comparison Features: Implementing more complex comparisons between players and teams, allowing users to analyze head-to-head matchups in greater detail.

  • Enhanced UI: Improving the web interface to include interactive charts and graphs, helping users visualize player and team performance metrics.

  • Expanding Beyond VALORANT: After refining my solution for VALORANT, I plan to extend Valorant Scout to support other competitive esports titles like CS and Overwatch, using a similar data-driven approach.

THANK YOU TO DEVPOST, AWS, AND RIOT FOR THIS WONDERFUL OPPORTUNITY! I HAVE LEARNT SO MUCH DURING THIS HACKATHON!
