💱 Inspiration
The explosive growth of stablecoins, particularly PayPal's PYUSD, highlighted a need for accessible, real-time, and intelligent analytics. We were inspired to create a comprehensive dashboard that not only tracks transactions but also leverages cutting-edge AI like Google's Gemini to provide deeper insights and contextual understanding through news sentiment. The integration of MongoDB further enhances performance and data persistence. The "AI in Action" hackathon, with its focus on Google Cloud, MongoDB, and GitLab, provided the perfect catalyst to build a scalable, cloud-native "CyberMatrix" vision for PYUSD analytics.
🛰️ What it does
PYUSD CyberMatrix Analytics Dashboard v2.1 is an advanced Streamlit application deployed on Google Cloud Run, offering a rich suite of tools for PayPal USD on Ethereum. It provides:
- Efficient Event Feeds: Tracks PYUSD Transfers (with filtering/tagging), Mint/Burn events, and ERC20 Approvals, leveraging MongoDB for caching and GCP Blockchain RPC for fresh data.
- In-depth Analysis: Calculates PYUSD volume, visualizes top senders/receivers with address tagging, and offers an interactive network graph of PYUSD flows, powered by data from MongoDB and RPC.
- Historical Exploration: Allows analysis of selected events (Transfer, Mint, etc.) over user-defined block ranges, pulling from MongoDB cache or RPC.
- Blockchain & Address Utilities: Features an address balance checker (RPC), a MongoDB-backed watchlist for persistent monitoring, contract state display (RPC), transaction lookup (MongoDB/RPC), and a block trace explorer (RPC).
- AI-Powered Insights (Google Gemini): Includes an AI assistant for Q&A, news sentiment analysis for PYUSD-related articles, and AI-generated summaries for volume analysis.
- Contextual Information & Search: Integrates a NewsAPI feed for relevant articles, stored and searchable within MongoDB.
- Innovative Simulation: A conceptual demonstration of PYUSD payments via a simulated bio-implant.
- User-Friendly Features: Data export to CSV and a custom "CyberMatrix" themed UI.
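The top senders/receivers view described above can be sketched with a small Pandas aggregation. This is a minimal illustration, not the app's actual code: the `from`/`value` field names and the `ADDRESS_TAGS` map are assumptions.

```python
import pandas as pd

# Hypothetical address-tag map; in the real app, tags would come from MongoDB.
ADDRESS_TAGS = {"0x6c3ea9036406852006290770bedfcaba0e23a0e8": "PYUSD Contract"}

def top_senders(transfers, n=5):
    """Rank senders by total PYUSD sent and attach a human-readable tag."""
    df = pd.DataFrame(transfers)
    ranked = (
        df.groupby("from", as_index=False)["value"].sum()
          .sort_values("value", ascending=False)
          .head(n)
    )
    ranked["tag"] = ranked["from"].map(ADDRESS_TAGS).fillna("untagged")
    return ranked
```

The same grouped totals also feed the charts and the network graph, so keeping the ranking as a plain DataFrame makes it easy to hand off to Plotly or Pyvis.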
🌌 How It's Built & Deployed on Google Cloud
PYUSD CyberMatrix Analytics v2.1 is built with Python, leveraging a modern, cloud-native stack:
- Application Framework: Streamlit for the interactive web UI.
- Data Persistence & Caching: MongoDB Atlas (via PyMongo) serves as the primary data layer, caching blockchain events, storing news articles, and managing the user address watchlist.
- Google Cloud Platform (GCP):
- Cloud Run: The serverless platform for deploying and scaling our containerized Streamlit application.
- Cloud Build & Artifact Registry: Provides the CI/CD pipeline to automatically build the Docker container from source and store the image for deployment.
- Secret Manager: Securely manages all API keys, the MongoDB URI, and the RPC endpoint, injecting them into the Cloud Run environment at runtime.
- Blockchain Node Engine: Provides a high-performance RPC endpoint for real-time Ethereum blockchain data.
- Gemini API: Powers all AI-driven features within the application.
- Blockchain Interaction: Web3.py for interfacing with the Ethereum blockchain via the GCP RPC endpoint.
- External Data: NewsAPI to fetch fresh news articles before they are processed and stored in MongoDB.
- Data Handling & Visualization: Pandas for data manipulation, Plotly for charts, and Pyvis for network graphs.
- Version Control: GitLab for source code management and as the foundation for our CI/CD workflow.
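To make the blockchain-interaction layer concrete, here is a hedged sketch of how raw PYUSD `Transfer` logs returned by an `eth_getLogs` call (the JSON-RPC string encoding, as fetched via Web3.py over the GCP endpoint) can be decoded. The contract address and event topic are the well-known mainnet values; the filter's block range is illustrative, and PYUSD's 6-decimal scaling is assumed.

```python
# keccak256("Transfer(address,address,uint256)") -- the standard ERC-20 topic.
TRANSFER_TOPIC = "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"

# Filter dict as it would be passed to eth_getLogs / w3.eth.get_logs.
PYUSD_FILTER = {
    "address": "0x6c3ea9036406852006290770BEdFcAbA0e23A0e8",  # PYUSD on mainnet
    "topics": [TRANSFER_TOPIC],
    "fromBlock": "latest",  # illustrative; the app uses explicit block ranges
}

def decode_transfer(log: dict) -> dict:
    """Turn one raw Transfer log into a readable record.

    topics[1]/topics[2] are the indexed from/to addresses, left-padded to
    32 bytes; data holds the uint256 amount; PYUSD uses 6 decimals.
    """
    return {
        "from": "0x" + log["topics"][1][-40:],
        "to": "0x" + log["topics"][2][-40:],
        "value": int(log["data"], 16) / 10**6,
        "tx_hash": log["transactionHash"],
    }
```

Decoded records in this shape are what would be cached in MongoDB and rendered in the event feeds.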
📡 Challenges we ran into
- Cloud IAM Permissions: Configuring the correct IAM roles (e.g., Secret Manager Secret Accessor, Logs Writer) for the Cloud Run service account was critical and a common source of initial deployment failures.
- Managing Cloud Run Resources: Balancing the application's memory and CPU needs, especially for resource-intensive tasks like historical analysis, to prevent timeouts or crashes in the serverless environment.
- MongoDB Integration & Schema Design: Designing an efficient schema for blockchain events and news articles in MongoDB and ensuring seamless data flow between RPC fetches and database storage/retrieval.
- RPC Fallback & Caching Logic: Implementing a robust strategy for fetching data from MongoDB first, and then gracefully falling back to GCP RPC for new data or cache misses.
- Optimizing MongoDB Queries: Ensuring performant queries for event feeds, historical data, and news search, including proper index creation (e.g., compound indexes, text indexes).
- (Previous Challenges Still Relevant): ABI accuracy, Streamlit state management for complex data, and effective Gemini prompt engineering.
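The index-creation work mentioned above can be expressed as declarative specs applied at startup. This is a sketch under assumed names: the collections, field names, and exact index shapes here are hypothetical, not the project's actual schema.

```python
# Index specs as plain (keys, options) pairs so they can be applied on boot.
EVENT_INDEXES = [
    ([("block_number", -1)], {}),                            # latest-first feeds
    ([("event_type", 1), ("block_number", -1)], {}),         # per-event history ranges
    ([("tx_hash", 1), ("log_index", 1)], {"unique": True}),  # dedupe on re-fetch
]
NEWS_INDEXES = [
    ([("title", "text"), ("description", "text")], {}),      # full-text news search
    ([("published_at", -1)], {}),                            # newest-first feed
]

def ensure_indexes(collection, specs):
    """collection is a PyMongo Collection; create_index(keys, **options) is
    idempotent, so this is safe to run on every startup."""
    return [collection.create_index(keys, **options) for keys, options in specs]
```

Keeping the specs as data makes it easy to review which queries each index serves and to test the wiring without a live database.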
🪐 Accomplishments that we're proud of
- Successfully Deploying a Cloud-Native Architecture: Containerizing the application with Docker and deploying it on Google Cloud Run with a secure, scalable, and automated pipeline using Secret Manager, Cloud Build, and Artifact Registry.
- Robust Data Backend with MongoDB: Successfully integrating MongoDB as a caching and persistence layer, significantly improving data retrieval performance.
- Efficient News Caching & Search: Storing news articles in MongoDB and implementing a text search feature for quick access to relevant information.
- Persistent User Watchlist: Moving the address watchlist to MongoDB, allowing user data to persist across sessions.
- Hybrid Data Fetching Model: Creating a system that intelligently fetches data from MongoDB cache first, falling back to GCP RPC as needed, optimizing for both speed and data freshness.
- Meaningful AI Integration with Gemini: Utilizing Gemini for practical uses like news sentiment and data summarization.
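The hybrid cache-then-RPC policy can be boiled down to one small function. In this sketch the MongoDB and RPC calls are injected as callables (PyMongo and Web3.py wrappers in the real app, whose names here are assumptions), which keeps the fallback logic itself easy to test.

```python
def fetch_with_fallback(block_range, cache_get, rpc_get, cache_put):
    """Serve block_range from the MongoDB cache when possible, else hit RPC.

    cache_get returns cached events or None on a miss; rpc_get fetches fresh
    events from the GCP endpoint; cache_put writes them back for next time.
    """
    cached = cache_get(block_range)
    if cached is not None:
        return cached, "cache"
    fresh = rpc_get(block_range)
    cache_put(block_range, fresh)
    return fresh, "rpc"
```

Returning the source alongside the data also makes it trivial to surface a "served from cache" indicator in the UI.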
🤖 What we learned
- Full-Stack Cloud Deployment: Gained end-to-end experience in deploying a web application on Google Cloud, from source code to a public-facing URL.
- Cloud-Native Security Practices: Mastered the use of Google Secret Manager for handling sensitive credentials, moving away from insecure local files and adopting best practices for a production environment.
- IAM & Service Accounts: Deepened our understanding of how IAM roles and service accounts control access between different cloud services, a fundamental concept in GCP.
- Containerization with Docker: Learned how to write a Dockerfile to create a portable, reproducible environment for our Python application.
- Automated Builds with Cloud Build: Understood how to use Cloud Build to automate the process of turning source code into a deployable container image.
- MongoDB for Application Data: Gained practical experience in using MongoDB for application data storage, caching, and querying, including schema design and indexing for performance.
🚀👨🏻🚀 What's next for Cryptocurrency & PYUSD CyberMatrix Analytics with Gemini 2.5
We envision several exciting future enhancements leveraging our cloud architecture:
- MongoDB Vector Search Integration: Generate embeddings for transactions or news and store them in MongoDB. Implement semantic search using MongoDB Atlas Vector Search to find similar entities or patterns.
- Advanced MongoDB Aggregation Pipelines: Utilize more complex MongoDB aggregation queries for richer, server-side data analytics before data is pulled into the application.
- Real-time Data Sync with MongoDB Change Streams: Explore using MongoDB Change Streams to update the dashboard dynamically as new data is inserted/updated in the database.
- Deeper GitLab CI/CD Integration for Google Cloud Run: Fully automate the deployment pipeline, so a git push to the main branch automatically triggers a new build and deployment on Cloud Run.
- Google BigQuery Integration: Complement MongoDB by offloading very large-scale historical or batch analytics to BigQuery, using MongoDB as the operational/caching layer.
- User Accounts & Personalized MongoDB Storage: Introduce user authentication and store personalized watchlists, saved queries, and preferences in user-specific documents/collections within MongoDB.
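As a taste of the server-side analytics direction above, a daily PYUSD volume rollup could be written as a single aggregation pipeline. This is a sketch: the collection and field names are hypothetical, and the `$dateTrunc` operator requires MongoDB 5.0+.

```python
# Daily transfer volume computed inside MongoDB rather than in the app.
DAILY_VOLUME_PIPELINE = [
    {"$match": {"event_type": "Transfer"}},
    {"$group": {
        "_id": {"$dateTrunc": {"date": "$timestamp", "unit": "day"}},  # day bucket
        "volume": {"$sum": "$value"},    # total PYUSD moved that day
        "transfers": {"$sum": 1},        # number of transfers that day
    }},
    {"$sort": {"_id": 1}},               # chronological order for charting
]
# Usage sketch: results = db.events.aggregate(DAILY_VOLUME_PIPELINE)
```

Pushing the grouping into the database would shrink the payload sent to Streamlit from raw events to one document per day.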
Built With
- cryptocurrency
- ethereum
- gemini-2-5-pro
- gemini-ai
- gitlab
- google-cloud
- mongodb
- python
- streamlit


