🚀 What It Does
This project uses Fetch.ai agents and REST API endpoints to create a smart, interactive AI system. It includes:
- Real-time PDF ingestion and parsing.
- Storing and embedding document content for similarity-based search.
- Retrieving relevant document chunks based on user queries.
- Generating insightful answers and interactive charts using AI agents.
Although inspired by systems like Pathway, we built our own real-time vector storage and retrieval pipeline without using Pathway directly.
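As a rough illustration of the similarity-based retrieval step above (function and variable names here are ours, not the project's), a from-scratch search can be as simple as cosine similarity over stored embedding vectors:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k_chunks(query_vec, store, k=3):
    """Return the k stored (chunk_text, score) pairs most similar to the query.

    `store` is a list of (chunk_text, embedding) pairs, e.g. loaded from a
    document database.
    """
    scored = [(text, cosine_similarity(query_vec, vec)) for text, vec in store]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]
```

In the real pipeline the vectors would come from an embedding model; the sketch works with any numeric vectors of equal length.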
🛠️ How We Built It
We designed a modular system using:
- Fetch.ai uAgents for handling all core functionality:
  - PDF processing (upload → base64 → parsed JSON → TXT narration).
  - Embedding generation and similarity search.
  - Query interpretation and dynamic response generation.
- REST APIs to interact with agents from the frontend.
- MongoDB Atlas to store uploaded PDFs, parsed data, and processed text.
- A custom-built embedding and retrieval pipeline that mimics real-time document search.
- A Streamlit frontend for file upload, querying, and chart visualization.
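The upload → base64 → parsed JSON → TXT narration flow from the first bullet can be sketched roughly as below. The helper names and JSON shape are illustrative assumptions, and a real agent would parse the PDF with a proper library rather than the stub shown:

```python
import base64
import json

def decode_upload(b64_payload: str) -> bytes:
    """Recover raw PDF bytes from the base64 string sent by the frontend."""
    return base64.b64decode(b64_payload)

def parse_pdf(pdf_bytes: bytes) -> dict:
    """Stub parser: a real agent would extract pages with a PDF library."""
    return {"pages": [{"number": 1, "text": pdf_bytes.decode("latin-1", "ignore")}]}

def to_narration(parsed: dict) -> str:
    """Flatten the parsed JSON into a plain-text narration for storage."""
    return "\n".join(page["text"] for page in parsed["pages"])

def process_upload(b64_payload: str) -> str:
    """Run the full upload → base64 → parsed JSON → TXT narration chain."""
    parsed = parse_pdf(decode_upload(b64_payload))
    parsed_json = json.dumps(parsed)  # the intermediate form persisted in MongoDB
    return to_narration(json.loads(parsed_json))
```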
⚔️ Challenges We Faced
- Understanding the Fetch.ai documentation for agents and uAgent-to-uAgent communication.
- Designing a seamless pipeline without relying on external tools like Pathway.
- Handling real-time data flow and efficient vector search.
- Ensuring consistent communication across REST APIs, agents, and the frontend.
🏆 Accomplishments We're Proud Of
- Built a working RAG (Retrieval-Augmented Generation) pipeline end-to-end.
- Designed our own vector storage and similarity search system from scratch.
- Achieved smooth integration between multiple components — backend agents, MongoDB, and Streamlit frontend.
- Delivered a modular, extensible system ready for real-time document analysis and financial insights.
This project demonstrates how you can build a real-time, agent-driven AI system using Fetch.ai and RESTful design — without relying on external vector databases or frameworks.
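Tying the pieces together, the query path boils down to a request/response handler like the hypothetical one below. In the actual system this logic sits inside a Fetch.ai uAgent behind a REST endpoint, and `retrieve` / `generate_answer` are stand-ins for the real vector search and LLM calls:

```python
def retrieve(query: str, store: list) -> list:
    """Placeholder retrieval: return stored chunks containing a query word."""
    words = query.lower().split()
    return [chunk for chunk in store if any(w in chunk.lower() for w in words)]

def generate_answer(query: str, chunks: list) -> str:
    """Placeholder generation: a real agent would prompt an LLM with the chunks."""
    context = " ".join(chunks) if chunks else "no matching documents"
    return f"Q: {query} | context: {context}"

def handle_query(request: dict, store: list) -> dict:
    """REST-style handler: JSON-like dict in, JSON-like dict out."""
    chunks = retrieve(request["query"], store)
    return {"answer": generate_answer(request["query"], chunks), "sources": chunks}
```

Keeping the handler a pure function of its inputs is what makes the system modular: the same logic can sit behind a uAgent, a plain REST route, or a test harness.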
Built With
- fetch.ai
- streamlit