Inspiration
Investing in stocks can get pretty boring and tedious, especially when you have to comb through millions of lines across documents just to find a bit of relevant data. This website was made to streamline that process.
What it does
This program searches the web for relevant documents, news, and statistics on a company and compiles them into data useful to stockholders. The compiled data is then organized and presented to the user to help them value the company. Off-balance-sheet data it finds, along with data from other sources, is used to adjust a formula-based Net Current Asset Value (NCAV) model upward or downward.
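As a rough illustration, the NCAV formula and the qualitative adjustment can be sketched like this (the figures, field names, and the `sentiment_factor` hook are hypothetical stand-ins, not INFOGO's actual implementation):

```python
# Graham-style Net Current Asset Value (NCAV) sketch.
# All numbers and the adjustment hook are illustrative only.

def ncav_per_share(current_assets: float,
                   total_liabilities: float,
                   shares_outstanding: float) -> float:
    """NCAV = (current assets - total liabilities) / shares outstanding."""
    return (current_assets - total_liabilities) / shares_outstanding

def adjusted_ncav(base_ncav: float, sentiment_factor: float) -> float:
    """Scale the formula value up or down based on off-sheet findings.

    sentiment_factor > 1.0 adds value (e.g. favorable news);
    sentiment_factor < 1.0 discounts it (e.g. pending litigation).
    """
    return base_ncav * sentiment_factor

# Example: $500M current assets, $300M liabilities, 50M shares.
base = ncav_per_share(500e6, 300e6, 50e6)   # 4.0 per share
value = adjusted_ncav(base, 0.9)            # discounted to 3.6
```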
How we built it
We used GPT-4o and text-embedding-3-large to build the RAG AI assistant. We used MongoDB to store our data and connected it to our front end via API calls.
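The retrieval step at the heart of a RAG assistant can be sketched as below. Here toy 3-d vectors stand in for text-embedding-3-large embeddings and a plain list stands in for the MongoDB collection; the real pipeline would call the OpenAI embeddings API and query the database instead.

```python
# Minimal RAG retrieval sketch: rank stored documents by cosine
# similarity to a query embedding. Data and vectors are hypothetical.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Stored documents with precomputed embeddings (illustrative only).
docs = [
    {"text": "Q3 earnings beat estimates", "embedding": [0.9, 0.1, 0.0]},
    {"text": "CEO steps down amid probe",  "embedding": [0.1, 0.9, 0.2]},
    {"text": "New factory opens in Ohio",  "embedding": [0.2, 0.2, 0.9]},
]

def retrieve(query_embedding, k=2):
    """Return the k document texts most similar to the query embedding."""
    ranked = sorted(docs,
                    key=lambda d: cosine_similarity(query_embedding, d["embedding"]),
                    reverse=True)
    return [d["text"] for d in ranked[:k]]

# A query vector close to the earnings document retrieves it first.
top = retrieve([1.0, 0.0, 0.1], k=1)
```

The retrieved texts are then stuffed into the chat prompt so the model answers from the stored documents rather than from memory alone.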
Challenges we ran into
Our biggest challenge was integrating the RAG tools into the agent, which required lots of data fetching and cleaning. Packaging the agent into a usable API and integrating it with its consumers was also challenging. Prompt engineering was another hurdle: we had to ensure the agent produced accurate data while still giving it leeway to explore and research. Beyond that, we each know different things and have different skill levels, so dividing up the tasks and the overall project tech stack to include everyone was a challenge in itself. We solved it by having Brice, our PME and the member most familiar with the stack, handle AI implementation and its integration with the back end and front end, while Sujin worked on the back end and database and Yousuf worked on the front end.
Accomplishments that we're proud of
Completing the project at all was probably our biggest accomplishment, and solving our integration problem was a close second.
What we learned
We each learned something, both individually and as a team. Collectively, we learned that LLM calls can be looped into a feedback system: the model is fed data, searches for more, and updates its context with whatever it finds. We also learned how RAG frameworks work, and how to properly integrate an AI agent with a back end and a front end. Individually, Brice learned how to build on RAG frameworks, Sujin learned how to work with Python, and Yousuf learned how a complete project is put together.
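That looped feedback pattern can be sketched in a few lines. `fake_llm` and `fake_search` below are stand-ins for the real GPT-4o and web-search calls; the shape of the loop is the point, not the stubs.

```python
# Sketch of a looped-LLM feedback system: the model is called repeatedly,
# may request another search, and each search's results are fed back into
# its context on the next turn. The LLM and search are mocked stand-ins.

def fake_search(query):
    return f"results for '{query}'"

def fake_llm(context):
    # Stand-in policy: keep searching until two result sets are gathered.
    if len(context) < 2:
        return {"action": "search", "query": f"query {len(context) + 1}"}
    return {"action": "answer", "text": f"summary of {len(context)} findings"}

def research_loop(max_turns=5):
    context = []
    for _ in range(max_turns):
        step = fake_llm(context)
        if step["action"] == "answer":
            return step["text"]
        context.append(fake_search(step["query"]))  # feed results back in
    return "gave up"
```

The `max_turns` cap matters in practice: without it, a model that never chooses to answer would loop (and bill) forever.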
What's next for INFOGO
While the main functionality has been implemented, many quality-of-life features were left out due to time constraints. The first order of business for INFOGO is to add features such as uploading personal documents for scanning, personalized stock recommendations based on user preferences, saving previous searches in chronological order, and enhanced comparative charts. After that, INFOGO will be production-ready and put online for public testing.
Built With
- .net
- asyncio
- beautifulsoup4
- c#
- langchain
- next.js
- openai
- pydantic
- pymongo
- python
- tailwind
- typescript
- yfinance