Inspiration

We wanted to see how AI can help everyday investors. Inspired by the complexities & seeming randomness of the stock market, we set out to generalize the decision-making process.

What it does

The screener trains on historical stock data and leverages a Random Forest model to predict daily buy/hold/sell decisions based on four key features: daily % change, volume change, and the 5-day & 20-day moving averages. Users can simulate trading over any period and track would-be profit based on the model’s decisions. The user can also submit any ticker & receive a buy/sell signal.
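
For reference, the four features can be computed from a daily price/volume table in just a few lines. This is a minimal sketch, not the project's actual code — the column names ("Close", "Volume") are assumptions about the dataset's layout:

```python
# Sketch: derive the four model features from daily close prices and volume.
# Column names "Close" and "Volume" are assumed, not taken from the project.
import pandas as pd

def build_features(df: pd.DataFrame) -> pd.DataFrame:
    out = pd.DataFrame(index=df.index)
    out["pct_change"] = df["Close"].pct_change() * 100      # daily % change
    out["volume_change"] = df["Volume"].pct_change() * 100  # volume change
    out["ma5"] = df["Close"].rolling(5).mean()              # 5-day moving average
    out["ma20"] = df["Close"].rolling(20).mean()            # 20-day moving average
    # the first 19 rows lack enough history for the 20-day average
    return out.dropna()
```

A feature table like this would then be fed to the Random Forest, with the next day's movement direction as the training label.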

How we built it

  • We started with detailed planning and created class diagrams to outline the structure.
  • We trained a Random Forest model on a dataset (from Kaggle) using four features: daily percent change, volume change, 5-day moving average, and 20-day moving average. The model predicts the direction of stock movement, which we classify as positive/negative.
  • Frontend: We built a web interface that integrates with FastAPI on the backend. Two primary functionalities were implemented: a simple classification (buy or don't buy), and a simulation mode that follows the AI’s daily decisions from a user-defined start to end date, tracking total profit.
  • Backend: Python served as the backend, with FastAPI handling API endpoints for model predictions and trade simulations.

Challenges we ran into

  • Learning React in the midst of development slowed us down
  • We wasted the first hour attempting to build our own dataset before pivoting to Kaggle
  • At one point, a massive model file on GitHub began causing issues with commits & pulls, and we had to do a forced cleanup of the repo history

Accomplishments that we're proud of

  • We gained hands-on experience with frontend development and learned robust Git practices (the hard way :p)
  • We developed a functional AI tool that, while not beating the S&P 500 anytime soon, consistently averages a profit over 5 years of simulated trading
  • We combined data preprocessing, ML, & web development into one cohesive project in under 24 hours

What we learned

  • The importance of version control and managing repository bloat (a commit shouldn't take 4 minutes)
  • The value of planning & class diagrams

What's next for Skibidi Sci-kit Stock Screener

  • We're planning to integrate NLTK to perform sentiment analysis on earnings reports, the news, and social media to add a qualitative layer to the predictions.
  • We played around with an "evil" portfolio, which would scrape articles & tweets to evaluate how ethical companies are, and trade based on how evil they were.
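
For the curious, the simulation mode described under "How we built it" boils down to a loop like this. It's a minimal sketch with the model stubbed out as a precomputed list of signals — the real version would query the Random Forest each day:

```python
# Sketch of the trade simulator: follow daily buy/sell/hold signals over a
# price series and report would-be profit. "signals" stands in for the
# Random Forest's daily predictions; this is an illustration, not the
# project's actual code.
def simulate(prices, signals, starting_cash=1000.0):
    """prices: daily closing prices; signals: 'buy', 'sell', or 'hold' per day."""
    cash, shares = starting_cash, 0.0
    for price, signal in zip(prices, signals):
        if signal == "buy" and cash > 0:
            shares = cash / price    # go all-in on a buy signal
            cash = 0.0
        elif signal == "sell" and shares > 0:
            cash = shares * price    # liquidate on a sell signal
            shares = 0.0
    # mark any open position to the final price
    return cash + shares * prices[-1] - starting_cash
```

The all-in/all-out policy is the simplest way to turn per-day classifications into a running profit figure; position sizing would be a natural refinement.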
