Inspiration
As a student juggling multiple jobs and trying to save up for my future, I realized something — I was working hard, but I didn’t really understand where my money was going or how it was growing.
I’d open my Cash App or Robinhood and see a few stocks like NVIDIA or Tesla and think, “Okay… but what does this really mean? Am I doing well? Should I buy more? Should I sell?” Finance felt complicated. Full of numbers, graphs, and terms I didn’t fully grasp. And like many young people, I wanted to invest but didn’t know how to actually see what was happening with my money.
That’s when the idea for Celestial Stock was born.
We wanted to make finance something people could visualize and interact with, not just read about. What if your portfolio wasn’t just a list of numbers, but a living galaxy — where every company you invest in becomes a star?
Through this idea, we wanted to reimagine how people, especially students and new investors, learn about the financial world. Our goal became clear: to make financial literacy intuitive, immersive, and interactive, turning something intimidating into something beautiful and engaging.
What it does
Celestial Stock is a galaxy themed stock management platform that helps users analyze and understand their investments in a completely new way.
Each stock is represented as a star whose size changes dynamically with real-time market value, and whose color shifts based on whether the stock is rising or falling relative to its previous price. The sun at the center symbolizes the user’s total portfolio, expanding or shrinking with the user’s overall gains or losses.
Clicking a star reveals a graph of the stock’s performance over the past five years, along with future projections based on historical trends. An integrated AI chatbot helps users make informed financial choices by explaining complex terms, market trends, and stock behaviors in simple language.
How We Built It
We built Celestial Stock by combining real-time market data, AI-powered insights, and a 3D interactive frontend to create an immersive financial galaxy.
Workflow Overview
1. Data Flow
We used yfinance, Google News RSS, and Reddit API to pull live stock prices, financial headlines, and social sentiment. All incoming data is handled asynchronously using asyncio and httpx for non-blocking parallel fetching.
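The non-blocking fetch pattern can be sketched as below. This is a minimal illustration, not the project's actual code: `fetch_quote` is a placeholder for a real `httpx.AsyncClient.get(...)` call against a price or RSS endpoint, with a short sleep standing in for network latency.

```python
import asyncio
import time

async def fetch_quote(symbol: str) -> dict:
    # Placeholder for a real httpx.AsyncClient.get(...) call;
    # the sleep simulates ~100 ms of network latency.
    await asyncio.sleep(0.1)
    return {"symbol": symbol, "price": 100.0}

async def fetch_all(symbols: list[str]) -> list[dict]:
    # gather() schedules every coroutine at once, so total wall time is
    # roughly one request's latency rather than the sum of all of them.
    return await asyncio.gather(*(fetch_quote(s) for s in symbols))

start = time.perf_counter()
quotes = asyncio.run(fetch_all(["NVDA", "TSLA", "AAPL"]))
elapsed = time.perf_counter() - start
```

With three symbols, the whole batch completes in roughly one request's latency instead of three, which is what keeps the galaxy updating without visible stalls.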
2. Analysis Layer
The Sentiment Worker runs FinBERT (via PyTorch and Hugging Face Transformers) to classify market tone as Buy, Hold, or Sell. Batch processing allows multiple texts to be analyzed in a single GPU pass for speed and efficiency.
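A sketch of the classification step, under stated assumptions: the commented-out lines show how a Hugging Face batched pipeline call would look (using the public `ProsusAI/finbert` checkpoint as a stand-in for the team's model), and the label-to-signal mapping below is our illustrative guess, since FinBERT natively emits positive/negative/neutral rather than Buy/Hold/Sell.

```python
# The real batched pipeline (not run here, to keep the sketch lightweight):
#   from transformers import pipeline
#   finbert = pipeline("text-classification", model="ProsusAI/finbert", batch_size=16)
#   preds = finbert(headlines)  # one batched forward pass per 16 texts

# Hypothetical mapping from FinBERT's sentiment labels to trade signals.
LABEL_TO_SIGNAL = {"positive": "Buy", "negative": "Sell", "neutral": "Hold"}

def batch_signals(predictions: list[dict]) -> list[str]:
    # predictions have the {"label": ..., "score": ...} shape returned by
    # a Hugging Face text-classification pipeline.
    return [LABEL_TO_SIGNAL[p["label"].lower()] for p in predictions]

signals = batch_signals([
    {"label": "positive", "score": 0.91},
    {"label": "neutral", "score": 0.62},
])
```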
3. Intelligence Layer
The SambaNova Cloud AI platform powers our conversational system using the Llama-4-Maverick-17B model. The Chat Worker manages natural language responses and multi-turn context while ensuring factual accuracy.
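One way a Chat Worker can manage multi-turn context is a rolling message window, so the payload forwarded to the LLM stays bounded. This is an illustrative sketch with assumed names and limits, not the project's actual worker:

```python
class ChatContext:
    """Rolling message window for a multi-turn chat (illustrative sketch)."""

    def __init__(self, system_prompt: str, max_turns: int = 10):
        self.system = {"role": "system", "content": system_prompt}
        self.messages = [self.system]
        self.max_turns = max_turns

    def add(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})
        # Keep the system prompt plus only the most recent
        # user/assistant turns.
        keep = 2 * self.max_turns
        if len(self.messages) - 1 > keep:
            self.messages = [self.system] + self.messages[-keep:]

ctx = ChatContext("You explain finance in plain language.", max_turns=2)
for i in range(5):
    ctx.add("user", f"question {i}")
    ctx.add("assistant", f"answer {i}")
```

After five exchanges, only the system prompt and the last two question/answer pairs remain, which keeps response latency stable as conversations grow.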
4. Visualization Layer
The React + Three.js frontend renders a 3D galaxy: Each stock is a star whose size reflects market value and color represents price change. The sun symbolizes the total portfolio, dynamically resizing based on overall gain or loss. Clicking a star opens Chart.js graphs showing historical performance and future projections.
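The size/color mapping described above might look like the following (a Python sketch of the logic; the actual rendering lives in the Three.js frontend, and the scaling constants here are invented for illustration):

```python
import math

def star_appearance(market_value: float, pct_change: float) -> tuple[float, str]:
    # Log-scale the radius so a multi-trillion-dollar stock doesn't
    # dwarf a small cap; +1 guards against log10 of values below 1.
    radius = 0.5 + 0.25 * math.log10(market_value + 1)
    # Color encodes the direction of the move since the last update.
    color = "green" if pct_change > 0 else ("red" if pct_change < 0 else "gray")
    return radius, color

big = star_appearance(3e12, +1.2)    # e.g. NVIDIA-scale market cap, up
small = star_appearance(1e10, -0.4)  # a small cap, down
```

A log scale is the key design choice here: with linear sizing, one mega-cap star would visually erase the rest of the galaxy.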
5. Alert and Voice Workflow
The Alert Router monitors price thresholds and sentiment changes, exporting alerts as JSON files. The system is Alexa-ready, enabling voice-driven updates such as: “Hey Alexa, what’s going on with NVIDIA today?”
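The threshold check and JSON export can be sketched as below. Function and field names are assumptions for illustration; the real Alert Router also watches sentiment changes, which this sketch omits:

```python
import json

def check_threshold(symbol: str, price: float, low: float, high: float):
    # Emit an alert record only when the price leaves the [low, high] band.
    if price >= high or price <= low:
        direction = "above" if price >= high else "below"
        return {"symbol": symbol, "price": price, "crossed": direction}
    return None

alert = check_threshold("NVDA", 192.5, low=150.0, high=190.0)
payload = json.dumps(alert)  # a file an Alexa skill could later read back
```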
Tech Stack Summary
Frontend: React, Three.js, Vite, Chart.js
Backend: Python (FastAPI), asyncio, httpx, yfinance, Reddit API, Google RSS
AI/ML: PyTorch, Transformers, FinBERT, SambaNova LLM
Architecture: Worker-based asynchronous design with Observer and Singleton patterns
Hosting: Vercel (frontend) and Python backend with dotenv and schedule automation
Challenges we ran into
Getting stock prices, sentiment data, and AI responses to flow together without lag was one of our biggest challenges. Each update had to feel alive — stars expanding, colors shifting, the galaxy pulsing — all of it powered by real-time data. Even small drops in frame rate made the whole experience feel off, so we spent hours optimizing our async pipelines, caching strategies, and render updates to make everything seamless. We tested multiple galaxy and star textures and models before landing on one that was both beautiful and functional. We wanted it to feel like space but act like a dashboard.
And then came the AI layer. Integrating our own fine-tuned sentiment model based on FinBERT alongside a conversational system powered by SambaNova’s Llama-4-Maverick-17B required deep coordination between the backend and frontend. Getting the model to understand context, analyze headlines, and respond quickly while data kept streaming in was very complicated, especially since none of us had worked with Transformers or Hugging Face before.
What We Learned
We learned how to bridge AI and APIs in real time: async data flow, model concurrency, and parallel processing became second nature.
Design systems for both humans and machines. The galaxy isn’t just for show — every orbit, color, and pulse carries information about the portfolio.
Collaborate across domains. One of us focused on financial data pipelines, another on AI orchestration, another on rendering, and we learned how to weave it all together into a single, cohesive experience.
Accomplishments that we're proud of
Built a fully interactive 3D financial model in under 48 hours. Designed a portfolio simulation system that teaches users about market behavior and risk.
Going into this, none of us had ever built a truly end-to-end system that combined real AI models, live data parsing, and 3D visualization. Everything we learned here, we learned by doing.
We had to figure out how to design systems for both humans and machines. The galaxy isn’t just for show; every orbit, color, and pulse carries real meaning about the portfolio’s health. That meant learning how to translate numbers and algorithms into motion, emotion, and visual storytelling.
We also learned how to collaborate across domains. One of us focused on building the financial data pipelines, another handled the AI orchestration layer, and another managed the rendering and design system. None of it worked on its own; it only came together when we learned how to blend all our parts into one seamless workflow. And trust me, the moment it all finally came together was the highlight of my year.
