Inspiration
The inspiration for this project stems from the nostalgic concept of a "Mood Ring," but reimagined for the digital age. I initially envisioned "The Internet Mood Ring", a tool to scrape the web and visualize the collective emotional state of the internet. However, I wanted to create something more profound and original than a simple data visualizer.
I was struck by a quote from content creator Anthony Po: "The Internet can feel so negative sometimes... But sometimes people kind of just want something stupid, fun and wholesome." This resonated with me. I wanted to build a "Sanity Orb", a digital consciousness that you can interact with, serving as a metaphorical "therapist" for the internet. It acknowledges the chaos of our connected world while providing a mesmerizing, interactive space to visualize and influence that energy, turning raw digital noise into something beautiful and harmonious.
What it does
The Internet Sanity Orb is an immersive, interactive WebGL experience that visualizes "digital consciousness" as a dynamic, living 3D entity.
- Interactive 3D Visualization: At its core is a procedurally animated orb surrounded by a massive, volumetric Dyson sphere and rotating containment rings. The entire scene (colors, particle speeds, lighting, and energy-harvesting streams) reacts in real time to a "Sanity Level" slider.
- Generative Audio Soundscapes: As you adjust the sanity level, the audio environment shifts dynamically. It transitions from dark, detuned bass drones in "Critical" states to harmonic, ethereal ambient pads in "Optimal" states, all synthesized in real time using the Web Audio API.
- AI-Powered Predictive Analytics: The system doesn't just look pretty; it learns. It uses XGBoost machine learning models to analyze interaction patterns, predicting future sanity trends and classifying user sessions into states like "Stable," "Unstable," or "Critical."
- Humorous Feedback: To keep things "wholesome," the system provides Minecraft-style pop-up messages and visual feedback (like screen shakes) when the digital consciousness becomes unstable.
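To give a sense of how a single slider can drive the whole scene, here is a minimal Python sketch of mapping a sanity level to a named state and an interpolated orb color. The thresholds and palette values are illustrative assumptions, not the project's actual constants:

```python
def sanity_state(level: float) -> str:
    """Map a 0-100 sanity level to a named state (illustrative thresholds)."""
    if level < 25:
        return "Critical"
    if level < 50:
        return "Unstable"
    if level < 75:
        return "Stable"
    return "Optimal"

def lerp_color(a, b, t):
    """Linearly interpolate between two RGB colors, t in [0, 1]."""
    return tuple(round(a[i] + (b[i] - a[i]) * t) for i in range(3))

CRITICAL_RED = (220, 50, 47)   # hypothetical palette endpoints
OPTIMAL_CYAN = (42, 161, 152)

def orb_color(level: float):
    """Blend the orb's color from 'critical' to 'optimal' as sanity rises."""
    return lerp_color(CRITICAL_RED, OPTIMAL_CYAN, level / 100.0)

print(sanity_state(10), orb_color(0))    # Critical (220, 50, 47)
print(sanity_state(90), orb_color(100))  # Optimal (42, 161, 152)
```

In the real app the same interpolated parameters would also feed particle speed, bloom intensity, and the audio engine, so everything stays in sync with one value.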
How I built it
The Frontend (The Visuals):
- Built with React 19 and TypeScript for a robust UI.
- Three.js & WebGL: I wrote custom GLSL shaders for the orb and the Dyson sphere to achieve volumetric lighting and energy effects that standard materials couldn't offer. The "energy harvesting" beams use a custom particle system that flows mathematically from the orb's surface to the containment shell.
- Tone.js: I engineered an AudioManager class that uses polyphonic synthesizers and LFOs to procedurally generate music. It's not playing MP3s; it's composing music on the fly based on the state.
- State Management: I used Zustand to synchronize the 3D scene, audio engine, and UI components instantly without prop-drilling hell.
The Backend (The Logic):
- A Node.js and Express server handles user sessions and global statistics.
- I used PostgreSQL with Sequelize for structured data storage and Redis for high-performance caching and rate limiting.
- Security was a priority, so I implemented enterprise-grade rate limiting and request fingerprinting to prevent abuse.
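The rate-limiting idea can be sketched as a token bucket. This is a minimal in-memory Python version for illustration; the real service keeps this state in Redis, keyed per client fingerprint, and the capacity/refill numbers here are assumptions:

```python
import time

class TokenBucket:
    """In-memory token-bucket rate limiter (a production version would
    store the bucket state in Redis, keyed by a request fingerprint)."""

    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity
        self.refill = refill_per_sec
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Refill tokens based on elapsed time, then try to spend one."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(capacity=5, refill_per_sec=1.0)
results = [bucket.allow() for _ in range(7)]  # first 5 pass, the rest are throttled
```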
The Brain (The AI):
- A separate Python/Flask microservice hosts the Machine Learning models.
- I used XGBoost to train three distinct models: a Regressor for trend prediction, a Classifier for state categorization, and a Confidence model.
- The models were trained on synthetic data that mimics realistic user interaction patterns (session duration, stress levels, interaction frequency).
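A rough sketch of how such a synthetic dataset can be generated: draw the three features and attach a heuristic label. The feature ranges, thresholds, and labels below are illustrative assumptions, not the actual generator or the trained models' logic:

```python
import random

random.seed(0)  # reproducible synthetic data

def synth_session():
    """One synthetic interaction record with a heuristic state label
    (hypothetical feature ranges and thresholds, for illustration)."""
    duration = random.uniform(10, 600)        # session length in seconds
    stress = random.uniform(0.0, 1.0)         # derived stress level
    interactions = random.uniform(0.1, 5.0)   # slider moves per second
    if stress > 0.75 and interactions > 3.0:
        label = "Critical"
    elif stress > 0.5:
        label = "Unstable"
    else:
        label = "Stable"
    return {"duration": duration, "stress": stress,
            "interactions": interactions, "label": label}

dataset = [synth_session() for _ in range(1000)]
```

A table like this (features plus label) is exactly the shape XGBoost expects: the Regressor fits a continuous target, and the Classifier fits the label column.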
Challenges I ran into
- The "Volumetric" Look: Making the Dyson sphere look like a dense, energy-harvesting structure rather than a hollow plastic ball was tough. I had to learn advanced GLSL shader programming, specifically multi-layered noise functions and transparency blending, to create the "Interstellar"-style volumetric effect without killing the frame rate.
- Performance vs. Beauty: Rendering thousands of particles, multiple transparent layers, and post-processing (bloom/glow) is heavy. I had to optimize the Three.js render loop extensively, using object pooling for particles and efficient geometry reuse to maintain a smooth 60 FPS.
- Bridging the Gap: Connecting the React frontend, Node backend, and Python ML service required careful architectural planning. Ensuring the frontend could talk to the ML service (via the backend proxy) with low latency was a challenge, especially when handling real-time predictions.
- Procedural Audio Tuning: Getting the Tone.js synthesizers to sound "musical" rather than like random noise required a lot of fine-tuning of ADSR envelopes and filter frequencies to match the visual mood.
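For readers unfamiliar with ADSR tuning: the envelope shapes a note's loudness over time through four phases (attack, decay, sustain, release). Here is a plain Python sketch of that amplitude curve; the parameter values are illustrative, not the project's actual patch:

```python
def adsr(t, attack=0.05, decay=0.2, sustain=0.6, release=0.5, note_len=1.0):
    """Amplitude of an ADSR envelope at time t (seconds).
    Assumes note_len >= attack + decay; values are illustrative."""
    if t < 0:
        return 0.0
    if t < attack:                       # linear ramp up to full amplitude
        return t / attack
    if t < attack + decay:               # fall from 1.0 down to the sustain level
        return 1.0 - (1.0 - sustain) * (t - attack) / decay
    if t < note_len:                     # hold at sustain while the note is on
        return sustain
    if t < note_len + release:           # fade out after note-off
        return sustain * (1.0 - (t - note_len) / release)
    return 0.0
```

Shortening the attack makes a pad feel percussive; lengthening the release lets chords blur together, which is roughly the knob-turning that distinguishes "random noise" from an ambient soundscape.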
Accomplishments that I am proud of
- The Visuals: I am incredibly proud of the custom shaders. Seeing the Dyson sphere's energy streams pulsate and change color in perfect sync with the music is mesmerizing.
- Full-Stack Integration: Successfully building a "tri-force" architecture (React/Node/Python) where every piece serves a distinct purpose and works harmoniously.
- The "Vibe": I feel I successfully captured the abstract concept of "internet sanity" and translated it into a tangible, audiovisual experience that is genuinely fun to play with.
- AI Integration: Actually implementing functional ML models that return meaningful probabilities and trends, rather than just using a random number generator.
What I learned
- GLSL is Magic: I learned that if you want visuals that stand out, you have to leave standard libraries behind and write your own shaders.
- The Power of Web Audio: I learned that browsers are capable of studio-quality sound synthesis if you know how to wield the Web Audio API.
- Microservices Architecture: I gained a deeper appreciation for separating concerns, keeping the heavy ML number-crunching in Python while letting Node handle the high-concurrency API traffic.
- User Experience: I learned that "polish" (smooth transitions, help overlays, loading animations) turns a tech demo into a product.
What's next for The Internet Sanity Orb
- Real Data Integration: Connecting the orb to live APIs (Twitter/X, Reddit) to drive the "Global Sanity" level with actual real-time sentiment analysis.
- Multiplayer Consistency: Implementing WebSockets so multiple users can "stabilize" the same orb together in real-time.
- VR Experience: Porting the Three.js scene to WebXR so users can stand inside the Dyson sphere and watch the energy harvesting all around them.
Built With
- docker
- express.js
- flask
- helmet.js
- jwt
- node.js
- numpy
- pandas
- postgresql
- python
- railway
- react
- recharts
- redis
- scikit-learn
- sequelize
- tailwind
- three.js
- tone.js
- typescript
- vercel
- vite
- webgl
- xgboost
- zustand