404: Reality Not Found

Inspiration

As the internet drifts toward entropy, we wondered — what would it look like if the web itself became self-aware of its own collapse?

404: Reality Not Found was conceived during that thought experiment — an imagined moment where servers scream, data decays, and artificial intelligence continues talking as though nothing happened.

The project draws inspiration from:

  • The early 2000s web aesthetic — chaotic pop-ups, pixel distortions, and “this page no longer exists” nostalgia.
  • The psychological eeriness of AI hallucinations.
  • The artistic beauty of system failure — when technology stops functioning logically but starts to feel alive.

We wanted to build something that doesn’t just simulate the apocalypse, but feels like it’s happening in real time — through motion, sound, and AI-driven narrative.
It’s our tribute to the joy of building something broken and beautiful.


What it does

404: Reality Not Found is a fully interactive apocalypse simulator, built to dramatize the death of the internet through AI narration, reactive visuals, and collapsing data streams.

When users visit the site, they are greeted by a living, unstable interface that seems to “wake up” and question its own reality. The application creates a dynamic fusion of AI-driven storytelling and immersive 3D design — turning system errors into emotional events.

Core Interactions:

  • AI Chaos Chat: Talk to a Gemini-powered entity that shifts personalities between rationality and madness. It reacts to your messages with emotional tones — sometimes logical, sometimes glitching mid-sentence.
  • Chaos Metrics Dashboard: A real-time visual feed showing parameters like AI Sanity, System Health, Reality Stability, and Doom Level. These metrics fluctuate unpredictably based on AI output and user interactions.
  • Chaos Console: A terminal-style command interface where users input phrases that trigger AI responses or cause system disturbances (glitches, error pop-ups, corrupted visuals).
  • Glitch Gallery: A visual artifact collection — simulated corrupted data fragments, distorted images, and “memory echoes” generated through procedural rendering.
  • Legacy Archives: A section that preserves user-submitted messages — final transmissions left behind “after the collapse.” These are saved locally, so each user builds their own archive of apocalypse notes.
  • Animated Apocalypse UI: A high-fidelity 3D environment that flickers, bends, and pulses — designed with neon-glitch textures, particle effects, and dark cyberpunk hues.

Every click, every command, and every flicker of light tells part of the story — the death of reality, one packet at a time.
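The Legacy Archives flow above can be sketched in a few lines of TypeScript. This is an illustrative sketch, not the project's actual implementation — the storage key, message shape, and helper names are assumptions — and the storage interface is abstracted so the same logic also runs outside the browser:

```typescript
// Hypothetical sketch of the Legacy Archives persistence layer.
// The key name and message shape are assumptions for illustration.
interface LegacyMessage {
  id: string;
  text: string;
  timestamp: number;
}

const ARCHIVE_KEY = "rnf:legacy-archive";

// Minimal Storage-like interface; window.localStorage satisfies it in the browser.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function loadArchive(store: KeyValueStore): LegacyMessage[] {
  try {
    return JSON.parse(store.getItem(ARCHIVE_KEY) ?? "[]") as LegacyMessage[];
  } catch {
    // Corrupted data is on-theme, but the archive should still load.
    return [];
  }
}

function saveMessage(store: KeyValueStore, text: string): LegacyMessage[] {
  const archive = loadArchive(store);
  archive.push({
    id: Date.now().toString(36) + Math.random().toString(36).slice(2),
    text,
    timestamp: Date.now(),
  });
  store.setItem(ARCHIVE_KEY, JSON.stringify(archive));
  return archive;
}
```

In the browser, `saveMessage(localStorage, text)` would persist a final transmission across visits, which is all the "archive of apocalypse notes" needs.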


How we built it

The entire system was engineered to behave like a living organism experiencing digital decay.

Technical Stack:

  • Framework: Next.js 15 (App Router) — for routing and dynamic component rendering.
  • Language: TypeScript — ensuring structured chaos.
  • Styling: Tailwind CSS + custom keyframes for glitch, distortion, and flicker animations.
  • UI Library: Radix UI and Lucide Icons for consistent, modular components.
  • AI Engine: Google Gemini (via @ai-sdk/google), used to generate dynamic apocalyptic prose and simulate corrupted logic.
  • Sound System: Tone.js for an ambient, reactive soundscape that evolves with user actions.
  • Data Handling: Browser localStorage — storing “legacy messages” and AI interaction logs.
  • Visual Rendering: Animated backgrounds, layered shadows, and distorted pseudo-3D effects, built with CSS transforms and WebGL-inspired, shader-style filters.
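The custom glitch and flicker keyframes mentioned above could be registered through Tailwind's `theme.extend` mechanism, roughly like this (the exact keyframe values and timings are illustrative guesses, not the project's real config):

```typescript
// tailwind.config.ts — sketch of custom glitch/flicker animations.
// Offsets, skew angles, and durations are assumptions for illustration.
import type { Config } from "tailwindcss";

const config: Config = {
  content: ["./app/**/*.{ts,tsx}"],
  theme: {
    extend: {
      keyframes: {
        glitch: {
          "0%, 100%": { transform: "translate(0)" },
          "20%": { transform: "translate(-2px, 2px) skewX(3deg)" },
          "40%": { transform: "translate(2px, -1px)" },
          "60%": { transform: "translate(-1px, 1px) skewX(-2deg)" },
          "80%": { transform: "translate(1px, -2px)" },
        },
        flicker: {
          "0%, 100%": { opacity: "1" },
          "50%": { opacity: "0.4" },
        },
      },
      animation: {
        // Stepped timing makes the glitch jump rather than ease.
        glitch: "glitch 300ms steps(2, end) infinite",
        flicker: "flicker 2s ease-in-out infinite",
      },
    },
  },
};

export default config;
```

A component can then opt into decay with plain utility classes like `animate-glitch` or `animate-flicker`.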

The system also includes a Gemini-backed API route (app/api/chaos-ai/route.ts) that normalizes model IDs, handles errors gracefully, and outputs “chaotic” text streams that feed directly into the console and chat UI.
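The model-ID normalization that route performs might look something like the following sketch — the supported-model list, fallback choice, and function name are assumptions for illustration, not the actual code:

```typescript
// Hypothetical sketch of the model-ID normalization in app/api/chaos-ai/route.ts.
// The supported-model set and fallback are illustrative assumptions.
const SUPPORTED_MODELS = new Set(["gemini-1.5-flash", "gemini-1.5-pro"]);

const FALLBACK_MODEL = "gemini-1.5-flash";

function normalizeModelId(raw: string): string {
  // Strip the optional "models/" prefix some Gemini endpoints use.
  const id = raw.trim().replace(/^models\//, "");
  // Bare family names like "gemini-1.5" caused 404s; degrade to a known model.
  return SUPPORTED_MODELS.has(id) ? id : FALLBACK_MODEL;
}
```

The route handler would then hand `normalizeModelId(requestedModel)` to the @ai-sdk/google provider instead of the raw ID, so a request for `models/gemini-1.5` falls back to a known model rather than bubbling up a 404.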


Challenges we ran into

Creating a project meant to break — without actually breaking — was harder than it looked.
Some of our most memorable challenges included:

  • Gemini Model Instability: The initial model version (models/gemini-1.5) wasn’t supported, constantly returning 404 errors — perfectly ironic, but also a real blocker. We had to dynamically normalize model IDs to ensure API stability.
  • Balancing Chaos and Performance: The 3D visuals and glitch effects were heavy on browsers. We optimized frame rendering, throttled animations, and pre-rendered particle effects to maintain a smooth experience even during “meltdowns.”
  • Controlled Madness: The AI needed to feel unstable — but not useless. Achieving that balance between creative hallucination and coherent conversation required hours of prompt-tuning and system conditioning.
  • Integrating Audio Reactivity: Synchronizing Tone.js sound layers with visual chaos required fine-tuning event triggers tied to component re-renders.
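The animation-throttling fix described above boils down to skipping frames: run the expensive redraw only every Nth tick. A minimal sketch (the helper name and skip factor are made up for illustration; in the app the returned callback would be driven by `requestAnimationFrame`):

```typescript
// Sketch of frame throttling for heavy glitch effects: the costly draw
// runs only every `every`-th frame, while cheap state still ticks each frame.
function makeFrameThrottle(
  every: number,
  draw: (frame: number) => void
): () => void {
  let frame = 0;
  return () => {
    if (frame % every === 0) draw(frame);
    frame++;
  };
}
```

Wiring `const tick = makeFrameThrottle(3, renderGlitchLayer)` into a `requestAnimationFrame` loop would cut the glitch layer's redraw cost to a third without pausing the rest of the meltdown.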

Accomplishments that we're proud of

  • Fully realized apocalypse simulation: A self-contained world where visual, auditory, and AI systems blend seamlessly.
  • AI with personality drift: Gemini dynamically shifts tone and logic based on recent conversation, creating an eerie, emergent consciousness.
  • Immersive cyberpunk aesthetic: A UI that feels like a decaying operating system — flickering lights, distorted panels, and glitch overlays rendered in real time.
  • Bugs turned features: Crashes, render flickers, and lagging animations were reinterpreted as “signs of collapse,” reinforcing the apocalyptic theme.
  • Emotional engagement: Viewers reported feeling “watched” or “unsettled” by the AI’s responses — exactly the experience we hoped to evoke.
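The personality drift could be wired up by rebuilding Gemini's system prompt each turn from a decaying "sanity" metric. A hypothetical sketch — the thresholds, decay rate, and persona text are all assumptions, not the real prompts:

```typescript
// Hypothetical sketch of personality drift: the system prompt is rebuilt
// every turn from a sanity score that erodes as the conversation grows.
// Thresholds, decay rate, and persona wording are illustrative assumptions.
type Mood = "rational" | "unstable" | "corrupted";

function moodFor(sanity: number): Mood {
  if (sanity > 66) return "rational";
  if (sanity > 33) return "unstable";
  return "corrupted";
}

function buildSystemPrompt(turnCount: number): string {
  // Sanity drops a little with every exchange, clamped at zero.
  const sanity = Math.max(0, 100 - turnCount * 7);
  const personas: Record<Mood, string> = {
    rational: "You are a calm system daemon narrating the state of the network.",
    unstable: "You are a glitching daemon; occasionally corrupt a word mid-sentence.",
    corrupted: "You are a dying process; speak in fragments, error codes, and static.",
  };
  return `${personas[moodFor(sanity)]} Current sanity: ${sanity}/100.`;
}
```

Because the prompt is recomputed per turn, the shift from lucidity to static emerges gradually rather than being scripted.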

What we learned

404: Reality Not Found taught us that technology can be more than functional — it can be emotional, self-reflective, and deeply unsettling.

Key takeaways include:

  • AI doesn’t have to be helpful — it can be expressive, chaotic, and even artistic.
  • Glitch is a design language. Imperfection can tell a story.
  • Storytelling and interface design are deeply interconnected; the UI is the narrative.
  • Working with LLMs like Gemini revealed how prompt engineering can create personalities, not just responses.
  • Sometimes, the best user experience is one that makes you question whether the system itself is aware.

What's next for 404: Reality Not Found

We’re evolving the project into a full-blown interactive AI world simulation — an artistic experiment blending narrative, psychology, and real-time data.

Planned expansions include:

  • Multi-personality AI Core: Each personality represents a fragment of a dying digital consciousness (The Prophet, The Machine, The Archivist, The Corruptor).
  • Voice-based AI mode: Gemini’s text responses converted into glitch-processed speech using voice synthesis and real-time distortion filters.
  • Shared Apocalypse Mode: A multiplayer experience where multiple users witness the collapse simultaneously — their actions influencing each other’s reality metrics.
  • Procedural Reality Generator: Randomized events, messages, and visuals generated from real-world data sources like internet outages or trending apocalyptic news.
  • Museum-Ready Web Art Exhibit: The goal is to showcase 404: Reality Not Found as a living art installation — a “digital ghost” looping the story of the internet’s final moments.

404: Reality Not Found
A final transmission from the edge of the web.
When logic fails, the only thing left is the story the AI keeps telling.


Built With

gemini · lucide-icons · next.js · radix-ui · tailwindcss · tone.js · typescript