Inspiration
Lucid dreams have always fascinated people — they’re deeply personal, abstract, and often indescribable. We wanted to build something that could visually simulate the emotion and chaos of a dream, but do it entirely in the browser. No AI, no backend — just pure frontend creativity. Our goal was to turn imagination into immersive, interactive art.
What it does
DreamWeaver lets users input their dream using emotional sliders, keywords, and visual/audio themes. It then generates a living dreamscape using animations, color, and sound. Emotions like fear or awe affect the visuals. Ambient sounds set the mood. It’s like watching your own mind play out in a surreal animated canvas.
How we built it
React + TypeScript for the UI
Tailwind CSS for styling
Canvas API and custom animations to render dream visuals
Web Audio API for ambient soundscapes
No backend — all logic runs entirely in the browser
Modular design: DreamInputWizard, DreamscapeCanvas, Emotion Tracker
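The core of the pipeline described above is the mapping from emotion sliders to rendering parameters. A minimal sketch of what that mapping could look like as a pure TypeScript function (the `EmotionInput` and `VisualParams` names and the specific weights are illustrative assumptions, not the project's actual API):

```typescript
// Hypothetical shape of the emotion sliders, each normalized to 0–1.
interface EmotionInput {
  fear: number;
  awe: number;
  calm: number;
}

// Parameters a canvas renderer might consume.
interface VisualParams {
  hue: number;        // base color, degrees on the color wheel
  speed: number;      // animation speed multiplier
  turbulence: number; // how chaotic the particle motion is
}

// One possible mapping: fear pulls toward red and jitter,
// awe toward violet-blue, calm toward green and stillness.
function mapEmotionToVisuals(e: EmotionInput): VisualParams {
  const total = Math.max(e.fear + e.awe + e.calm, 1e-6);
  return {
    hue: (e.fear * 0 + e.awe * 260 + e.calm * 140) / total,
    speed: 0.5 + e.fear * 1.5 - e.calm * 0.3,
    turbulence: Math.min(1, e.fear * 0.8 + e.awe * 0.3),
  };
}
```

Keeping this mapping pure makes it easy to unit-test and to tweak independently of the canvas code that consumes it.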
Challenges we ran into
Simulating AI-like behavior with only frontend logic
Emotion-to-animation mapping felt abstract at first
Working around browser autoplay restrictions on the Web Audio API
Creating a surreal yet consistent UX using CSS and canvas
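On the autoplay challenge: browsers create an `AudioContext` in the `"suspended"` state and only allow `resume()` from inside a user-gesture handler. A sketch of the standard workaround, written against a minimal interface so the logic is testable outside a browser (the function name and listener plumbing are illustrative, not the project's actual code):

```typescript
// The subset of AudioContext this helper needs.
interface ResumableContext {
  state: string;
  resume(): Promise<void>;
}

// Resume the context on the first user interaction, then detach.
function unlockAudioOnGesture(
  ctx: ResumableContext,
  addListener: (event: string, handler: () => void) => void,
  removeListener: (event: string, handler: () => void) => void,
): void {
  const unlock = () => {
    if (ctx.state === "suspended") {
      void ctx.resume();
    }
    removeListener("pointerdown", unlock);
    removeListener("keydown", unlock);
  };
  addListener("pointerdown", unlock);
  addListener("keydown", unlock);
}
```

In the browser you would pass `document.addEventListener.bind(document)` and `document.removeEventListener.bind(document)` along with the real `AudioContext`.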
Accomplishments that we're proud of
Entirely frontend-based lucid dream simulator
Ambient sound + animated visuals linked to user emotion
Modular, extensible architecture
Unique concept that blends psychology with visual design
What we learned
Deeper understanding of the Canvas API
How to craft emotionally reactive UI/UX
Creative use of Web Audio API
Building without a backend pushes creative problem solving
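One Canvas lesson worth making concrete: separating the pure scene-update step from the drawing code keeps the animation testable. A sketch under that assumption (the `Particle` shape and the translucent-clear trail trick are illustrative, not the project's actual renderer):

```typescript
// A minimal particle for a drifting dreamscape.
interface Particle {
  x: number;
  y: number;
  vx: number;
  vy: number;
}

// Pure step: advance particles by dt seconds, wrapping around
// a width x height box. No DOM needed, so it is easy to test.
function step(particles: Particle[], dt: number, width: number, height: number): Particle[] {
  return particles.map(p => ({
    ...p,
    x: (((p.x + p.vx * dt) % width) + width) % width,
    y: (((p.y + p.vy * dt) % height) + height) % height,
  }));
}

// Browser-side loop (requires a canvas; shown as comments for context):
// function loop(ctx: CanvasRenderingContext2D, particles: Particle[], last: number) {
//   requestAnimationFrame(now => {
//     particles = step(particles, (now - last) / 1000, ctx.canvas.width, ctx.canvas.height);
//     ctx.fillStyle = "rgba(0, 0, 0, 0.1)"; // translucent clear leaves dreamy trails
//     ctx.fillRect(0, 0, ctx.canvas.width, ctx.canvas.height);
//     ctx.fillStyle = "#fff";
//     for (const p of particles) ctx.fillRect(p.x, p.y, 2, 2);
//     loop(ctx, particles, now);
//   });
// }
```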
What's next for DreamWeaver
Add Inception Mode — dreams within dreams
Let users save and share their dreamscapes
Introduce AI-generated dream narratives (text or voice)
Support mobile view and gesture control
Real-time dream collaboration ("co-dreaming")