Inspiration
We live in an era of "hyper-functional" code. We build apps to be faster, more efficient, and more logical. But as a student of Artificial Intelligence, I began to wonder: Where is the soul in the machine? Binary Breath was inspired by the idea that code can be more than just instructions; it can be a biological extension of the user. I wanted to create a digital space that breathes when you breathe and reacts when you feel.
What it does
Binary Breath is an interactive "living canvas" that transforms human input into generative art.
The Sentiment Engine: As you type a personal essay or poem, the app performs real-time sentiment analysis. Positive words breathe warmth and soft, circular shapes into the canvas, while "heavy" or tense words introduce sharp, angular geometries and colder hues.
The Audio Pulse: Using the Web Audio API, the background of the "About Me" section ripples and fluctuates based on the ambient sound or the frequency of the user’s voice, making the portfolio feel like a living organism.
Emotional CSS: If the user types with high intensity or speed, the interface begins to "glitch" and shake, visually representing the friction of human thought through code.
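As a hedged sketch of the "Emotional CSS" idea (function names and thresholds here are illustrative, not the actual implementation): recent keystroke timestamps can be collapsed into a single glitch intensity between 0 and 1, which the page can then feed into a CSS custom property.

```javascript
// Illustrative sketch: map recent keystroke timestamps to a glitch
// intensity in [0, 1]. The window size and "frantic typing" rate are
// assumptions, not values from the real project.
function glitchIntensity(keyTimestamps, nowMs, windowMs = 2000, maxRate = 10) {
  // Keep only keystrokes inside the sliding window.
  const recent = keyTimestamps.filter((t) => nowMs - t <= windowMs);
  const keysPerSecond = recent.length / (windowMs / 1000);
  // Normalize against the assumed maximum rate and clamp to 1.
  return Math.min(keysPerSecond / maxRate, 1);
}

// In the browser, the result could drive the shake/glitch styling, e.g.:
// document.documentElement.style.setProperty('--glitch', glitchIntensity(...));
```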
How we built it
The project was "vibe-coded" and prototyped using Google AI Studio with the Gemini 1.5 Flash model.
Frontend: Built with React for structure and Tailwind CSS for the minimalist, "art-gallery" aesthetic.
Creative Engine: I integrated p5.js for the generative canvas elements and Matter.js to give the words physical properties like gravity and tension.
Logic: We used a lightweight sentiment analysis library to score the text, then mapped those scores to HSL color values and vertex counts.
Challenges we ran into
The biggest hurdle was the "Logic vs. Emotion" trade-off. Initially, the code was too rigid—the shapes were too perfect. I had to intentionally introduce "noise" into the algorithms to make the art feel more human and less "computer-generated." Additionally, navigating the temporary deployment limitations in AI Studio forced us to pivot toward a manual Cloud Run setup to ensure the "living" aspects of the site remained performant and lag-free.
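One way to picture the "intentional noise" fix (a sketch under assumptions, not the project's actual code): jitter each vertex of an otherwise-perfect polygon by a small, seeded pseudo-random offset, so the same shape always renders with the same hand-drawn imperfection.

```javascript
// Tiny deterministic PRNG (mulberry32) so each shape's jitter is
// reproducible from a seed rather than changing every frame.
function mulberry32(seed) {
  return function () {
    seed = (seed + 0x6d2b79f5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Offset every [x, y] vertex by up to ±amplitude/2 in each axis.
// The amplitude value is an assumption for the sketch.
function jitterVertices(vertices, amplitude, seed = 1) {
  const rand = mulberry32(seed);
  return vertices.map(([x, y]) => [
    x + (rand() - 0.5) * amplitude,
    y + (rand() - 0.5) * amplitude,
  ]);
}
```

Because the noise is seeded, a shape stays "imperfect in the same way" across re-renders instead of vibrating randomly.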
Accomplishments that we're proud of
I am incredibly proud of the "Fluid Portfolio" section. It’s one thing to make a website; it’s another to make a website that "listens." Successfully mapping the Web Audio API to CSS variables so that the site’s typography pulses to the rhythm of the room was a "eureka" moment for the team.
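The audio-to-typography wiring can be sketched roughly like this (the browser half is illustrative; the pure averaging function is the testable core): an AnalyserNode fills a byte array with frequency data each frame, and that array is averaged into a 0-to-1 level that drives a CSS variable.

```javascript
// Pure core: average byte frequency bins (each 0..255, as produced by
// AnalyserNode.getByteFrequencyData) into a normalized 0..1 level.
function audioLevel(frequencyBins) {
  if (frequencyBins.length === 0) return 0;
  let sum = 0;
  for (const bin of frequencyBins) sum += bin;
  return sum / frequencyBins.length / 255;
}

// Browser wiring (illustrative sketch, assuming an existing audioCtx):
// const analyser = audioCtx.createAnalyser();
// const bins = new Uint8Array(analyser.frequencyBinCount);
// function tick() {
//   analyser.getByteFrequencyData(bins);
//   document.documentElement.style.setProperty('--pulse', audioLevel(bins));
//   requestAnimationFrame(tick);
// }
```

With `--pulse` updated every frame, the typography can scale or glow via plain CSS, e.g. `transform: scale(calc(1 + var(--pulse) * 0.05))`.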
What we learned
This project taught me that bugs can be beautiful. In traditional software engineering, a "glitch" is a failure. In Binary Breath, a glitch is a form of expression. I learned how to use AI not just as a co-pilot for fixing errors, but as a creative collaborator that can help translate abstract emotions into functional syntax.
What's next for Binary Breath
We plan to integrate Biometric Input. Imagine the canvas reacting to your actual heart rate via a smartwatch or your facial expressions via a webcam. We want to move beyond text and sound, eventually turning Binary Breath into a full-scale immersive installation where the code doesn't just reflect who you are—it reflects how you live.
Built With
- gemini
- googleaistudio
- matter.js
- p5.js
- tailwind
- vite