Inspiration
How can we build sustainable AI frameworks that reduce unnecessary computation, and with it the environmental footprint of AI, by encouraging more mindful and intentional user interaction? Inspired by the extensive online debate over AI water usage, our extension visualizes AI usage as a living aquarium in which each query consumes a resource (water), nudging users toward more sustainable interaction with AI systems.
What it does
aQuerium is an AI water usage tracker in the form of a desktop aquarium that activates on artificial intelligence platforms, like ChatGPT or Claude. When the user enters a query into an AI model, the water level in the virtual aquarium drops. But don’t let the water drop too low—the fish won’t survive! If the user fully empties their aquarium, they can reset their query limit (max 50 queries) once a day and check their statistics in the extension window.
How we built it
We both have prior experience building Chrome extensions, which are slim applications that can easily be ported across multiple sites. Using this framework, we built a desktop aquarium in JavaScript that tracks the user's keyboard activity and drains water whenever they press Enter to submit a query. aQuerium also maintains a persistent water level across different AI sites via global state trackers. If the aquarium is blocking anything on the page, the user can drag and drop it to a different part of the screen.
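The drain-and-persist flow above can be sketched as follows. This is an illustrative reconstruction, not the actual aQuerium source: names like `MAX_QUERIES` and `drainWater` are invented, and the commented-out `chrome.storage` wiring shows one plausible way a content script could share the level across sites.

```javascript
// Illustrative sketch (not the real aQuerium code): one query drains
// 1/50th of a 2 L tank, matching the project's stated constants.
const MAX_QUERIES = 50;      // daily query budget
const LITERS_PER_TANK = 2;   // ~2 L of cooling water per 50 queries

// Pure helper: given how many queries have been used, return the
// remaining water level (0..1) and the liters consumed so far.
function drainWater(queriesUsed) {
  const used = Math.min(queriesUsed, MAX_QUERIES);
  return {
    waterLevel: 1 - used / MAX_QUERIES,
    litersUsed: (used / MAX_QUERIES) * LITERS_PER_TANK,
  };
}

// In the extension itself, a content script would listen for Enter on
// the prompt box and persist the count with chrome.storage so the
// water level carries over between ChatGPT, Claude, etc. (hypothetical):
//
// document.addEventListener('keydown', (e) => {
//   if (e.key === 'Enter' && !e.shiftKey) {
//     chrome.storage.sync.get({ queriesUsed: 0 }, ({ queriesUsed }) => {
//       chrome.storage.sync.set({ queriesUsed: queriesUsed + 1 });
//     });
//   }
// });
```

Keeping the drain math in a pure function like this makes it easy to unit-test separately from the browser APIs.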
Challenges we ran into
We had some difficulty figuring out which features to prioritize. Because our application is fairly straightforward, we had a lot of freedom in choosing how to convey our sustainability mission. Ultimately, we decided to keep it lightweight: a visual desktop aquarium that tracks water usage, and not much more. Any additional feature, like disabling AI websites once the water is completely drained, would punish the user too harshly. aQuerium gently nudges users toward more sustainable habits: we intentionally create friction, but we don't punish the user.
We also ran into a nuanced research problem—what water usage numbers are actually reliable in this field, and what do they actually mean? What aspects of the generative AI process are they referring to? Our water-usage numbers are based on peer-reviewed research indicating that large language models consume approximately 2L of water per 50 queries for data center cooling.
Sources:
- Li et al. (2023), "Making AI Less Thirsty" (https://arxiv.org/abs/2304.03271)
- OECD AI Policy Observatory (https://oecd.ai/en/wonk/how-much-water-does-ai-consume)
- Microsoft (https://www.microsoft.com/en-us/corporate-responsibility/sustainability/report) and Google (https://sustainability.google/reports/) sustainability reports (2024–2025)
Accomplishments that we're proud of
We’re quite happy with aQuerium’s UI/UX. Neither of us on the team are designers, but we managed to make something that’s fun to use and nice to look at! We’re also grateful for team cohesion and collaboration—when we debated numbers and punishment metrics, we both had attitudes of curiosity and a willingness to compromise, which resulted in a product we’re both happy with.
What we learned
We learned a ton about frontend development, especially JavaScript.
What's next for aQuerium
aQuerium currently uses a single constant for water usage: 2 L per 50 queries for data center cooling, or 40 mL per query. To reflect water usage more accurately, aQuerium could track the "difficulty" of a user's query via metrics like word count or file-upload size, and scale the water-usage constant accordingly. In the future, we could also add more fun features, like bubbles in the tank or letting the user interact with their fish.
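One way the difficulty scaling could look is sketched below. The 40 mL base comes from the constants above, but the multipliers, thresholds, and the `queryCostMl` function itself are hypothetical choices for illustration, not a planned implementation.

```javascript
// Hypothetical difficulty-scaled water cost. Only BASE_ML_PER_QUERY
// (2 L / 50 queries = 40 mL) comes from the project; the factors and
// caps below are invented for illustration.
const BASE_ML_PER_QUERY = 40;

function queryCostMl(prompt, fileBytes = 0) {
  const words = prompt.trim().split(/\s+/).filter(Boolean).length;
  // Longer prompts cost a bit more, capped at 2x so a single query
  // can never drain more than two "normal" queries' worth.
  const wordFactor = Math.min(1 + words / 200, 2);
  // Attachments add a surcharge proportional to size, capped at +100%
  // of the base cost (reached at an assumed 5 MB upload).
  const fileFactor = Math.min(fileBytes / (5 * 1024 * 1024), 1);
  return BASE_ML_PER_QUERY * wordFactor + BASE_ML_PER_QUERY * fileFactor;
}
```

Capping both factors keeps the gentle-nudge philosophy intact: even a very long prompt with a large upload drains a bounded amount of water rather than emptying the tank in one go.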
Built With
- chrome
- css
- html
- javascript
- webanimation