Inspiration
Every time we use a chatbot, there’s an invisible but significant energy cost.
As AI models become larger and more complex, the amount of energy needed to train and run them keeps growing. By 2030, data centers could use up to 20% of global electricity, with AI inference responsible for around 60% of that demand (UPenn, Penn State).
The people most affected by water shortages, pollution, and resource depletion are often not the ones who benefit from AI’s progress.
We started with a simple question: what if being sustainable with your AI prompts could also save money?
That question became Alba, a tool that helps companies, users, and the planet by making AI more efficient.
What it does
Alba is like Grammarly for AI prompts. It helps users write clearer, more efficient prompts that reduce token waste, lower costs, and cut emissions.
- Every query to an LLM produces CO₂ emissions that vary based on the model and complexity (Frontiers, 2025).
- Long and reasoning-heavy prompts can emit up to 50× more CO₂ than concise ones.
- Smart prompting can reduce energy use by up to 99% without losing accuracy (Rubei, 2025).
By simplifying prompts, cutting redundancy, and improving clarity, Alba helps users get faster and higher-quality results while reducing environmental impact.
How we built it
We built Alba using research on energy-efficient LLM inference (Stanford CRFM, 2024; Fernandez, 2025).
Our system converts token counts into energy estimates (around 0.3–10 Wh per query) and uses an AI model to identify unnecessary text. It then suggests improved versions of prompts that keep the same intent while using less energy.
We also calculate the estimated environmental impact of each prompt, including energy use, carbon emissions, and water consumption.
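The token-to-impact conversion described above can be sketched roughly as follows. The coefficient values here are illustrative assumptions for the sketch, not Alba's actual calibrated figures; real per-token energy varies widely by model, hardware, and data center.

```javascript
// Illustrative per-token and per-Wh coefficients (assumptions for this
// sketch, not measured values; real figures vary by model and region).
const WH_PER_TOKEN = 0.01;   // assumed average energy per token
const CO2_G_PER_WH = 0.4;    // assumed grid carbon intensity (g CO2 per Wh)
const WATER_ML_PER_WH = 1.8; // assumed data-center water use (mL per Wh)

// Convert a prompt's token count into estimated energy, emissions,
// and water consumption.
function estimateImpact(tokenCount) {
  const energyWh = tokenCount * WH_PER_TOKEN;
  return {
    energyWh,
    co2Grams: energyWh * CO2_G_PER_WH,
    waterMl: energyWh * WATER_ML_PER_WH,
  };
}
```

With these placeholder coefficients, a 500-token query lands at 5 Wh, inside the 0.3–10 Wh per-query range cited above.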
Challenges we ran into
One of the biggest challenges was the lack of public data on the environmental impact of AI prompts and responses. We had to rely on academic papers, benchmarks, and sustainability reports to estimate per-token energy usage.
Adapting Alba to different LLM platforms was another challenge. Claude, for example, blocked our extension from displaying, which made us rethink how our tool interacts with various browser and API environments.
On top of that, none of us had prior experience with JavaScript or UI design, so creating the interface was a steep learning curve. We used AI tools, online resources, and mentor guidance to bring our idea to life.
Accomplishments that we're proud of
We built a working prototype that calculates real-time energy and CO₂ savings for every AI prompt. Seeing the impact of efficient prompting expressed in concrete numbers made our mission feel tangible.
We also got Alba running on multiple AI models, including ChatGPT and Gemini, which made it easier for everyday users to benefit from our tool.
One of our favorite features is Eco Wrapped, which summarizes each user’s total energy and emission savings in familiar terms, such as how many phone charges or hours of light they’ve saved. It turns sustainability into something visible and rewarding.
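The Eco Wrapped translation from watt-hours into everyday equivalents might look like this sketch. The conversion factors are assumptions chosen for illustration (roughly 12 Wh to fully charge a phone, a 9 W LED bulb), not Alba's exact constants.

```javascript
// Assumed everyday-equivalent conversion factors (illustrative only).
const PHONE_CHARGE_WH = 12; // ~Wh to fully charge a smartphone
const LED_BULB_W = 9;       // ~power draw of an LED bulb, in watts

// Summarize cumulative energy savings in familiar terms.
function ecoWrapped(totalWhSaved) {
  return {
    phoneCharges: totalWhSaved / PHONE_CHARGE_WH,
    bulbHours: totalWhSaved / LED_BULB_W,
  };
}
```

For example, 90 Wh of savings reads as about 7.5 phone charges or 10 hours of LED light, which is far easier for users to picture than raw watt-hours.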
What we learned
We learned that AI’s environmental footprint is not just a hardware problem but also a human one.
Every prompt matters. Writing smarter prompts reduces trial and error, shortens conversations, and cuts down on wasted compute. Optimized prompting doesn’t just help the planet—it also makes AI faster, cheaper, and more effective.
What's next for Alba
We plan to release Alba as a browser extension and API plugin for platforms like Claude and Perplexity.
Upcoming features include:
- Impact dashboards that show total energy and emission savings.
- Team analytics for organizations to track their AI efficiency.
- Partnerships with sustainability and ESG groups to encourage responsible AI use at scale.
Alba makes sustainability part of every AI interaction. Better prompts mean a better planet. 🌍
