Inspiration

Every prompt sent to an LLM consumes energy, and that adds up every single day. As an everyday LLM user, I wanted to become more conscious of how much energy my prompts use.

What it does

PlanetLLM is a web extension that calculates energy usage per prompt in ChatGPT. The information is shown as an overlay above the chat input box as well as under each prompt.

How we built it

I built PlanetLLM using HTML, CSS, and vanilla JavaScript. I drew on several research papers to find the average energy usage per token, and used Tiktoken.js, a library that counts the number of tokens in a prompt. By multiplying the token count by the average energy usage per token, I can estimate how much water, carbon, and electricity (energy) each prompt uses.

The constants I compiled are: ENERGY_PER_TOKEN = 0.000002; WATER_PER_KWH = 0.5; CO2_PER_KWH = 0.4;

And here is the simple calculation: energy = tokens * ENERGY_PER_TOKEN; water = energy * WATER_PER_KWH; co2 = energy * CO2_PER_KWH;
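Putting the constants and the calculation together, the estimate can be sketched in plain JavaScript. The units here (kWh per token, liters per kWh, kg of CO2 per kWh) are inferred from the variable names, and the token count is a hypothetical stand-in for what Tiktoken.js would return:

```javascript
// Conversion constants (units assumed: kWh per token, liters per kWh,
// and kg of CO2 per kWh — inferred from the variable names).
const ENERGY_PER_TOKEN = 0.000002;
const WATER_PER_KWH = 0.5;
const CO2_PER_KWH = 0.4;

// Estimate a prompt's footprint from its token count.
// In the extension the token count comes from Tiktoken.js;
// here it is passed in directly.
function estimateFootprint(tokens) {
  const energy = tokens * ENERGY_PER_TOKEN; // kWh
  return {
    energy,
    water: energy * WATER_PER_KWH, // liters
    co2: energy * CO2_PER_KWH,     // kg
  };
}

// Example: a hypothetical 500-token prompt
console.log(estimateFootprint(500)); // ≈ 0.001 kWh, 0.0005 L, 0.0004 kg CO2
```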

From this, the data is displayed under each prompt and is updated automatically as you type in the prompt box.
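One way to render those numbers for the overlay is a small formatting helper. The display units (Wh, mL, g) and the wiring shown in the comment are assumptions for illustration, not the extension's exact output:

```javascript
// Format a footprint estimate (kWh, liters, kg) for the overlay.
// Wh, mL, and grams are assumed display units for readability.
function formatOverlay({ energy, water, co2 }) {
  return [
    `${(energy * 1000).toFixed(2)} Wh`,
    `${(water * 1000).toFixed(2)} mL water`,
    `${(co2 * 1000).toFixed(2)} g CO2`,
  ].join(" | ");
}

// In a content script, the overlay would be refreshed as the user types
// (hypothetical element names; ChatGPT's real DOM differs):
//   promptBox.addEventListener("input", () => {
//     overlay.textContent = formatOverlay(currentEstimate);
//   });

console.log(formatOverlay({ energy: 0.001, water: 0.0005, co2: 0.0004 }));
// → "1.00 Wh | 0.50 mL water | 0.40 g CO2"
```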

Accomplishments that we're proud of

I am proud that I became more aware of how power-hungry LLMs are. Being able to augment an existing website by injecting a custom overlay made me feel great! It is something I would use in my day-to-day activities.

What we learned

I learned the fundamentals of creating a web extension, as well as how to use Rollup.js, a build tool for bundling JavaScript code.
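A minimal rollup.config.js for bundling an extension's content script might look like the sketch below. The file paths and plugin choices are assumptions, not PlanetLLM's actual configuration:

```javascript
// rollup.config.js — bundle the content script into one file so the
// extension can load it without module imports (hypothetical paths).
import { nodeResolve } from "@rollup/plugin-node-resolve";
import commonjs from "@rollup/plugin-commonjs";

export default {
  input: "src/content.js",     // entry point (assumed path)
  output: {
    file: "dist/content.js",   // file referenced by manifest.json
    format: "iife",            // content scripts load as plain scripts
  },
  // Resolve and inline npm dependencies such as Tiktoken.js
  plugins: [nodeResolve(), commonjs()],
};
```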

What's next for PlanetLLM

My next goal is to integrate PlanetLLM with other LLMs such as Gemini, Grok, and Claude.

Built With

HTML, CSS, JavaScript, Tiktoken.js, Rollup.js
