Inspiration

As avid Large Language Model (LLM) users ourselves, we wondered: "to what extent does our LLM querying impact our environment?" and "how much CO2 equivalent (CO2e) does it generate compared to a Google Search?". Researchers from MIT and Northeastern have reported that the environmental costs of LLM inference outweigh those of training, yet AI consumption remains on a super-linear path. Our inspiration derives from our desire to address the near-term reality of ignorance about, and environmental taxation from, global LLM consumption.

What it does

ELLM is a web extension for users who want an extensible, low-cost, and clean querying experience. It lets them track the CO2e of their queries across models, understand their AI footprint through social-math equivalents, and reschedule their queries using Batch API infrastructure.
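The per-query CO2e tracking and social-math conversion described above can be sketched as simple arithmetic. Every constant below (energy per token, grid carbon intensity, equivalence factors) is an illustrative assumption for demonstration, not ELLM's actual coefficients:

```python
# Illustrative sketch of per-query CO2e estimation and "social math"
# conversion. All constants are assumed values, not ELLM's real ones.

WH_PER_TOKEN = 0.003           # assumed inference energy per token (Wh)
GRID_G_CO2_PER_KWH = 400.0     # assumed grid carbon intensity (g CO2e/kWh)

# Assumed equivalence factors for relatable comparisons.
G_CO2_PER_PHONE_CHARGE = 8.0   # one smartphone charge
G_CO2_PER_GOOGLE_SEARCH = 0.2  # one Google search

def query_co2e_grams(total_tokens: int) -> float:
    """Estimate grams of CO2e for a query of total_tokens tokens."""
    energy_kwh = total_tokens * WH_PER_TOKEN / 1000.0
    return energy_kwh * GRID_G_CO2_PER_KWH

def social_math(grams: float) -> dict:
    """Convert grams of CO2e into relatable everyday equivalents."""
    return {
        "phone_charges": grams / G_CO2_PER_PHONE_CHARGE,
        "google_searches": grams / G_CO2_PER_GOOGLE_SEARCH,
    }

g = query_co2e_grams(1500)  # a longish prompt plus response
print(f"{g:.2f} g CO2e", social_math(g))
```

With these assumed numbers, a 1,500-token exchange works out to 1.8 g CO2e, or roughly nine Google searches, which is the kind of comparison the extension surfaces to users.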

How we built it

First, our wonderful tech lead Irvan mocked up a mini version of what ELLM looks like today. He then spent several hours writing code to rightsize the dimensions of the extension and the components within it. All of the components seen in our MVP are fully functional, and the extension will be downloadable once listed on the Chrome Web Store.

Challenges we ran into

  • Iterating on the user interface many times to enhance UX (and closely mirror typical LLM UX expectations)
  • Ensuring that our business model and go-to-market flow seamlessly alongside our value proposition
  • Answering: what does it mean to be a financially sustainable social venture competing in the SaaS and HPC space?

Accomplishments that we're proud of

We're thrilled to have built a delay-query button, an instant-query button, a file-upload button, a chat area, carbon analytics, and social-math equivalents for every query. We're also proud of harnessing the Batch APIs offered by LLM providers to incentivize user adoption of ELLM. Through Batch, we can list multiple models that are compatible with batch querying! This really serves the user, as our target market typically experiences low switching costs across LLM types.
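To give a sense of the delay-query mechanic: one way deferred queries could be packaged is as a JSONL batch input file. The line shape below follows OpenAI's Batch API request format; the model name and helper functions are illustrative assumptions, not ELLM's actual code:

```python
# Sketch of packaging delayed queries into a batch input file.
# The JSONL line shape follows OpenAI's Batch API format; the model
# name and helpers here are illustrative, not ELLM's real code.
import json

def to_batch_line(custom_id: str, prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Represent one deferred query as a Batch API request line."""
    return {
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

def write_batch_file(path: str, queued: list[tuple[str, str]]) -> None:
    """Write queued (id, prompt) pairs as a JSONL batch input file."""
    with open(path, "w") as f:
        for custom_id, prompt in queued:
            f.write(json.dumps(to_batch_line(custom_id, prompt)) + "\n")

write_batch_file("delayed_queries.jsonl", [
    ("q1", "Summarize today's notes."),
    ("q2", "Draft a thank-you email."),
])
# The file would then be uploaded to the provider and a batch created
# with a completion window, at the provider's discounted batch rate.
```

Because batch requests trade latency for a discounted per-token rate, deferring a query this way is what lets ELLM reward users for shifting non-urgent work off peak.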

What we learned

We learned so much about user behavior through our 50-person survey, and that LLM providers and enterprises aren't currently willing to build or productize a carbon-aware mechanism for their AI-integrated products. Additionally, we learned how to sidestep values-alignment constraints by understanding the psychology of motivation and incentives (our product is deeply inspired by behavioral nudging!).

What's next for ELLM by CtrlShiftHumanity

A few features to come...

  1. A payment structure linking credit/debit account details (the Batch API is generous in its cost-per-token rate); users can expect to pay cents for long queries and can "top up" their account at their leisure
  2. Greater optionality & customizability in social math metrics
  3. Listing more models across LLM providers (e.g., Anthropic, Grok)
  4. Introducing multi-modality in chat
  5. Refining User Experience to rival that of established LLM players

Built With
