Inspiration

  • We noticed that AI tools like ChatGPT are powerful but often waste tokens due to poorly optimized prompts.

  • With growing awareness about the environmental impact of large language models, we wanted to help users be more efficient and sustainable in how they interact with AI.

  • We were inspired by tools that measure carbon footprints — and thought, what if we could do the same for prompt efficiency?

What it does

  • Tokenless helps users write more concise and effective prompts by correcting spelling and suggesting rewordings that use fewer tokens.

  • It provides real-time feedback on token count, estimated cost, and even the energy and water usage associated with each prompt.

  • The tool tracks users’ cumulative savings over time — showing how their improved prompt efficiency contributes to reduced environmental impact.
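The per-prompt feedback described above can be sketched as a small estimator. All of the coefficients below (API price, energy, and water per token) are illustrative placeholder assumptions, not the figures Tokenless actually uses, and the ~4-characters-per-token heuristic is only a rough stand-in for real tokenization:

```javascript
// Rough per-prompt impact estimate. Every coefficient below is an
// illustrative placeholder, NOT the data Tokenless actually ships.
const COST_PER_1K_TOKENS_USD = 0.002; // assumed API price
const WATT_HOURS_PER_TOKEN = 0.003;   // assumed energy draw
const MILLILITERS_PER_TOKEN = 0.01;   // assumed cooling-water use

// Crude token estimate: ~4 characters per token for English text.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

function estimateImpact(prompt) {
  const tokens = estimateTokens(prompt);
  return {
    tokens,
    costUsd: (tokens / 1000) * COST_PER_1K_TOKENS_USD,
    energyWh: tokens * WATT_HOURS_PER_TOKEN,
    waterMl: tokens * MILLILITERS_PER_TOKEN,
  };
}
```

Cumulative savings would then just be the running difference between the estimate for the original prompt and the estimate for the optimized one.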

How we built it

  • Built as a Chrome Extension that integrates directly into chat interfaces (like ChatGPT).

  • Used JavaScript, HTML/CSS, and Webpack for frontend integration and dependency management.
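A content script along these lines could watch the chat input and surface a live token count. This is a minimal sketch, not the shipped code: the `#prompt-textarea` selector and the chars-per-token heuristic are assumptions for illustration, since real chat UIs vary and change often:

```javascript
// Sketch of a content script that watches a chat input and displays a
// live token estimate next to it.
function estimateTokens(text) {
  return Math.ceil(text.length / 4); // ~4 chars/token heuristic
}

function attachCounter(input, badge) {
  input.addEventListener("input", () => {
    badge.textContent = `~${estimateTokens(input.value)} tokens`;
  });
}

// Only runs in the browser, where the extension injects this script.
// "#prompt-textarea" is an assumed selector, not a stable API.
if (typeof document !== "undefined") {
  const input = document.querySelector("#prompt-textarea");
  if (input) {
    const badge = document.createElement("span");
    input.insertAdjacentElement("afterend", badge);
    attachCounter(input, badge);
  }
}
```

In a real extension the script would be registered under `content_scripts` in `manifest.json` and bundled by Webpack along with the rest of the frontend.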

Challenges we ran into

  • Getting the Chrome Extension to interact smoothly with AI chat UIs, whose DOM structures differ and change frequently.

  • Parsing token counts accurately from user input and model responses.

  • Managing dependency issues and bundling with Webpack.

  • Resolving merge conflicts between branches.

  • One of our team members had their apartment burn down 🙃

Accomplishments that we're proud of

  • Built our first Chrome Extension from scratch!

  • Successfully implemented a prompt optimization algorithm that reduces token usage.

  • Designed an intuitive UI and impact dashboard that makes sustainability visible.

  • Created a tool that’s not only practical but also raises awareness of AI’s environmental footprint.

  • Two of our team members were first-time hackers (one isn't even a Computer Science student), but the team meshed extremely well!
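One simple flavor of the token-saving rewrite mentioned above is stripping filler phrases that cost tokens without changing the request. The sketch below illustrates the idea only; the actual optimization algorithm and its phrase list are not shown here, and the patterns are assumed examples:

```javascript
// Illustrative token-saving pass: drop polite filler that adds tokens
// without changing the request. A sketch of the concept, not the
// algorithm Tokenless actually implements.
const FILLERS = [
  /\bplease\b/gi,
  /\bcould you( please)?\b/gi,
  /\bi was wondering if\b/gi,
  /\bkind of\b/gi,
];

function trimFillers(prompt) {
  let out = prompt;
  for (const f of FILLERS) out = out.replace(f, "");
  // Collapse the whitespace left behind by removed phrases.
  return out.replace(/\s+/g, " ").trim();
}
```

Comparing `estimateTokens` before and after such a pass is what makes the savings concrete enough to display to the user.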

What we learned

  • How to create and publish browser extensions.

  • How tokenization works and how to optimize prompts effectively.

  • The relationship between compute cost and environmental impact in AI models.

  • How to collaborate and iterate quickly under hackathon time constraints.

What's next for Tokenless

  • Add multi-platform support (e.g., integration with Notion AI, Gemini, Claude).

  • Build a leaderboard or gamification system to reward efficient prompting.

  • Improve environmental stats with more accurate carbon intensity data.

  • Launch an open-source version so developers can extend Tokenless further.

  • Explore enterprise integrations to help teams optimize AI costs sustainably.

Built With

  • javascript · html · css · webpack · chrome