🌱 Inspiration
As AI models grow larger and more powerful, their environmental cost is often invisible.
While developers track accuracy and latency, almost no one tracks energy or carbon impact.
I wanted to change that — to make sustainability data as easy to capture and understand as a log file or metric dashboard.
CarbonWise AI was born from a simple question:
“What if measuring AI’s carbon footprint was as effortless as measuring performance?”
⚙️ What It Does
CarbonWise AI helps teams measure, optimize, and prove the environmental efficiency of their AI workloads.
- Measure: A lightweight Python decorator (@track) logs energy (kWh), CO₂e, latency, and Software Carbon Intensity (SCI) for any AI run.
- Optimize: Teams can compare “baseline” vs “optimized” runs to quantify improvements in energy and latency.
- Prove: The React dashboard visualizes results and generates one-click PDF reports with clear before/after metrics.
- Advise: The Region Advisor uses ASDI (Amazon Sustainability Data Initiative) data to recommend greener compute regions or time windows.
- Extend: Integrations with Hathora (for cloud inference) and ElevenLabs (for audio summaries) show how sustainability data can be embedded across the AI lifecycle.
In short — it’s sustainability you can see, measure, and share.
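To make the measurement step concrete, here is a minimal, stdlib-only sketch of what an @track-style decorator could look like. The constant power draw and grid intensity below are made-up placeholders for illustration; the actual tool derives energy from hardware readings via CodeCarbon rather than from wall-clock time.

```python
import functools
import json
import time
from pathlib import Path

LOG_PATH = Path("run_log.jsonl")
# Hypothetical placeholders -- the real tool uses CodeCarbon and ASDI data.
GRID_INTENSITY_KG_PER_KWH = 0.4
ASSUMED_POWER_WATTS = 65.0

def track(label="run"):
    """Sketch of an @track-style decorator: time the call, estimate
    energy from an assumed average power draw, and append one JSON
    line per run to the shared log."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            latency_s = time.perf_counter() - start
            # Energy (kWh) = power (W) x time (h) / 1000
            energy_kwh = ASSUMED_POWER_WATTS * latency_s / 3600 / 1000
            record = {
                "label": label,
                "latency_s": round(latency_s, 4),
                "energy_kwh": energy_kwh,
                "co2e_kg": energy_kwh * GRID_INTENSITY_KG_PER_KWH,
            }
            with LOG_PATH.open("a") as f:
                f.write(json.dumps(record) + "\n")
            return result
        return wrapper
    return decorator

@track(label="demo")
def run_inference():
    time.sleep(0.1)  # stand-in for a model call
    return "ok"
```

The decorator shape matters more than the numbers: any function wrapped this way automatically contributes a line to run_log.jsonl without changing its own code.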
🧠 How We Built It
- Backend: Python with CodeCarbon for energy and CO₂ estimation.
- Frontend: React + TypeScript using Lovable, Vite, Tailwind, and Recharts for visualization.
- Data: Regional carbon intensity data from ASDI, stored in a simple JSON file.
- Reports: Generated with ReportLab and html2pdf.js.
- Integrations:
- Hathora to deploy or call remote inference models with the same tracking flow.
- ElevenLabs API to generate spoken summaries of sustainability results (carbonwise_summary.mp3).
Everything connects through a single log file (run_log.jsonl), making the workflow easy to reproduce and share.
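As one illustration of how a shared JSONL log can drive the baseline-vs-optimized comparison, a small aggregator might look like the following. The field names (label, energy_kwh, co2e_kg) are assumptions based on the metrics listed above, not the project's actual schema.

```python
import json
from pathlib import Path

def summarize(log_path="run_log.jsonl"):
    """Aggregate energy and CO2e per run label from a JSONL log,
    so that e.g. 'baseline' and 'optimized' totals can be compared."""
    totals = {}
    for line in Path(log_path).read_text().splitlines():
        rec = json.loads(line)
        bucket = totals.setdefault(
            rec["label"], {"energy_kwh": 0.0, "co2e_kg": 0.0, "runs": 0}
        )
        bucket["energy_kwh"] += rec["energy_kwh"]
        bucket["co2e_kg"] += rec["co2e_kg"]
        bucket["runs"] += 1
    return totals
```

Because every component appends to the same append-only file, the dashboard, the PDF report, and a script like this all read from one source of truth.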
🧩 Challenges We Ran Into
- Getting accurate energy readings on ARM-based CPUs (Snapdragon) required digging into CodeCarbon internals and creating fallbacks.
- Managing multiple integrations (ASDI, Hathora, ElevenLabs) under hackathon time constraints was intense — lots of API testing!
- Balancing accuracy vs accessibility — I wanted it simple enough for students to use, but credible enough for researchers.
- Rendering the dashboard and PDF in different environments while keeping data consistent.
🏆 Accomplishments That We're Proud Of
- Built a complete end-to-end sustainability pipeline — measure → visualize → prove — in under 48 hours.
- Integrated open environmental data (ASDI) directly into an AI developer tool.
- Added a voice narration layer with ElevenLabs to make results accessible and engaging.
- Created a public, reproducible toolkit that anyone can use to understand and reduce their AI footprint.
📚 What We Learned
- Sustainability is measurable — and developers want to measure it; they just need approachable tools.
- Open data (like ASDI) can empower practical, real-world applications beyond research papers.
- Great hackathon projects aren’t about scale — they’re about clarity, creativity, and solving a real problem elegantly.
- AI + sustainability doesn’t have to be complicated; it just has to be visible.
🚀 What's Next for CarbonWise AI — Measure → Optimize → Prove Sustainable AI
- Add a Carbon Budget Mode that warns or halts a run if it exceeds a CO₂e threshold.
- Create a CI/CD badge that tracks and displays Software Carbon Intensity over time.
- Improve the Region Advisor using hourly ASDI data for dynamic scheduling.
- Publish the SDK as pip install carbonwise.
- Partner with AI education initiatives to bring sustainability tracking into classrooms and research labs.
Ultimately, the goal is to make sustainability an everyday part of how we build, test, and ship AI systems.
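The planned Carbon Budget Mode could be prototyped as a small accumulator that raises once a run's cumulative CO₂e passes a configured limit. The class name and API here are hypothetical, just to show the shape of the feature:

```python
class CarbonBudgetExceeded(RuntimeError):
    """Raised when a run spends more CO2e than its allotted budget."""

class CarbonBudget:
    """Hypothetical Carbon Budget Mode: accumulate CO2e across steps
    and halt the run once the configured limit is crossed."""

    def __init__(self, limit_kg):
        self.limit_kg = limit_kg
        self.used_kg = 0.0

    def record(self, co2e_kg):
        """Add one step's emissions; raise if the budget is blown."""
        self.used_kg += co2e_kg
        if self.used_kg > self.limit_kg:
            raise CarbonBudgetExceeded(
                f"budget of {self.limit_kg} kg CO2e exceeded: "
                f"{self.used_kg:.4f} kg used"
            )
```

A training loop would call record() after each epoch, turning an invisible cost into a hard constraint the same way a timeout or memory limit does.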
Built With
- asdi
- codecarbon
- data
- hathora
- lovable
- python
- react
- reportlab
- tailwind
- typescript