Inspiration
In my years as a Cloud Solutions Architect, one core principle I came to admire was load balancing: the technique of intelligently distributing workloads across multiple servers so that no single resource is overwhelmed. As part of my role, I supported over a hundred startups in my employer’s global Cloud Accelerator program, ranging from well-funded teams to solo founders in resource-constrained regions. That diversity taught me the beauty of balance: optimising what’s available to ensure resilience, efficiency, and fairness.
That experience sparked a thought: What if we apply this same concept—not to servers—but to power infrastructure?
Telecom towers, especially in off-grid and hybrid energy environments, often struggle with uneven power demand, battery strain, and underutilised renewable assets. WattWiseAI was born from that realisation: a sustainable solution, inspired by cloud logic, that intelligently balances energy loads between telecom base stations and redistributes excess energy across connected sites, minimising carbon output, reducing cost, and enhancing uptime. It is cloud load balancing, but for power.
What it does
WattWiseAI is an intelligent energy load balancer designed for telecom base stations in resource-constrained regions. It uses on-device AI to monitor power usage, forecast demand, and shift energy loads dynamically between neighbouring stations. This prevents overloads, reduces fuel dependency, cuts carbon emissions and operational costs, and improves uptime and energy efficiency, especially in hybrid-powered or off-grid telecom infrastructure.
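As a concrete illustration of the balancing step, here is a minimal JavaScript sketch of how surplus capacity at one station could absorb load from a neighbour approaching overload. The station fields, the 85% threshold, and the `planTransfer` function are all illustrative assumptions, not the project's actual code.

```javascript
// Illustrative sketch (not WattWiseAI's actual logic): shift load from
// stations nearing overload onto neighbours with spare headroom.
function planTransfer(stations, threshold = 0.85) {
  // Stations below the threshold are donors; at or above it, receivers.
  const donors = stations.filter(s => s.loadKw / s.capacityKw < threshold);
  const receivers = stations.filter(s => s.loadKw / s.capacityKw >= threshold);
  const transfers = [];
  for (const r of receivers) {
    let deficit = r.loadKw - threshold * r.capacityKw; // load to shed
    for (const d of donors) {
      if (deficit <= 0) break;
      const spare = threshold * d.capacityKw - d.loadKw; // donor headroom
      const amount = Math.min(spare, deficit);
      if (amount > 0) {
        transfers.push({ from: d.id, to: r.id, kw: amount });
        d.loadKw += amount; // donor picks up the shifted load
        r.loadKw -= amount;
        deficit -= amount;
      }
    }
  }
  return transfers;
}
```

In a real deployment the per-station readings would come from the on-device monitors described above, and the resulting transfer plan would drive the redistribution hardware.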
How we built it
WattWiseAI was built to function offline-first, using TensorFlow Lite and Edge Impulse to train a compact predictive model that runs directly on microcontrollers without cloud dependency. The model analyses local time-series energy data and load-demand patterns to anticipate overloads and trigger smart redistribution logic, even without internet access. Inter-node messaging runs over a lightweight MQTT-based communication layer with offline queuing. A custom energy event extractor, built in JavaScript and optimised for TensorFlow.js compatibility, feeds the model with critical features in real time. The dashboard offers optional cloud syncing, but the core logic executes independently at the edge.
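The offline-queuing idea can be sketched as a store-and-forward wrapper. To keep the sketch self-contained, the MQTT client is stubbed as a plain publish callback rather than a real broker connection; the class and its names are illustrative assumptions, not the project's implementation.

```javascript
// Sketch of offline-first messaging: publishes are queued, and the queue
// drains in order whenever the (stubbed) transport reports success.
class OfflineQueue {
  constructor(publishFn) {
    // publishFn(topic, payload) returns true on success, false when offline.
    this.publishFn = publishFn;
    this.queue = [];
  }
  publish(topic, payload) {
    this.queue.push({ topic, payload }); // always queue first
    this.flush();                        // then try to drain immediately
  }
  flush() {
    while (this.queue.length > 0) {
      const msg = this.queue[0];
      if (!this.publishFn(msg.topic, msg.payload)) break; // offline: keep it
      this.queue.shift(); // delivered: drop from queue
    }
  }
}
```

A real node would call `flush()` again on reconnect, so messages composed while offline go out in their original order.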
Challenges we ran into
- Ensuring the load-balancing logic could operate in intermittent or fully offline scenarios.
- Creating a custom signal processor in JavaScript that mimics MFCC for energy events.
- Designing fault-tolerant MQTT communication with offline caching and smart retries.
- Emulating energy-transfer logic in a simulated multi-node environment without physical telecom gear.
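To give a flavour of the MFCC-mimicking idea: audio MFCCs frame a signal, take log energies, and apply a DCT to decorrelate them. A hedged sketch of the same pipeline shape applied to power readings (minus the mel filterbank) might look like this; the frame size, coefficient count, and function names are assumptions for illustration only.

```javascript
// Per-frame log energy over fixed-size windows of the power time series.
function logEnergyFrames(samples, frameSize) {
  const frames = [];
  for (let i = 0; i + frameSize <= samples.length; i += frameSize) {
    let energy = 0;
    for (let j = i; j < i + frameSize; j++) energy += samples[j] * samples[j];
    frames.push(Math.log(energy / frameSize + 1e-10)); // epsilon avoids log(0)
  }
  return frames;
}

// Type-II DCT: the decorrelation step used in MFCC pipelines.
function dct(values, numCoeffs) {
  const N = values.length;
  const coeffs = [];
  for (let k = 0; k < numCoeffs; k++) {
    let sum = 0;
    for (let n = 0; n < N; n++) {
      sum += values[n] * Math.cos((Math.PI * k * (n + 0.5)) / N);
    }
    coeffs.push(sum);
  }
  return coeffs;
}

// Compact feature vector describing an energy event window.
function energyFeatures(samples, frameSize = 16, numCoeffs = 4) {
  return dct(logEnergyFrames(samples, frameSize), numCoeffs);
}
```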
Accomplishments that we're proud of
- Delivered an offline-capable, low-power AI system that dynamically balances energy demand at the edge.
- Simulated a working telecom environment with real-time base station handoffs and power prioritisation logic.
- Implemented inter-node collaboration without cloud dependency, using lightweight message protocols.
- Validated edge inference against synthetic load scenarios and benchmarked real-time responses.
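The synthetic-load benchmarking mentioned above can be sketched roughly as follows: generate a noisy daily load curve and time repeated calls to an inference function. The curve shape, the stand-in inference callback, and the function names are illustrative assumptions; the real system runs a TensorFlow Lite model.

```javascript
// Generate a synthetic 24-hour load profile with a daytime peak and noise.
function syntheticLoad(hours = 24, peakKw = 8) {
  const load = [];
  for (let h = 0; h < hours; h++) {
    const daily = Math.sin((Math.PI * h) / hours); // single daytime peak
    const noise = (Math.random() - 0.5) * 0.5;
    load.push(Math.max(0, peakKw * daily + noise));
  }
  return load;
}

// Mean wall-clock milliseconds per call of an inference function.
function benchmark(inferFn, input, runs = 100) {
  const start = Date.now();
  for (let i = 0; i < runs; i++) inferFn(input);
  return (Date.now() - start) / runs;
}
```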
What we learned
- Offline edge AI is achievable with creative optimisation and disciplined memory management.
- Building fault-tolerant communication between smart nodes requires careful queuing and recovery logic.
- Load balancing in constrained environments needs context-aware AI that adapts to missing or delayed data.
- Mapping cloud architecture to physical energy logic (like base station coordination) involves cross-domain thinking and a smart handshake between software and hardware.
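One simple way to "adapt to missing or delayed data" is to fall back to the last known reading while letting its confidence decay with staleness, so downstream logic can weight it accordingly. The half-life value and function shape below are illustrative assumptions, not the project's actual policy.

```javascript
// Return the last known reading with a confidence that halves every
// halfLifeMs of staleness (1.0 when fresh, 0.5 at one half-life, ...).
function readingWithConfidence(lastReading, lastTimestampMs, nowMs, halfLifeMs = 60000) {
  const ageMs = nowMs - lastTimestampMs;
  const confidence = Math.pow(0.5, ageMs / halfLifeMs);
  return { value: lastReading, confidence };
}
```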
What's next for WattWise AI
- Begin hardware manufacturing of the TinyML devices for the base stations.
- Expand offline learning with federated model updates via mobile agents.
- Integrate solar power forecasting and prioritisation using edge-based LSTM networks.
- Offer an open-source SDK for telcos and microgrid operators to test with their nodes.
- Partner with hardware manufacturers to pre-bundle the offline AI load balancer into future-ready base stations.