Inspiration
I am an appointed member of the general board of The Hoogheemraadschap van Rijnland. This is one of the oldest water management authorities in the Netherlands, dating back to the 13th century. It's a regional water board (waterschap) responsible for water management in a region roughly between Amsterdam, The Hague, and Utrecht, including cities like Leiden (where I live) and Haarlem.
Its main responsibilities include:
- Managing water levels in canals, lakes, and polders
- Maintaining dikes and other flood defense structures
- Managing water quality and wastewater treatment
- Operating pumping stations and other water infrastructure
For these tasks we use a lot of power; the pumping stations in particular run constantly to keep our region - on average 4 m below sea level - dry on the one hand, and sufficiently hydrated for agriculture on the other. Adapting to climate change, we need more dynamic water leveling to prevent floods and droughts, whilst making more use of renewable energy sources. For the pumping stations this is essentially an Energy Optimization challenge, but one that is very new to the waterschap and that affects asset management and maintenance scheduling. As a member of the board I'm interested in how to approach such a challenge, making use of knowledge graphs, generative AI and prediction modeling.
What it does
First, I built an Energy Optimizer model that takes into account the daily market price for power and the power profile of three pumping stations in our region (Katwijk, Gouda and Halfweg); I scrape the flow rate charts of the pumps as a proxy for power use. The model optimizes power use via a reinforcement learning method that rewards more power use during cheap hours (e.g. when the sun is shining) and less during expensive hours. Maintaining water levels is constrained by a balancing algorithm, with a penalty for over- or undershooting. The model is served in Modus (Hypermode) via the optimizeEnergy function: inputs are the price and power profile for 24 hours, and the output is an actions profile - a percentage more or less power use with respect to the original chart - whilst complying with the water level constraints.
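To illustrate the reward shaping described above, here is a simplified sketch, not the actual model: the tolerance and penalty weights are made-up values, and the real optimizer learns a policy rather than scoring a fixed profile.

```python
def reward(prices, baseline_flow, actions, level_tolerance=0.05,
           balance_penalty=10.0):
    """Score a 24-hour actions profile (illustrative only).

    prices        : hourly day-ahead market prices, length 24
    baseline_flow : original hourly flow profile (proxy for power), length 24
    actions       : fractional adjustment per hour (+0.2 = 20% more flow)
    """
    adjusted = [f * (1.0 + a) for f, a in zip(baseline_flow, actions)]
    # Shifting pumping toward cheap hours is rewarded as the cost saving
    # versus running the original profile unchanged.
    cost_saving = sum(p * (f - g)
                      for p, f, g in zip(prices, baseline_flow, adjusted))
    # Water-balance constraint: total pumped volume must stay close to the
    # baseline or levels drift, so over- and undershooting are penalised.
    total = sum(baseline_flow)
    imbalance = abs(sum(adjusted) - total) / total
    penalty = balance_penalty * max(0.0, imbalance - level_tolerance)
    return cost_saving - penalty
```

With this scoring, a profile that pumps more in cheap hours and less in expensive ones while keeping the total volume constant scores strictly better than leaving the original profile untouched.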
Second, the aim is to store PumpingStations and their Profiles - consisting of 24-hour values of current and historical flow rates, market prices and the corresponding predictions (actions) to balance energy and water levels - produced with the Energy Optimizer model. This way we build a collection of PumpingStations with their Profiles, offering a vast knowledge base for predicting energy savings and planning maintenance once we align asset management and maintenance projects later on. This is where knowledge graphs come in handy: aggregating the best-performing PumpingStations, with the fewest balancing issues and short maintenance windows.
How we built it
Since we are using Modus, the first effort was to expose the model as a function in Modus. The model is built in Python, so we needed to make it available in the configuration. Unfortunately, the Model API for custom models did not work for me, so I set up a FastAPI server on pythonanywhere.com and connect to it via a config entry in modus.json. Inputs are consistent 24-hour series - flows (a proxy for power use) and market prices - and the output is a 24-hour actions profile, adjusting the flow profile by a percentage (plus or minus with respect to the original power profile).
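The FastAPI wrapper can be sketched along these lines. This is a hedged sketch: the route name matches the optimizeEnergy function mentioned above, but the request fields are my illustrative names, and the placeholder heuristic stands in for the trained RL model.

```python
def heuristic_actions(prices, cheap_boost=0.2, n_hours=8):
    """Placeholder policy: pump more in the cheapest hours, less in the
    most expensive ones, keeping the summed adjustment at zero."""
    order = sorted(range(len(prices)), key=lambda h: prices[h])
    actions = [0.0] * len(prices)
    for h in order[:n_hours]:        # cheapest hours: increase flow
        actions[h] = cheap_boost
    for h in order[-n_hours:]:       # most expensive hours: decrease flow
        actions[h] = -cheap_boost
    return actions

def create_app():
    """App factory, so the policy above stays importable without FastAPI."""
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class OptimizeRequest(BaseModel):
        prices: list[float]  # 24 hourly day-ahead market prices
        flows: list[float]   # 24 hourly flow values (proxy for power use)

    @app.post("/optimizeEnergy")
    def optimize_energy(req: OptimizeRequest) -> dict:
        # A real deployment would run the trained RL model here.
        return {"actions": heuristic_actions(req.prices)}

    return app
```

The modus.json side then only needs an HTTP connection pointing at the pythonanywhere.com host, so the Modus function can call the endpoint like any external API.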
Learning from the dgraph-101 example in the modus-recipes repo, I tried to align it with my schema, but I ran into AssemblyScript and GraphQL issues (I'm quite new to Dgraph...). So I got halfway: the graph works with the schema in Dgraph Cloud and can be read from Modus, but I update it via Python mutation constructs. I used Claude Desktop (Anthropic) quite a lot; it saved me from frustration and helped me construct the Dgraph statements. The aim is to serve the Modus functions in an MCP server so Claude can run Dgraph mutations for us.
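Those Python mutation constructs can be sketched with pydgraph roughly as follows. To be clear about assumptions: the type and predicate names below are illustrative, not my final schema, and the profile series are stored as JSON strings for simplicity (list predicates would also work).

```python
import json

def profile_mutation(station, date, flows, prices, actions):
    """Build a set-mutation object for one PumpingStation with a daily
    Profile node (illustrative type and predicate names)."""
    return {
        "dgraph.type": "PumpingStation",
        "name": station,
        "profiles": [{
            "dgraph.type": "Profile",
            "date": date,
            "flows": json.dumps(flows),     # 24 hourly flow rates
            "prices": json.dumps(prices),   # 24 hourly market prices
            "actions": json.dumps(actions), # 24 optimizer adjustments
        }],
    }

def push_profile(addr, mutation):
    """Commit the mutation to a Dgraph instance in one transaction."""
    import pydgraph  # pip install pydgraph

    stub = pydgraph.DgraphClientStub(addr)
    client = pydgraph.DgraphClient(stub)
    txn = client.txn()
    try:
        txn.mutate(set_obj=mutation, commit_now=True)
    finally:
        txn.discard()
        stub.close()
```

Wrapped in an MCP tool, a call like `push_profile("localhost:9080", profile_mutation("Katwijk", ...))` is exactly the kind of mutation Claude could run for us.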
Challenges we ran into
For all the ease of running functions via Modus, there's room for improvement in the Model API for custom models not served by Hugging Face; the docs need examples. As a newcomer to Dgraph, GraphQL and AssemblyScript alike, it's a steep learning curve for me to get all of this working in AssemblyScript. I'm sure I'm doing something wrong…
Accomplishments that we're proud of
I'm quite pleased with the way I used generative AI to assist me through this hackathon; otherwise I wouldn't have come this far. It's a real life saver when you want to code new stuff you are not familiar with. This way I set up a reinforcement learning model in just a minute, where before AI it would have taken me at least half a day. Using the tools of the configured MCP servers in Claude, I had a real personal assistant helping me write files to my project directory and remembering the paths we had already taken.
What we learned
Building knowledge graphs is not an easy task, and I need to get better versed in Dgraph matters. It's quite different from the (No)SQL databases I'm used to. Compared to Neo4j's Cypher syntax, I see real opportunities in GraphQL functions exposed via Modus and MCP, because Dgraph is really fast and its UI is a relief compared to Neo4j's…
What's next for Energy Optimization for Water Level Management
As mentioned, digging deeper into Dgraph and building out the knowledge graph - connecting maintenance planning, projects and asset management to the pumping stations - will give my waterschap many more insights into how to optimize energy and water management in the long run. This coming February, the board will vote on an important policy act regarding dynamic water level management. I aim to give my fellow members a presentation on my - by then improved - hackathon results and how we need to proceed on this important matter for the waterschap.
Built With
- anthropic
- assemblyscript
- claude
- dgraph
- hypermode
- mcp
- modus
- pydgraph
- python
- pythonanywhere.com