I used to work with Big Data clusters, and back then we valued storage highly. Shouldn't we value it even more on a blockchain? The chain is bloating, and a yearly fee per unit of storage may well be ahead. Columnar storage is a beautiful idea (Google BigTable, HBase, Cassandra, ...), and this project is inspired by it, particularly by Apache Parquet and Protocol Buffers.
What it does
It optimizes smart contract storage slot usage for structured data. This saves disk space on every full node and saves you fees.
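The core idea can be sketched outside Solidity: an EVM storage slot is 256 bits wide, so storing many narrow values one per slot wastes most of each slot, while a columnar layout packs several values of the same fixed-width column into each slot. The sketch below is illustrative only; the function names `pack_column` and `unpack_column` are my own, not the project's API.

```python
SLOT_BITS = 256  # width of one EVM storage slot

def pack_column(values, width):
    """Pack fixed-width unsigned ints into 256-bit storage slots."""
    per_slot = SLOT_BITS // width
    slots = []
    for i in range(0, len(values), per_slot):
        word = 0
        for j, v in enumerate(values[i:i + per_slot]):
            assert 0 <= v < (1 << width), "value exceeds column width"
            word |= v << (j * width)
        slots.append(word)
    return slots

def unpack_column(slots, width, count):
    """Recover `count` values from packed slots."""
    per_slot = SLOT_BITS // width
    mask = (1 << width) - 1
    values = []
    for word in slots:
        for j in range(per_slot):
            if len(values) == count:
                return values
            values.append((word >> (j * width)) & mask)
    return values

balances = [1000, 2500, 42, 7, 99999]   # e.g. a uint64 column
packed = pack_column(balances, 64)      # 5 values fit in 2 slots, not 5
assert unpack_column(packed, 64, len(balances)) == balances
```

In Solidity the same packing is done with shifts and masks over `uint256` slots; a generic layer like this is exactly what makes a code generator worthwhile.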
How I built it
I implemented well-known columnar storage ideas and techniques in Solidity.
Challenges I ran into
It wasn't feasible to implement more sophisticated techniques (such as dictionary coding or delta encoding) during the hackathon, because they would require caching and preprocessing many objects, which is difficult in Solidity due to the lack of an adequate memory allocator.
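For context, delta encoding (one of the techniques left out) stores a column as its first value plus successive differences; monotonic columns such as timestamps or ids then yield small deltas that pack into far fewer bits. A minimal sketch of the idea, not code from this project:

```python
def delta_encode(values):
    """First value plus successive differences."""
    if not values:
        return []
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def delta_decode(deltas):
    """Running sum restores the original column."""
    out, acc = [], 0
    for d in deltas:
        acc += d
        out.append(acc)
    return out

timestamps = [1700000000, 1700000012, 1700000012, 1700000045]
deltas = delta_encode(timestamps)   # [1700000000, 12, 0, 33]
assert delta_decode(deltas) == timestamps
```

The catch on-chain is exactly the one noted above: encoding a batch means buffering and reprocessing it in memory first.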
Accomplishments that I'm proud of
Storage savings are 3x. I managed to build and test the full cycle of working with columnar data, to create a code generator, and to measure the performance gains.
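Where savings like this come from can be shown with simple slot arithmetic. The schema below is hypothetical (the actual one behind the 3x figure isn't given here), and the exact ratio depends on the field widths:

```python
SLOT_BITS = 256  # width of one EVM storage slot

def slots_naive(field_widths, n_records):
    """One storage slot per field per record."""
    return len(field_widths) * n_records

def slots_packed(field_widths, n_records):
    """Columnar: each column packs floor(256/width) values per slot."""
    total = 0
    for w in field_widths:
        per_slot = SLOT_BITS // w
        total += -(-n_records // per_slot)  # ceiling division
    return total

schema = [64, 64, 32]   # hypothetical record: two uint64 fields + one uint32
n = 1000
naive = slots_naive(schema, n)    # 3000 slots
packed = slots_packed(schema, n)  # 250 + 250 + 125 = 625 slots
```

With each slot written costing tens of thousands of gas, cutting the slot count translates directly into fee savings.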
What I learned
There is a wide area for optimization in current approaches to Ethereum smart contract development. I expect to see advanced memory allocators, a rich Solidity standard library, and performant storage solutions in the near future.
What's next for Ethereum Columnar Data Storage Format
Further work is outlined in the repo: search by hash, optional fields, unlimited fields, and so on.