The DataDAO mission is to allow the pooling of datatokens into a meaningful and valuable dataset whose value is greater than the sum of its parts. It lays the foundation for a fairer and more inclusive distribution of value in any product or platform, directed to the members who actually generate it. It also serves as a demand-generating tool — anyone in need of a specific dataset can source it from the community, and the community can join forces to create it. DataDAO grows the Ocean Protocol data marketplace by expanding it from a peer-to-peer, one-to-one data marketplace into a many-to-one data exchange protocol. This is how we can foster demand on OceanMarket! By combining elements from DeFi, decentralized governance, and data marketplaces, we believe all the pieces needed to bring such a product to market already exist, and with the right design, technology stack, and crypto-economic incentive mechanism, we can showcase the value such a tool can unlock.
What it does
DataDAO's utility follows the same narrative as an AMM (automated market maker, like Uniswap, Bancor, Balancer, etc.) for liquidity pooling: each isolated piece of individual liquidity is not valuable on its own, but pooling them together unlocks a tremendous amount of value in the form of large token-exchange liquidity pools. The same applies to DataDAO, with capital replaced by data: pooling fragmented data together can create a valuable dataset.
Phase 1 - Dataset request
The data requestor asks for information (e.g. 500 twitter threads and discussions around Ocean and the web3 data economy, in JSON format) and stakes $50 DAI. Upon submission of this request, a DataDAO ‘OL72’ is opened - a DAO instance is created on DAOstack.
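The Phase 1 flow can be sketched as a minimal in-memory model. On-chain, opening a request deploys a DAO instance via DAOstack; here it is just a record. All names (`DatasetRequest`, `openDataDAO`, field names) are illustrative assumptions, not the production contracts:

```typescript
// Hypothetical model of a dataset request backed by a DAI stake.
interface DatasetRequest {
  requestor: string;   // requestor's address
  description: string; // e.g. "500 twitter threads ... JSON format"
  stakeDai: number;    // stake backing the request, in DAI
}

interface DataDAOInstance {
  id: string;
  request: DatasetRequest;
  open: boolean;       // accepting contributions
}

let nextId = 0;

// Opening a staked request creates a DataDAO instance
// (on-chain this would be a DAOstack Arc deployment).
function openDataDAO(request: DatasetRequest): DataDAOInstance {
  if (request.stakeDai <= 0) throw new Error("request must be staked");
  return { id: `dao-${nextId++}`, request, open: true };
}

const dao = openDataDAO({
  requestor: "0xRequestor",
  description: "500 twitter threads around ocean and the web3 data economy (JSON)",
  stakeDai: 50,
});
```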
Phase 2 - DataDAO contribution phase
Contributors provide links in response to this request. Effectively, they are minting datatokens and sending them into the DataDAO vault. In return, the DAO sends DataPool tokens, which represent the contributor's financial stake in the DataPool, and REP, which represents voting rights in the DAO that governs the combined dataset.
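The Phase 2 accounting can be sketched as follows, under the simplifying assumption (ours, not the project's stated design) that each accepted contribution earns a fixed amount of DataPool tokens and REP. `Vault`, `contribute`, and the constants are illustrative names:

```typescript
const POOL_PER_CONTRIBUTION = 100; // DataPool tokens per accepted datatoken (assumed)
const REP_PER_CONTRIBUTION = 10;   // voting reputation per accepted datatoken (assumed)

interface Vault {
  datatokens: string[];              // datatoken addresses held by the DAO
  poolBalances: Map<string, number>; // contributor -> DataPool tokens
  repBalances: Map<string, number>;  // contributor -> REP
}

function newVault(): Vault {
  return { datatokens: [], poolBalances: new Map(), repBalances: new Map() };
}

// A contributor mints a datatoken for their data and sends it to the vault;
// the DAO credits them DataPool tokens (financial stake) and REP (voting rights).
function contribute(vault: Vault, contributor: string, datatoken: string): void {
  vault.datatokens.push(datatoken);
  vault.poolBalances.set(
    contributor,
    (vault.poolBalances.get(contributor) ?? 0) + POOL_PER_CONTRIBUTION,
  );
  vault.repBalances.set(
    contributor,
    (vault.repBalances.get(contributor) ?? 0) + REP_PER_CONTRIBUTION,
  );
}
```

A weighting scheme (e.g. REP proportional to data quality once a verifier exists) could replace the flat per-contribution amounts without changing this shape.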
Phase 3 - The dataset is ready
The data requestor authenticates with MetaMask, buys access to the combined dataset, and receives all of the datatokens held in the DAO.
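In the same in-memory model, Phase 3 is an atomic swap: the buyer's payment is credited to the DAO treasury and every datatoken in the vault is released to the buyer. `purchaseDataset` and its fields are illustrative, not the real contract API:

```typescript
interface Sale {
  treasury: number;    // DAI received from the buyer
  delivered: string[]; // datatokens transferred to the buyer
}

// On-chain this would be a single atomic transaction: payment in,
// all datatokens out, so the buyer and DAO never trust each other mid-swap.
function purchaseDataset(vaultDatatokens: string[], price: number): Sale {
  return { treasury: price, delivered: [...vaultDatatokens] };
}
```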
(Note - in future stages it will be possible to appoint a reviewer / verifier / keeper to serve as a validation step: ensuring data quality, proposing which contributions are valid and which are not, and raising the data requestor's confidence level. Further info on this will be published later on.)
Phase 4 - Redemption
Data providers redeem their holdings from the DAO, claiming their proportional share of the DAO treasury in exchange for sending back their DataPool tokens and slashing their REP.
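The proportional claim in Phase 4 reduces to simple pro-rata math: a provider who burns some fraction of the outstanding DataPool supply receives the same fraction of the treasury. `redeem` and its parameter names are illustrative assumptions:

```typescript
// Pro-rata redemption sketch: share of treasury == share of pool supply burned.
function redeem(
  treasury: number,         // DAI currently in the DAO treasury
  totalPoolSupply: number,  // total DataPool tokens outstanding
  poolTokensBurned: number, // tokens this provider sends back
): number {
  if (poolTokensBurned > totalPoolSupply) throw new Error("over-redeem");
  return (treasury * poolTokensBurned) / totalPoolSupply;
}

// Example: $50 treasury, 200 pool tokens outstanding, provider burns 100.
const payout = redeem(50, 200, 100); // -> 25 DAI
```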
Note - in future stages DAO members can propose and vote on decisions that relate to the DataDAO, such as using the proceeds to further grow the dataset, bringing in more buyers, or improving data quality through proposals like data clean-up, image annotation, etc.
How I built it
- Frontend built using React + Redux
- Smart contracts implemented using Truffle + OpenZeppelin + Drizzle
- Interaction with the Ethereum blockchain using Web3.js, Notify.js and Web3Modal
- Interaction with OceanProtocol using ocean.js and @oceanprotocol/react components
- DAO management using DAOStack Arc framework
- DAO metadata management using IPFS
- Integration with Filecoin using Textile Hub, so people can use a permanent, cached, decentralized storage solution for uploading individual data contributions
Challenges I ran into
- How to make sure a contributor does not revoke the data at the URL where it was originally stored
- How to structure the DataPool token / MasterDataToken in the most efficient way, given that each datatoken contributed by a user is a unique token
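One way to think about the MasterDataToken structuring challenge: since every contributed datatoken is a distinct token, the master token can act as a composable wrapper (in the spirit of ERC-998) that tracks which child datatokens it owns. The registry model below is our illustrative simplification, not the deployed contract:

```typescript
class MasterDataToken {
  // child datatoken address -> amount held by the master token
  private children = new Map<string, number>();

  // Record a child datatoken deposited into the composable.
  deposit(datatoken: string, amount: number): void {
    this.children.set(datatoken, (this.children.get(datatoken) ?? 0) + amount);
  }

  // Transferring the master token to the buyer carries all children with it,
  // so no single member ever holds the full dataset before the sale.
  childTokens(): [string, number][] {
    return [...this.children.entries()];
  }
}
```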
Accomplishments that I'm proud of
Building a structure that allows a fully permissionless process for creating a combined dataset from multiple individual contributions. The chosen structure ensures that at any given point in time, no single member is able to hold the entire dataset; the only point at which that happens is when a buyer pays for the combined dataset.

Lots of interest from the community. We kept the project in stealth until submission, so our Discord and Telegram channels are just starting, but here is an example reference indicating there is a clear path to large-scale adoption: https://twitter.com/dennisonbertram/status/1341493651143352321?s=28
What I learned
- While doing the research for this POC, we were amazed by the number of applications that can use the DataDAO concept - AI, ML, curation lists, investment
- We used the ERC-998 composable token standard for deploying the MasterDataToken
What's next for Data DAO
- Use the ERC-998 + ERC-1155 standards in our MasterDataToken to improve user experience
- Enable a more fine-grained access control policy for individual data contributions
- Selection of a validator as a function to increase confidence in a DataDAO dataset and reduce spamming
- Include a staking function – for the dataDAO initiator and for contributors
- DataDAO governance – iterate on and deploy the dataDAO governance scheme