Inspiration

We believe in empowering, and preserving the privacy of, vulnerable people in social need, who are currently entangled in lengthy local-government bureaucracy that takes a lot of personal data and, in the end, their dignity and pride. We help people minimize data disclosure while still reaping the benefits they're entitled to, without feeling they have to beg in their struggle to make ends meet. As a consequence, government process steps are cut to the bare minimum, saving taxpayers' money.

What it does

Using a personal data wallet, our clients selectively disclose personal data through Ocean marketplace services and workflows to issue, load, or verify credentials while preserving privacy. No more lengthy forms to submit: verifiable credentials are checked instantly against the benefits a client is eligible for, and credentials are processed straight through Ocean marketplace services in government application procedures. From these services we collect anonymized transactions as input datasets for AI models that predict the most-chosen benefits and rate services, helping government invest in the right kind of services and benefits. The model and its inputs are published in the Ocean marketplace as free assets, so anyone can audit the inputs for bias and the model itself for transparency, and hold us to account for the fact that it's built with public funding. A sketch of the data we collect and of the eligibility check is shown below.
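To make the two ideas concrete, here is a minimal sketch (all type and field names are ours, not an Ocean API): the shape of an anonymized transaction we collect as AI training input, and a trivial eligibility check of presented credential types against a benefit's requirements.

```typescript
// Hypothetical shapes: what an anonymized transaction could look like.
// No personal data is stored, only credential types and outcomes.
interface AnonymizedTransaction {
  benefitId: string;          // which benefit or service was requested
  credentialTypes: string[];  // e.g. ["IncomeStatement", "ResidencyProof"]
  outcome: "granted" | "denied";
  timestamp: number;          // epoch millis
}

interface Benefit {
  id: string;
  requiredCredentialTypes: string[];
}

// A client is eligible when the wallet can present every required credential type.
function isEligible(benefit: Benefit, presentedTypes: Set<string>): boolean {
  return benefit.requiredCredentialTypes.every((t) => presentedTypes.has(t));
}
```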

How we built it

By publishing qualifying services and workflows in an Ocean Protocol powered Marketplace, we can hook up a Vue.js data wallet, using a simple QR-code service as glue for transferring DIDs and web tokens in a privacy-preserving way, and use service endpoints described by Ocean OEP 8 metadata to disclose and verify information and to issue verifiable credentials.

Every piece of data in the wallet, and every asset in the marketplace, is modeled as a credential and published as a DDO (data asset) in the marketplace. We first publish a base credential DDO, for example a university degree format described with schema.org and issued by a public organization, as a template from which we instantiate (issue) timestamped credential objects for a client. Each issuance is published as a DDO of a Verifiable Credential according to the W3C standard, with OEP 8 service endpoints as the means to interact with the asset: authenticate, read the accompanying data, and verify.

When service endpoints link to public documents, as with the base credential DDO, we use IPFS to download the files, lifting the burden of managing storage facilities. When they link to private or verifiable data, as with a verifiable credential DDO, we use authentication services and a separate DLT to reach the data. A simplified sketch of both shapes follows.
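These are illustrative shapes only: the field names are simplified from the Ocean DDO layout and the W3C Verifiable Credentials data model, and the DIDs, dates, and endpoints are hypothetical placeholders.

```typescript
// A base credential published as a DDO: the template asset in the marketplace.
const baseCredentialDDO = {
  "@context": "https://w3id.org/did/v1",
  id: "did:op:1234...",                  // Ocean DID of the template asset (placeholder)
  service: [
    {
      type: "Metadata",                  // OEP 8 metadata service
      serviceEndpoint: "https://aquarius.example.org/api/v1/assets/ddo/did:op:1234...",
      metadata: {
        base: {
          name: "University Degree (base credential)",
          type: "dataset",
          // the credential format described with schema.org vocabulary
          additionalInformation: {
            schema: "https://schema.org/EducationalOccupationalCredential",
          },
        },
      },
    },
  ],
};

// A credential issued from that template, published per the W3C VC standard.
const verifiableCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "UniversityDegreeCredential"],
  issuer: "did:op:1234...",              // the issuing public organization
  issuanceDate: "2019-04-01T12:00:00Z",  // timestamped at issuance (placeholder)
  credentialSubject: { id: "did:op:5678...", degree: "BSc" },
  proof: {},                             // signature added by the issuer's service endpoint
};
```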

Challenges we ran into

We understand that Ocean Protocol is primarily built for marketing and distributing data assets as fuel for AI models, with services, workflows, and algorithms still work in progress. Our model, by contrast, is built around references to data: links reached through the service endpoint APIs of DDOs. These assets resemble the workflow and service type descriptions intended by Ocean OEP 8 v0.4. To preserve privacy, users can only get to the real data by consuming authenticated service endpoints, along the lines of the sketch below.
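A minimal sketch of that consumption flow, assuming a generic HTTP setup (the endpoint paths, the "Access" service type, and the bearer-token scheme are our assumptions, not a fixed Ocean API): the wallet resolves the DDO by DID, then calls an authenticated service endpoint to reach the data the asset merely references.

```typescript
// Resolve a DDO by DID, then consume its authenticated access endpoint.
async function consumeCredential(did: string, accessToken: string): Promise<unknown> {
  // Resolve the DDO from a metadata store (Aquarius in Ocean's stack).
  const ddoRes = await fetch(`https://aquarius.example.org/api/v1/assets/ddo/${did}`);
  const ddo = await ddoRes.json();

  // Pick the service endpoint that exposes the actual (private) data.
  const svc = ddo.service.find((s: { type: string }) => s.type === "Access");
  if (!svc) throw new Error(`No access service on ${did}`);

  // Only authenticated consumers get past the reference to the real data.
  const dataRes = await fetch(svc.serviceEndpoint, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  return dataRes.json();
}
```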

Service and Workflow asset type

This means we use the Ocean Protocol powered Marketplace primarily for distributing services and workflows that take data(sets), such as a Base Credential, as input parameters. We therefore had to extend the commons GitHub source code to accommodate these asset types. Furthermore, to register a DDO we had to include the required file field, which makes no sense for these asset types; we worked around it with a hack, sketched below. The real juice is in getting a DDO of a workflow: a sort of recipe with Ocean-powered DIDs that tells a consumer how to get things done. The actual work (issue and verify) is done in a separate API for now, but Ocean could handle it as well once Ocean OEP 8 v0.4 reaches production.
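A sketch of the workaround (all values are placeholders, and the workflow section reflects our own recipe format rather than a finalized Ocean schema): the commons market requires a files field on every DDO, which has no meaning for a service or workflow asset, so we register a zero-byte placeholder and keep the useful recipe in the workflow definition.

```typescript
// Workflow asset metadata with a placeholder files entry to satisfy the schema.
const workflowAssetMetadata = {
  base: {
    name: "Issue Base Credential (workflow)",
    type: "workflow",              // asset type we added to the commons code
    files: [
      // zero-byte placeholder: present only to pass DDO registration
      { url: "ipfs://Qm.../placeholder", contentLength: "0" },
    ],
  },
  // The useful part: DIDs telling a consumer how to get the work done.
  workflow: {
    steps: [
      { action: "issue", input: "did:op:1234...", issuer: "did:op:89ab..." },
      { action: "verify", endpoint: "https://api.example.org/verify" }, // separate API for now
    ],
  },
};
```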

Tokens? Not yet

As most of our services are free to use, we don't need tokens yet, but we could introduce some form of tokenization, such as subscription tokens required to use the endpoints. This is something we need to investigate further. Of course, we can still publish the AI model built on actual service usage in the way Ocean intends.

Accomplishments that we're proud of

Though perhaps not quite its intended use, we managed to build on Ocean Protocol as a means to enhance privacy, in an ever-expanding data economy, for a client base vulnerable to personal data exploitation. Thanks to Ocean Protocol we didn't have to reinvent the wheel of managing DDOs and blockchain-based service execution agreements, as we make heavy use of its service-oriented infrastructure. We successfully hacked on the Ocean OEP 8 v0.4 spec to prove that Ocean can help exchange the data needed to check eligibility for social benefits. We hope this makes the world a better place for the socially vulnerable.

What we learned

Searching for implementation guidelines for W3C Verifiable Credentials and DIDs, we stumbled upon the DID method of Ocean Protocol. We found this method the most mature, extended as it is with the metadata sections of Ocean OEP 8 and OEP 12, and decided to go this route. We learned that implementing a bleeding-edge standard like DID takes some flexibility to make ends meet, so we drifted a bit off the (not so) beaten track of Ocean Protocol's intended use. If our method and model scale up (and we're sure they will), we think we'll also help Ocean Protocol scale up in a way that perhaps hasn't been thought of yet.

What's next

We'd like to test our model with an augmented Commons marketplace in our running pilots for local government (cities in the Netherlands) and scale up with tokenization of service endpoints as the next feature. We will also build a Proof of Concept at the Odyssey hackathon in April.

Built With

Vue.js, Ocean Protocol, IPFS, schema.org
