Inspiration
AI is on a fast track to changing society as we know it. Nearly every popular consumer app we interact with every day (Facebook, Google, Snapchat) contains at least one AI feature. Companies are already using intelligent agents to detect malware, prevent money laundering, and automate claims processing at insurance firms.
If 2016 was an experimental year, when researchers all over the world trained different AI models, 2017 is the year AI goes into production, with new use cases emerging in almost every industry.
Upcoming use cases for AR, VR, smart homes, smart cities, or Internet-of-Things will further propel the volume of AI computations.
Problem #1: Non-democratic development of AI and wealth inequality
Big corporations are in an arms race to build data centers and accelerate AI computations from their clouds to serve this large and growing market. Gartner projects the worldwide public cloud services market to reach $246.8 billion in 2017, with growth largely driven by AI. This highly profitable market is controlled and owned by a select few large corporations (the usual suspects). By “controlled” I mean these corporations hold a near-monopoly position from which to shape the evolution of AI in our society. By “owned” I mean these same corporations reap the majority of the financial rewards from our society’s transition to an AI-driven future, because they are the main providers of the “cloud” as we know it, the default computation infrastructure for AI.
Problem #2: Latency in AI computations inhibiting both Dapps and apps with real-time AI functionality
The current cloud-based model causes latency because the data centers, or the “cloud”, where computations are performed are located far from end users’ devices. Geographical distance, round trips as data travel from the local device to the cloud and back, network congestion, and signal collisions make the default cloud-based model unsuitable for upcoming AR, VR, and IoT applications.
Problem #3: Lack of a common communication protocol among intelligent agents
AI agents running on Google Cloud, Amazon AWS, and Microsoft Azure don’t talk to one another. They live on different clouds and are built with different frameworks, from Google, Amazon, and Microsoft, that don’t naturally interface. A well-functioning autonomous world needs a common protocol for intelligent agents to communicate and exchange insights in a trustless way.
What it does
ChainIntel = A decentralized AI computing platform + An AI Algorithm marketplace + A standard communication protocol for intelligent agents
A decentralized AI computing platform. ChainIntel moves AI computations away from the cloud and closer to end users; hence our tagline, the World Intelligence Delivery Network. ChainIntel dynamically transports, replicates, and distributes AI computations across edge nodes close to the end users where input data is generated, reducing latency by avoiding bottleneck round trips to cloud servers while ensuring fault tolerance and the privacy of user data. There are no central data centers where all AI algorithms are computed or all user data is stored and processed. Devices (edge nodes) trade computing resources with one another in real time through smart contracts, in exchange for C-IQ tokens, so the AI computational load can be distributed.
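To make the token-for-compute exchange concrete, here is a minimal sketch in plain JavaScript. The class name `ComputeMarket` and its methods are hypothetical illustrations of the mechanism, not ChainIntel's actual contracts: nodes advertise spare capacity, a requester's C-IQ balance is debited, and the chosen node is credited.

```javascript
// Hypothetical sketch of edge nodes trading compute for C-IQ tokens.
// In ChainIntel this settlement would happen via smart contracts; here an
// in-memory ledger stands in for the on-chain balances.
class ComputeMarket {
  constructor() {
    this.offers = [];               // open capacity offers from edge nodes
    this.balances = new Map();      // C-IQ token balances per participant
  }
  credit(participant, amount) {
    this.balances.set(participant, (this.balances.get(participant) || 0) + amount);
  }
  // An edge node advertises spare capacity at a price per compute unit.
  offer(node, capacity, pricePerUnit) {
    this.offers.push({ node, capacity, pricePerUnit });
  }
  // Assign a job to the cheapest node with enough capacity and settle payment.
  assign(requester, units) {
    const fit = this.offers
      .filter(o => o.capacity >= units)
      .sort((a, b) => a.pricePerUnit - b.pricePerUnit)[0];
    if (!fit) throw new Error('no capacity available');
    const cost = units * fit.pricePerUnit;
    if ((this.balances.get(requester) || 0) < cost) {
      throw new Error('insufficient C-IQ balance');
    }
    this.credit(requester, -cost);  // debit the requester
    this.credit(fit.node, cost);    // pay the edge node
    fit.capacity -= units;
    return { node: fit.node, cost };
  }
}
```

In the real system the balance updates would be token transfers triggered by a contract; the selection logic (cheapest fitting offer) is one plausible policy among many.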
An AI Algorithm marketplace. ChainIntel has a built-in deployment platform that lets AI models built in any framework or language be deployed with a common set of instructions. A built-in optimizer automatically takes care of computation routing, AI distribution across edge nodes, and flexible scaling of computations based on real-time traffic. Through the ChainIntel AI Algorithm marketplace, Dapp or app developers can easily discover and integrate the right AI solutions for their apps. AI developers trade their AI algorithms for C-IQ tokens through smart contracts that are triggered whenever an AI algorithm is requested.
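The pay-per-request mechanism can be sketched as follows. `AlgorithmMarketplace`, its method names, and the pricing model are assumptions made for illustration; the point is that each invocation meters a C-IQ payment to the algorithm's developer, mirroring what the smart contract would do on-chain.

```javascript
// Hypothetical sketch of the marketplace's pay-per-call metering.
// An in-memory earnings ledger stands in for the C-IQ smart contract.
class AlgorithmMarketplace {
  constructor() {
    this.algos = new Map();      // published algorithms by id
    this.earnings = new Map();   // accumulated C-IQ per developer
  }
  publish(id, developer, pricePerCall, fn) {
    this.algos.set(id, { developer, pricePerCall, fn });
  }
  // Each invocation credits the developer, then runs the algorithm.
  invoke(id, input) {
    const algo = this.algos.get(id);
    if (!algo) throw new Error(`unknown algorithm: ${id}`);
    this.earnings.set(
      algo.developer,
      (this.earnings.get(algo.developer) || 0) + algo.pricePerCall
    );
    return algo.fn(input);
  }
}
```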
A standard communication protocol for intelligent agents. Leveraging the aggregation of AI algorithms, and of the intelligent agents that use them, on the ChainIntel platform, ChainIntel provides a set of standard APIs for querying information among intelligent agents. Access permissions are regulated by a set of smart contracts, and exchanged information is recorded on an immutable blockchain for audit purposes.
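A rough sketch of that query flow, with hypothetical names (`AgentProtocol`, `grant`, `query`): a permission check gates each agent-to-agent query, and every exchange is appended to an audit log, standing in for the on-chain record.

```javascript
// Hypothetical sketch of the inter-agent query protocol: permission-gated
// queries with an append-only audit trail (an array stands in for the chain).
class AgentProtocol {
  constructor() {
    this.agents = new Map();       // registered agents and their handlers
    this.permissions = new Set();  // allowed "from->to" query pairs
    this.auditLog = [];            // immutable record of exchanges
  }
  register(name, handler) { this.agents.set(name, handler); }
  grant(from, to) { this.permissions.add(`${from}->${to}`); }
  query(from, to, payload) {
    if (!this.permissions.has(`${from}->${to}`)) {
      throw new Error(`permission denied: ${from} -> ${to}`);
    }
    const result = this.agents.get(to)(payload);
    this.auditLog.push({ from, to, payload, result }); // audit trail entry
    return result;
  }
}
```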
How we built it
We organize independent edge nodes into clusters in which AI algorithms are distributed and replicated for parallel processing and fault tolerance. A real-time optimizer manages the computational load among clusters by routing requests to the closest clusters to achieve the fastest response times.
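The routing decision can be illustrated with a small sketch. The cluster shape (`rttMs`, `models`) is an assumed data model, not the optimizer's real interface: each cluster reports a measured round-trip time, and a request goes to the lowest-latency cluster that hosts the requested model.

```javascript
// Illustrative latency-based router (not the production optimizer): pick the
// lowest-RTT cluster among those that have the requested AI model deployed.
function routeRequest(clusters, modelId) {
  const candidates = clusters.filter(c => c.models.includes(modelId));
  if (candidates.length === 0) {
    throw new Error(`model ${modelId} is not deployed on any cluster`);
  }
  // Smallest measured round-trip time wins.
  return candidates.reduce((best, c) => (c.rttMs < best.rttMs ? c : best));
}
```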
AI models are saved on IPFS for fast and secure access. Input data is split across nodes for processing to ensure data privacy. libp2p is used for peer-to-peer communication among nodes in the ChainIntel network.
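The privacy-by-partition idea can be sketched simply. This is a deliberate simplification (the real system partitions neural-network computation, not just raw arrays): input items are dealt round-robin across nodes so no single node sees the full input.

```javascript
// Simplified sketch of splitting input data across edge nodes so that no
// single node holds the complete input (privacy by partitioning).
function shardInput(data, nodeCount) {
  const shards = Array.from({ length: nodeCount }, () => []);
  // Round-robin assignment: item i goes to node i mod nodeCount.
  data.forEach((item, i) => shards[i % nodeCount].push(item));
  return shards;
}
```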
The Ethereum blockchain and smart contracts serve as the transactional and permission/access-tracking layer that rewards nodes for processing AI computations.
AI models are preprocessed and converted into binarized or 8-bit networks for faster execution. This method is elaborated in the paper “Embedded Binarized Neural Networks” (link), authored by our team member and Technical Lead.
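The 8-bit side of this preprocessing can be illustrated with basic linear quantization. This is a generic sketch of the idea, not the method from the cited paper (binarization there is more involved): weights are rescaled into the signed 8-bit range and a single scale factor is kept to approximately recover them.

```javascript
// Generic linear 8-bit quantization sketch: map float weights into the
// int8 range [-127, 127] with one shared scale factor per tensor.
function quantize8bit(weights) {
  const max = Math.max(...weights.map(Math.abs));
  const scale = max / 127 || 1;                       // guard: all-zero weights
  const q = weights.map(w => Math.round(w / scale));  // int8 values
  return { q, scale };
}

// Approximate recovery of the original floats for inference-time math.
function dequantize({ q, scale }) {
  return q.map(v => v * scale);
}
```

The payoff is that int8 arithmetic and 4x-smaller weight storage speed up inference on constrained edge devices, at the cost of a small rounding error per weight.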
AI model computations are further optimized by discovering potential early exits, as described in the paper “BranchyNet: Fast Inference via Early Exiting from Deep Neural Networks” (link), authored by our team member and Technical Lead.
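The early-exit control flow can be sketched as follows. The branch functions, the confidence measure (one minus normalized entropy), and the threshold value are illustrative stand-ins for the paper's formulation: exit branches run in order, and inference stops at the first branch whose prediction is confident enough.

```javascript
// Sketch of BranchyNet-style early exiting. Each branch maps an input to a
// probability distribution over classes; we stop at the first branch whose
// confidence clears the threshold, skipping the deeper (more costly) layers.
function entropy(probs) {
  return -probs.reduce((sum, p) => sum + (p > 0 ? p * Math.log2(p) : 0), 0);
}

function earlyExitInfer(branches, input, threshold = 0.5) {
  for (let i = 0; i < branches.length; i++) {
    const probs = branches[i](input);
    // Confidence = 1 - entropy normalized by the maximum possible entropy.
    const conf = 1 - entropy(probs) / Math.log2(probs.length);
    if (conf >= threshold || i === branches.length - 1) {
      return { exit: i, prediction: probs.indexOf(Math.max(...probs)) };
    }
  }
}
```

On "easy" inputs the first (shallow) branch is already confident, so most of the network is never executed; hard inputs fall through to the final branch.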
AI models are distributed across edge nodes following the methods described in the paper “Distributed Deep Neural Networks over the Cloud, the Edge and End Devices” (link), also authored by our team member and Technical Lead.
Presentation
Check out our presentation deck here: https://drive.google.com/file/d/0B9OVZyM96xH-SFR6Z3VBOVl3TUE/view?usp=sharing
Built With
- ai
- deep-learning
- ethereum
- html5
- ipfs
- javascript
- libp2p
- raiden
- scss
- solidity
- ts
- web3
