NeuraNFT: Tokenizing Intelligence

Inspiration

The inspiration for NeuraNFT stemmed from a personal desire to customize AI models to better suit individual needs. As I delved deeper into the AI landscape, I uncovered a significant gap in how value is distributed among the key stakeholders of the AI ecosystem. This ecosystem comprises four primary stakeholders:

  1. Data Owners: Individuals or organizations that possess valuable data sets crucial for training AI models.
  2. Model Owners: AI researchers and developers who create and refine AI algorithms and architectures.
  3. Hosting Platforms: Infrastructure providers that offer the computational resources necessary for training and deploying AI models.
  4. End Users: Consumers or businesses that utilize AI models for various applications.

While each of these stakeholders contributes essential value to the AI ecosystem, I observed that the current market structure often fails to equitably distribute the benefits and rewards. For instance:

  • Data owners often have little control over how their data is used and rarely receive direct compensation for its value.
  • Model owners may struggle to monetize their innovations effectively, especially if they lack the resources for large-scale deployment.
  • Hosting platforms, while essential, can sometimes extract disproportionate value due to their control over infrastructure.
  • End users might face high costs or limited customization options, reducing the accessibility and applicability of AI technologies.

This realization led to the conception of NeuraNFT, a platform designed to rebalance the value proposition for all stakeholders involved. By leveraging blockchain technology and NFTs, NeuraNFT aims to:

  • Provide data owners with greater control and potential for monetization of their data.
  • Offer model owners a direct path to market their innovations and receive fair compensation.
  • Create a more competitive and decentralized hosting environment, potentially reducing costs and increasing efficiency.
  • Empower end users with more affordable, customizable, and transparent AI solutions.

Through this approach, NeuraNFT seeks to democratize AI technology, fostering a more equitable, innovative, and collaborative ecosystem that benefits all participants in the AI value chain.

What it does

NeuraNFT is a blockchain-based AI system built on the Internet Computer Protocol (ICP). It addresses the growing need for personalized, secure, and decentralized artificial intelligence by combining the privacy and security of blockchain with a decentralized high-performance computing (HPC) infrastructure for deploying machine learning models securely.

At its core, NeuraNFT leverages Non-Fungible Tokens (NFTs) to represent AI models, including their fine-tuning data, architecture information, and training parameters. This approach enables true ownership of AI models, allowing users to buy, sell, or rent them in a decentralized marketplace. The system utilizes a network of HPC nodes for model training and inference, ensuring high performance while maintaining decentralization.

NeuraNFT also prioritizes user privacy by implementing advanced cryptographic techniques for computations on encrypted data. The platform incorporates tools for customizing and fine-tuning models, as well as an ethical AI framework to promote responsible development. With its scalable infrastructure and interoperability features, NeuraNFT aims to create a more accessible, innovative, and fair AI ecosystem for all stakeholders involved.

How we built it

We built NeuraNFT using a combination of blockchain technology, specifically the Internet Computer Protocol (ICP), and high-performance computing infrastructure. The system utilizes smart contracts (canisters in ICP) for managing NFTs, user authentication, and interaction with the HPC nodes.

For our AI model backend, we used Ollama to work with LLaMA 3.1 as our base model. We chose LLaMA 3.1 for its powerful language understanding capabilities and Ollama for its efficient management of large language models. To ensure scalability and ease of deployment, we containerized our Ollama setup using Docker.
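As a sketch of how a backend talks to Ollama, the snippet below builds the JSON payload for Ollama's `/api/generate` endpoint. The endpoint and fields (`model`, `prompt`, `stream`) follow Ollama's HTTP API; the host, port, and model tag are the defaults and are assumptions about any given setup. The actual request is left commented out, since it needs a running Ollama server.

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_generate_payload(prompt: str, model: str = "llama3.1") -> dict:
    """Build the request body for Ollama's /api/generate endpoint."""
    return {
        "model": model,      # model tag, as pulled via `ollama pull llama3.1`
        "prompt": prompt,
        "stream": False,     # return a single JSON object instead of a stream
    }

payload = build_generate_payload("Summarize this model card.")
body = json.dumps(payload)

# Sending the request requires a running Ollama server, e.g.:
# import urllib.request
# req = urllib.request.Request(OLLAMA_URL, data=body.encode(),
#                              headers={"Content-Type": "application/json"})
# response = urllib.request.urlopen(req)
```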

We implemented a prototype using a single HPC node for AI model training, fine-tuning, and inference, with plans to transition to a fully decentralized, blockchain-based compute network in the future. This approach allowed us to demonstrate the core functionalities of NeuraNFT while laying the groundwork for a more distributed system.

Challenges we ran into

One of the most significant challenges we faced was implementing the HTTPS outcalls feature of ICP. This proved to be more complex than initially anticipated:

We tried various approaches: plain HTTP calls, unsecured HTTPS, localhost, HTTPS without a domain, and HTTPS with a domain but without a proper SSL certificate. None of these worked due to ICP's strict requirements. We even purchased a domain, generated an SSL certificate, and hosted the service on our home server, but it still failed. After further investigation, we realized that ICP requires IPv6 connectivity, which our IPv4-only home network doesn't support. In summary, ICP's HTTPS outcalls require:

  • HTTPS-secured calls with a valid SSL certificate on a proper domain
  • IPv6 connectivity
  • No localhost or raw IPv4 addresses

We concluded that ngrok might work, since it provides HTTPS connectivity that aligns with these requirements.

A big thanks to the ICP team for helping us out with this challenge.

Another challenge was hosting the AI model efficiently. Initially, every rebuild of the Docker container reinstalled all of its packages, which was time-consuming and data-intensive. To address this, we implemented a package caching strategy, which significantly reduced build times and data usage and made running the container smoother and more efficient. This optimization was crucial for maintaining a responsive and resource-efficient development environment.
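The caching strategy boils down to ordering Dockerfile layers so that dependency installation is cached independently of source changes. A minimal sketch follows; the base image and file names are illustrative assumptions, not our exact setup:

```dockerfile
# Illustrative base image; our actual image wraps the Ollama setup.
FROM python:3.11-slim

# Copy only the dependency manifest first, so this layer -- and the
# expensive install below -- stays cached until requirements.txt changes.
COPY requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt

# Source changes invalidate only the layers from here down.
COPY . /app
```

With this ordering, `docker build` reuses the cached install layer on every rebuild that only touches application code.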

Accomplishments that we're proud of

One of our major accomplishments was learning and implementing new technologies from scratch, particularly ICP and Rust, which have steep learning curves. Iterating over various architectures was a rewarding process, and writing our first litepaper gave us a sense of pride and accomplishment. We're proud of creating a system that has the potential to democratize AI ownership and usage while maintaining high standards of security and privacy.

What we learned

This project was a significant learning experience. We learned to approach problems from multiple angles when faced with obstacles, particularly the HTTPS outcalls issue. We gained proficiency in Rust programming and learned how to deploy models like LLaMA with Docker. Writing a litepaper also taught us how to articulate complex technical concepts clearly and concisely.

What's next for NeuraNFT

The future of NeuraNFT is exciting and ambitious. Our roadmap includes:

  1. Transitioning to a distributed HPC network for improved scalability and performance.
  2. Developing a fully blockchain-based compute system for AI model execution.
  3. Enhancing NFT functionality to allow for model composition and fractional ownership.
  4. Creating a decentralized marketplace for AI model NFTs.
  5. Implementing a DAO for ecosystem governance.
  6. Improving interoperability with other blockchain networks and AI systems.
  7. Enhancing privacy and security features, including homomorphic encryption and differential privacy.
  8. Developing an ethical AI framework and compliance tools.
  9. Expanding AI capabilities to handle various data types and enable continuous learning.
  10. Focusing on real-world integration, including IoT device integration and enterprise-grade solutions.

We're committed to positioning NeuraNFT at the forefront of decentralized AI technology, creating a robust, scalable, and innovative ecosystem that revolutionizes how AI models are created, owned, and utilized in a decentralized world.
