## Inspiration
"Borrowing Against NFTs Is Now a $1 Billion Industry," according to Dune Analytics and Decrypt. A few reputed protocols on the Ethereum blockchain facilitate this service: NFTfi has facilitated more than $390 million, BendDAO boasts nearly $298 million, and Paraspace has already hit $236 million. The number of cumulative users has also soared well above 40,000. NFT-backed lending protocols flourish mainly in the metaverse wherever finance is involved, e.g. buying in-game assets, investing in liquidity pools to earn APY, earning staking rewards from game pools, and, in the DeFi space, liquidating NFTs to increase returns. NFTs and the metaverse are major domains in the entertainment and gaming industry, and a number of developments around these domains are happening on the Polygon network, which makes "GainX" a high-utility application for the community and the ecosystem. GainX can open new pathways for in-game finance (e.g. lending, borrowing, staking). Despite these opportunities, the Polygon ecosystem does not yet have any such lending protocol for its users. Hence, we came up with the idea of GainX: the first reputation-based P2P NFT-collateralized lending protocol governed by a DAO in the Polygon ecosystem.
## What it does
GainX aims to solve the common problems faced by any borrower in the metaverse/gaming space, or by a user wanting liquidity to gain leverage. The four main problems it solves are:
Over-collateralized loans are the most common type of loan in today's market, but not everyone has that much liquidity or asset worth. To solve this, GainX sets up a reputation mechanism. Reputation depends mainly on transaction history, loan repayment history, and the amount staked in the GainX liquidity pool. A borrower can also stake money in the protocol to qualify for under-collateralized loans in some cases.
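As a rough illustration, the reputation mechanism could combine these three signals into a single score that scales the required collateral. The weights, caps, and function names below are our own illustrative assumptions, not the exact on-chain formula:

```python
def reputation_score(tx_count: int, repaid_loans: int, total_loans: int,
                     staked_amount: float, pool_tvl: float) -> float:
    """Illustrative reputation score in [0, 1] built from the three inputs
    GainX tracks: transaction history, repayment history, and stake.
    The weights are hypothetical, not the deployed parameters."""
    # Transaction history: saturates after ~100 transactions.
    tx_factor = min(tx_count / 100.0, 1.0)
    # Repayment history: fraction of past loans repaid (1.0 if no loans yet).
    repay_factor = repaid_loans / total_loans if total_loans else 1.0
    # Stake: borrower's share of the liquidity pool, capped at 10%.
    stake_factor = min(staked_amount / pool_tvl, 0.1) / 0.1 if pool_tvl else 0.0
    return 0.2 * tx_factor + 0.5 * repay_factor + 0.3 * stake_factor

def required_collateral_ratio(score: float) -> float:
    """Map reputation to a collateral requirement: a perfect score allows
    under-collateralized loans at 60%, a zero score demands 150%."""
    return 1.5 - 0.9 * score
```

For example, a borrower with a long transaction history and a perfect repayment record but no stake would score 0.7 and be asked for roughly 87% collateral instead of 150%.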
Hedging and risk management through derivatives. NFTs and cryptocurrencies are very volatile at this point in time and need advanced risk management and tolerance mechanisms. Derivatives are an effective solution: a futures contract can be used to lock in an APY return from the borrower, and a put option can serve as insurance against a steep drop in an NFT's valuation.
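A minimal sketch of the put-option insurance idea, assuming a cash-settled payoff at expiry (the function names and numbers are illustrative, not part of the protocol's contracts):

```python
def put_payoff(strike: float, spot: float) -> float:
    """Payoff of a put option at expiry: pays out when the NFT's
    valuation falls below the strike, acting as insurance."""
    return max(strike - spot, 0.0)

def hedged_collateral_value(spot: float, strike: float) -> float:
    """Collateral value with put protection: losses below the strike
    are offset by the option payoff, flooring the value at the strike."""
    return spot + put_payoff(strike, spot)
```

For instance, if an NFT valued at 100 MATIC is hedged with a put struck at 80 and its price crashes to 50, the lender's effective collateral is 50 + 30 = 80, capping the downside.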
A more accurate and scalable machine learning model that predicts future NFT prices. Our state-of-the-art model uses a Recurrent Neural Network as its base architecture, which has proven results in various industrial applications. We also use the NeuralProphet forecasting architecture by Meta, which combines the power of statistical algorithms and neural network architectures to make forecasting more generalized and robust. The model takes past NFT transactions as input data and learns the underlying patterns; this process is training. Once trained, the model can predict NFT prices any number of months ahead. Given sufficient data, we can also analyze future market trends such as the CPI and NFT indices, which facilitate valuation, portfolio tracking, lending and borrowing, and collateralisation of non-fungible tokens.
We used Google's TensorFlow framework to build the Recurrent Neural Network, with the Adam optimizer, a stochastic gradient method, to update the network's weights. We also used the NeuralProphet module maintained by Meta to make more optimized predictions.
Our sentiment analysis model utilizes state-of-the-art machine learning techniques and a Random Forest Classifier to accurately classify the sentiment of news articles in the NFT lending market. By leveraging the ensemble-based Random Forest Classifier, the model effectively captures intricate patterns and relationships in an article's features to make accurate sentiment predictions. Preprocessing and extracting text from news article URLs are crucial steps: techniques like tokenization and lemmatization enhance data quality and enable accurate sentiment analysis.
Flexible offers between peers, combining P2P lending flexibility with AMM-like lending security. Initiating an offer is simple and interactive between two users, as in a P2P lending protocol. But once the borrower repays the loan, the lender is not obliged to redeem their tokens and cash out their earnings. After an offer ends, a one-week cooling period is provided, after which a lender may let the protocol invest their funds in the liquidity pool, earning additional APY until they cash out. The pool APY is calculated as the ratio of the total worth of NFTs held in our escrow to the Total Value Locked (TVL) by lenders in our liquidity pool, per block.
## How we built it
We worked across three major domains to make this product best suited to the market and useful to the community.
Blockchain and the Polygon chain: Deals, derivatives and position management, and lending are all automated and secured using smart contracts. All computations are made on-chain to ensure data integrity and transparency between dealers. Futures and options contracts are maintained as smart contracts to avoid middlemen and make the system as trustless as possible. We wrote the smart contracts in Solidity with Hardhat, using OpenZeppelin for secure contract building blocks. The Polygon chain powers our application with high-speed transactions and low gas prices, best suited for scaling and mass adoption; it makes the user experience seamless and fast compared to other blockchains.
AI/ML model: Time series prediction using a Recurrent Neural Network (RNN). The pipeline fetches NFT price data from an API, preprocesses the data, trains an LSTM (Long Short-Term Memory) model, and predicts the price after a specified period. The current implementation uses stochastic optimization algorithms to train the model for accurate predictions. A Sequential model is created using Keras with an LSTM layer, compiled with the mean squared error loss function and the Adam optimizer, and trained on the training data. We also designed a NeuralProphet architecture using Meta's module for more optimized and accurate forecasting results.
Refer to this to understand more: https://bit.ly/3MtZVSA (Documentation)
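The Keras setup described above can be sketched roughly as follows, using a synthetic price series in place of the API data; the window size, layer width, and epoch count are illustrative choices, not the tuned values:

```python
import numpy as np
import tensorflow as tf

# Synthetic price series standing in for NFT floor prices fetched from an API.
prices = np.sin(np.linspace(0, 10, 200)) + 2.0

def make_windows(series, lookback=10):
    """Slice the series into (lookback, 1)-shaped inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return np.array(X)[..., None], np.array(y)

X, y = make_windows(prices)

# Sequential model with a single LSTM layer, compiled with mean squared
# error and the Adam optimizer, mirroring the setup described above.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(loss="mean_squared_error", optimizer="adam")
model.fit(X, y, epochs=2, batch_size=16, verbose=0)

# Predict the next price from the last observed window.
next_price = model.predict(prices[-10:][None, :, None], verbose=0)
```

The real pipeline replaces the synthetic series with historical transaction data and trains for many more epochs before forecasting months ahead.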
Our sentiment analysis model utilizes a labeled dataset of NFT and crypto-related news articles to train a Random Forest Classifier. The dataset includes features like date, subject, title, URL, and article text, along with sentiment scores. We preprocess the text data using industry-standard NLP techniques like tokenization, stopword removal, and lemmatization, then extract informative features like word frequencies and n-grams. The Random Forest Classifier learns patterns and relationships between features and sentiment labels; the trained model can then predict sentiment on new articles. Refer to this to understand more: http://bitly.ws/GH9n (Documentation)
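A condensed scikit-learn sketch of this pipeline, using a tiny hand-labeled sample in place of the real dataset; TF-IDF with stopword removal stands in here for the full tokenization and lemmatization preprocessing described above:

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

# Tiny hand-labeled stand-in for the scraped news-article dataset.
articles = [
    "NFT lending volume surges as new protocols attract record users",
    "Floor prices collapse and lenders face mounting liquidations",
    "Strong demand lifts NFT collateral valuations across marketplaces",
    "Exploit drains lending pool, borrowers lose confidence in NFTs",
]
labels = ["positive", "negative", "positive", "negative"]

# Word-frequency / n-gram features feeding an ensemble of decision trees.
pipeline = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), stop_words="english"),
    RandomForestClassifier(n_estimators=100, random_state=42),
)
pipeline.fit(articles, labels)

# Classify the sentiment of an unseen headline.
pred = pipeline.predict(["Record users and surging volume lift NFT lending"])
```

The production model trains on thousands of articles with richer features, but the structure, text vectorization followed by a Random Forest, is the same.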
Tokenomics and APY calculation: Our application takes a hybrid approach to preserve the flexibility of P2P offers and the investment security of an AMM. The liquidity pool's APY is determined by the ratio of the total worth of NFTs held in our escrow to the Total Value Locked (TVL) by lenders, per block. This yields an APY that accounts for factors like protocol demand and lender and borrower sentiment.
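As a sketch, the escrow-to-TVL ratio can be treated as an annual rate that accrues per block and compounds; the block count below assumes Polygon's roughly two-second block time, and the exact on-chain formula may differ:

```python
def pool_apy(escrow_nft_value: float, pool_tvl: float,
             blocks_per_year: int = 15_768_000) -> float:
    """Illustrative pool APY: the escrow-to-TVL ratio acts as the simple
    annual rate (a demand signal), accrued each block and compounded.
    blocks_per_year assumes Polygon's ~2-second blocks."""
    if pool_tvl <= 0:
        return 0.0
    annual_rate = escrow_nft_value / pool_tvl      # demand for the protocol
    per_block = annual_rate / blocks_per_year      # yield accrued per block
    return (1 + per_block) ** blocks_per_year - 1  # compounded APY
```

With 500k MATIC worth of NFTs in escrow against a 10M MATIC TVL, the simple rate is 5% and per-block compounding lifts the APY to about 5.13%. As escrow value grows relative to TVL, the APY rises, rewarding lenders when demand for loans is high.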
## Challenges we ran into
Lack of data was a major challenge and forced us to reconsider the model architecture. Rigorous testing to ensure the model's outcomes were accurate was another. To counter this, we devised an ML model that gives accurate results despite the limited data, and it shows impressive results. We still need more data to analyze the neural network architecture and make the model more generalized. NeuralProphet dependency mismatches were also a big challenge, as newer versions of its dependencies do not sync with older ones.
We were not very familiar with tokenomics and had to brainstorm a lot to manage the APYs and the liquidity. After much study, we arrived at a user-governed, supply-demand and sentiment-based calculation that provides a fair valuation for both investors (APY) and borrowers (APR).
Training a sentiment analysis model with a dataset containing over 5000 news article URLs poses several challenges. The primary challenge is the computational effort required to parse and preprocess such a large dataset. Processing a large volume of text data necessitates substantial computational resources, including memory and processing power. Though this can enhance the model's generalization and performance on unseen data, the computational effort required for preprocessing, such as tokenization, stopword removal, and lemmatization, increases significantly.
## Accomplishments that we're proud of
The fact that we were able to build the prototype with a good UI makes us really proud. Things went wrong a number of times and we had to brainstorm our way out; we worked under relatively high pressure and are happy with the output of our efforts. The team took full responsibility for their work and showed excellent teamwork throughout the hackathon. We also feel good about having built this product with the demand for NFTs in the metaverse and DeFi on the Polygon network in mind. We carefully identified users' needs and devised a solution that is both practical and incentivizing for borrower and lender alike, making it a utility protocol for the community.
Some notable technical accomplishments:
- An accurate NFT price estimation neural network considering unconventional parameters like market sentiment, trade volume, active users, floor price, sales, and more.
- Chainlink Automation gives our smart contracts the security and power to execute functions on timers for insurance, hedging, and loan repayment.
- The frontend and design of GainX are built to best suit the Web3 community, with a pleasant and easy-to-use interface and experience.
## What we learned
For this project I needed to learn about the current lending and borrowing market and the technology and mechanisms involved. I also dove deep into P2P lending and Automated Market Maker protocols.
I learned about stocks, bonds, futures, and options: basically, the finance world and its mechanisms. I was really amazed by all the concepts and happy that I could improve on traditional systems with the help of blockchain and Polygon.
## What's next for GainX
Generalization and optimization of the ML model architecture by replacing the stochastic gradient algorithm with a Bayesian algorithm, giving the model the capacity to tell when its predictions are not accurate and increasing its robustness. As users continue to use our platform we will accumulate more data and use it to make the model more accurate; the model will learn automatically from the data accumulated as user interaction increases.
In the future we plan to expand the project and support more collections to attract a wider range of users. The machine learning model will then have more data to process and produce more accurate results.
Continuous integration with the OpenSea marketplace to update data in real time, covering more metaverse projects and ecosystems.
Fine-tuning and optimization of the sentiment analysis model for enhanced risk assessment using real-time market data, Discord, and an expanded feature set.
Moreover, we plan to get dedicated cloud-based computational resources on AWS. Dedicated cloud servers like AWS provide scalable resources, enabling efficient processing and training of large ML models by leveraging distributed computing power, reducing computational bottlenecks, and managing memory requirements.
We are also working on an AI model that will combine the latest Generative Adversarial Networks (GANs), Large Language Models (LLMs), and convolutional neural networks (CNNs) with the current model to accurately estimate the performance of even those NFTs that are not currently for sale.
## Built With
- artificial-intelligence
- flask
- machine-learning
- neural-network
- nextjs
- openai
- python
- rapidapi
- solidity
- typescript