Bitcoin historical price with its relevant news. This is manually curated for new users who want to learn what moves the crypto price.
Wallet explorer. This uses the Pagoda API.
TVL of a selected chain (NEAR by default), covering dozens of other chains. We're using the Footprint Network service.
Whale explorer for NFT marketplaces, to learn the list of whales from other marketplaces. This uses the Pagoda API.
List of coins and the total value of a wallet. This uses the Pagoda API.
List of NFTs held by a whale. This uses the Pagoda API.
Exchange inflows and outflows, to learn which exchanges have volatility. We're using the Footprint Network service.
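The exchange flow feature above boils down to a per-exchange net-flow aggregation. A minimal sketch in plain Python, assuming a flat list of transfer records (the field names here are illustrative, not the actual Footprint Network schema):

```python
from collections import defaultdict

def exchange_netflows(transfers):
    """Aggregate inflow, outflow, and net flow per exchange.

    `transfers` is a list of dicts with illustrative fields:
    {"exchange": str, "direction": "in" | "out", "amount_usd": float}
    """
    flows = defaultdict(lambda: {"inflow": 0.0, "outflow": 0.0})
    for t in transfers:
        key = "inflow" if t["direction"] == "in" else "outflow"
        flows[t["exchange"]][key] += t["amount_usd"]
    # Net flow > 0 means more funds entering the exchange than leaving it.
    return {ex: {**f, "net": f["inflow"] - f["outflow"]} for ex, f in flows.items()}

transfers = [
    {"exchange": "ExchangeA", "direction": "in", "amount_usd": 500.0},
    {"exchange": "ExchangeA", "direction": "out", "amount_usd": 200.0},
    {"exchange": "ExchangeB", "direction": "out", "amount_usd": 100.0},
]
print(exchange_netflows(transfers)["ExchangeA"]["net"])  # 300.0
```

A large positive or negative net flow on one exchange is the volatility signal the feature surfaces.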
Creating an ETH indexer; however, we ran out of storage, since the ETH data is almost 13 TB and an SSD is required.
We're trying to load the data from the archival snapshot. Unfortunately, it takes around 22-30 days to finish. We use AWS for this computation.
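The numbers above give a rough sense of the scale involved. A back-of-the-envelope calculation of the sustained throughput needed to churn through ~13 TB in the best-case 22 days:

```python
# Rough sustained throughput to process ~13 TB in 22 days (best case above).
dataset_bytes = 13 * 10**12          # ~13 TB of ETH data
seconds = 22 * 24 * 60 * 60          # 22 days
mb_per_s = dataset_bytes / seconds / 10**6
print(f"{mb_per_s:.1f} MB/s sustained")  # 6.8 MB/s sustained
```

The sequential rate looks modest, which hints that the real cost in this kind of unpacking job is random I/O, hence the SSD requirement.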
I have two perspectives that I can share:
1. As Lead Product at Paras NFT Marketplace
I want to understand our top 10% of users' spending behavior:
- Where and what they spend
- Which other marketplaces they use
- Whether popular collections not listed on Paras are available on other marketplaces
- How many traders vs. collectors, etc.
Querying manually is tedious, and I want to be alerted if other marketplaces list a collection that attracts volume we don't have. My goal is to get ahead of the market by learning what our whales and competitors are doing, so we know what to do next.
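The two queries described above reduce to a ranking and a set difference. A minimal sketch (all record shapes and names here are hypothetical, standing in for real on-chain data):

```python
def top_spenders(spend_by_user, fraction=0.10):
    """Return the top `fraction` of users ranked by total spend."""
    ranked = sorted(spend_by_user.items(), key=lambda kv: kv[1], reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return [user for user, _ in ranked[:k]]

def missing_collections(our_listings, other_marketplace_listings):
    """Collections trading on competitor marketplaces but not listed on ours."""
    return sorted(set(other_marketplace_listings) - set(our_listings))

# 20 hypothetical users; user20 spends the most.
spend = {f"user{i}": float(i) for i in range(1, 21)}
print(top_spenders(spend))                               # ['user20', 'user19']
print(missing_collections(["apes"], ["apes", "punks"]))  # ['punks']
```

The alert the text mentions would simply fire whenever `missing_collections` returns a non-empty list for a competitor's feed.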
2. As a crypto trader
I lost more than $8K in my first year of trading; after the past 3 years in crypto, I learned that to win at high-risk trading, I need to know the following:
- What is the correlation between macroeconomic factors (interest rates, Fed decisions, inflation) and crypto prices?
- When is the upcoming macro event?
- How's the market sentiment? Is there other important news, such as FTX and LUNA becoming insolvent?
- What are the inflows/outflows of DEXs?
My goal as a trader is clear: growing my assets.
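The first question above, correlation between a macro series and crypto prices, can be answered with a plain Pearson correlation. A self-contained sketch; the sample numbers are made up purely for illustration:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical sample: Fed funds rate (%) vs. BTC monthly return (%).
fed_rate   = [0.25, 0.50, 1.00, 1.75, 2.50]
btc_return = [12.0, 5.0, -3.0, -10.0, -20.0]
print(round(pearson(fed_rate, btc_return), 2))  # strongly negative in this sample
```

In the product, the same computation would run over real rate announcements and price history rather than toy numbers.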
To do this, I use multiple products such as Nansen, Pikespeak, and Whalestats. Here's what I learned:
Pikespeak is a web3 product analytics tool focused on NEAR. There are 2 main reasons why I don't like it:
- Terrible onboarding. I was furious because they do not provide a trial mode; you need to pay $100 just to try it.
- Unclear user segmentation. Their insights are focused on traders, but we approached them as product managers; I wanted to learn about hundreds of wallets at once and get a summary of them. For example, their wallet explorer only allows a single input instead of a CSV file upload.
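The CSV batch input missing from Pikespeak is straightforward to support. A minimal sketch using only the standard library (the `wallet` column name is an assumption about the upload format):

```python
import csv
import io

def load_wallets(csv_text):
    """Parse a CSV with a `wallet` column into a deduplicated address list."""
    reader = csv.DictReader(io.StringIO(csv_text))
    seen, wallets = set(), []
    for row in reader:
        addr = row["wallet"].strip()
        if addr and addr not in seen:
            seen.add(addr)
            wallets.append(addr)
    return wallets

sample = "wallet\nalice.near\nbob.near\nalice.near\n"
print(load_wallets(sample))  # ['alice.near', 'bob.near']
```

Each address in the resulting list would then be fed to the wallet explorer, turning hundreds of manual lookups into one upload.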
Nansen.ai is a leader in web3 analytics and has excellent capabilities. However, it is currently unavailable on NEAR, and its user segmentation is also unclear; $150 per month for the basic premium plan is enormous for retail traders.
What we're building
After talking with dozens of retail traders, we decided to build Profond.ai. Profond.ai provides deeper web3 insights for traders and enterprises to dive deeper and get ahead of the market.
- For traders: We provide the correlation between macro events and news and the Bitcoin price. Traders can also look at historical market trends, find undervalued assets (such as cheap NFTs), and learn what whales hold and how, which is believed to help grow assets.
- For enterprises: Marketplaces like Paras, Mintbase, and Few & Far need to understand their users' spending behavior. With Profond, they get a summary of how their users' wallets perform, where they mostly spend, on which collections, and when. With this, a marketplace can understand how its competitors and users behave and plan its next move to get ahead of the market.
How we built it
- First iteration - Building from archival data: we used the NEAR data archival snapshot, loaded onto our AWS infrastructure. But it took around 22 days to unpack from RocksDB to Postgres, so it wouldn't be ready in time for Metabuild; the unpack is still ongoing for our further development. For EVM, we're using BigQuery to pull the data, which took around 2 days.
- Second iteration - Using the public Postgres: We often faced timeouts, and the latency was sluggish. We even reached out to Bohdan and Tiffany (the NEAR data team) for a private Postgres instance, but timeouts remained a concern.
- Third iteration - Using Pagoda & Footprint Network: To make development faster and simpler for Metabuild, we decided to use Pagoda and Footprint Network 9 days before the submission.
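For the EVM pull in the first iteration, Google's public BigQuery dataset (`bigquery-public-data.crypto_ethereum`) is the usual source for this kind of job; we assume a query along these lines, not the exact one used. The sketch only builds the SQL string, since actually running it requires the `google-cloud-bigquery` client and GCP credentials:

```python
def evm_transfer_query(start_date, end_date, limit=1000):
    """Build a SQL query over BigQuery's public Ethereum token_transfers table.

    Dates are 'YYYY-MM-DD' strings. Executing the query (omitted here) would
    use google-cloud-bigquery with valid GCP credentials.
    """
    return f"""
    SELECT token_address, from_address, to_address, value, block_timestamp
    FROM `bigquery-public-data.crypto_ethereum.token_transfers`
    WHERE DATE(block_timestamp) BETWEEN '{start_date}' AND '{end_date}'
    LIMIT {limit}
    """

sql = evm_transfer_query("2022-11-01", "2022-11-07")
print("token_transfers" in sql)  # True
```

Pulling from a maintained public dataset like this is what makes the EVM side a ~2-day job instead of a multi-week archival unpack.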
Challenges we ran into
- Operational cost: Providing real-time data analytics is expensive. We somehow managed to reduce the ops cost by requesting a trial from AWS and GCP.
- Data ops: Since we run on a limited budget, we went through trial and error with several open-source tools before finally understanding their limitations, and ended up using AWS + GCP.
- Reading all the RocksDB blocks exceeded our expectations: it took 22-30 days to complete. We overcame this by using Pagoda.
Accomplishments that we're proud of
- AWS supported us with $10K in credits through AWS Activate.
- Building a company in the US was easier than I thought. We've decided to take this product further by establishing a company, which was seamless: it only took 20 days to get an EIN at a low cost.
- We built an end-to-end product in just 3 weeks. After weeks of research on data acquisition, we started development on the first of November while seeking credits from AWS. Luckily, we got the $10K on 7 November.
- I just found out that I like building products and teams. It's like connecting the dots: I've known everyone on my team for a while, and we united around this project. This is fun!
- Our early adopters love our concept. We interviewed them from several marketplace Discords and found that they love our MVP. After Metabuild, we will keep building Profond to solve their problems.
What we learned
- Data post-processing is time-consuming.
- Customer segmentation matters; unclear segmentation was our biggest complaint about existing tools.
- Building is the easy part.
What's next for Profond Blockchain Analytic
- After the archival unpack finishes, we will migrate our data from Pagoda to AWS. We will reconsider if Pagoda offers a plan that works for us.
- Go-to-market: we will keep building the product and release it to the market within this year.