Inspiration
We have been working on an RWA tokenization system for the last two years, and during beta testing certain issues consistently required a human valuer to step in. Such issues can only be well addressed by a standardized tokenization and minting process guided by AI for consistency and accuracy. These issues boil down to the following:
Property valuation today is slow, manual, and inconsistent. Valuers spend days gathering comparable data, analyzing documents, and drafting reports. Financial institutions and regulators demand traceability and audit trails, but most valuation processes are fragmented across spreadsheets, emails, and PDFs. Tokenization of assets — while promising — lacks trusted valuation data to back each token.
What it does
Our AI-powered platform transforms how real-world assets are valued and tokenized. Using Elastic’s hybrid search, it instantly searches across property records, comparables, images, and documents—combining semantic intelligence with exact keyword accuracy.
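To make that concrete, here is a minimal sketch of the kind of hybrid query this involves, assuming Elasticsearch 8.x with reciprocal rank fusion (RRF) available; the index and field names (`properties`, `description`, `description_embedding`) are placeholders, not our production schema:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="...")  # placeholder endpoint

# Stand-in vector; in practice this comes from the Gemini embedding model.
query_embedding = [0.0] * 768

# Hybrid search: lexical match on the description field plus kNN over a
# dense_vector field, with the two result lists fused by RRF.
response = es.search(
    index="properties",
    query={"match": {"description": "3-bed bungalow near Karen, Nairobi"}},
    knn={
        "field": "description_embedding",
        "query_vector": query_embedding,
        "k": 10,
        "num_candidates": 100,
    },
    rank={"rrf": {}},
    size=10,
)

for hit in response["hits"]["hits"]:
    print(hit["_id"], hit["_source"].get("address"))
```

The lexical clause keeps exact matches (parcel numbers, street names) reliable, while the kNN clause surfaces semantically similar comparables that share no keywords with the query.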
Powered by Google’s Vertex AI and Gemini, the system understands natural language, reasons through valuation scenarios, and provides conversational insights like a professional valuer. It automates the heavy lifting—data discovery, comparable analysis, first-pass valuations, audit trails, and draft reports—while keeping human valuers in control for approvals and compliance.
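A hedged sketch of how the Gemini reasoning step can be wired up through the Vertex AI SDK; the project ID, model name, comparables, and prompt are illustrative:

```python
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholder project
model = GenerativeModel("gemini-1.5-pro")

# Comparables would come from the Elastic hybrid search above.
comparables = [
    {"address": "...", "sale_price": 185_000, "size_sqm": 140},
    {"address": "...", "sale_price": 192_500, "size_sqm": 150},
]

prompt = (
    "You are assisting a professional property valuer. Using only the "
    f"comparables below, draft a first-pass valuation with reasoning:\n{comparables}"
)

response = model.generate_content(prompt)
print(response.text)  # draft valuation, routed to a human valuer for approval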
Once a property is validated, our platform can seamlessly mint real-world asset tokens directly to the owner’s account.
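For illustration, a minimal minting sketch using web3.py, assuming a deployed ERC-721-style contract exposing a `mint(address,string)` function; the RPC endpoint, contract address, and ABI are placeholders, not our actual deployment:

```python
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # placeholder RPC endpoint

# Minimal ABI for an assumed mint(address,string) entry point; the real
# contract's interface will differ.
MINT_ABI = [{
    "name": "mint",
    "type": "function",
    "stateMutability": "nonpayable",
    "inputs": [{"name": "to", "type": "address"},
               {"name": "tokenURI", "type": "string"}],
    "outputs": [],
}]

contract = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",  # placeholder address
    abi=MINT_ABI,
)

def mint_property_token(owner: str, metadata_uri: str, operator_key: str):
    """Mint an RWA token straight to the validated owner's account."""
    operator = w3.eth.account.from_key(operator_key)
    tx = contract.functions.mint(owner, metadata_uri).build_transaction({
        "from": operator.address,
        "nonce": w3.eth.get_transaction_count(operator.address),
    })
    signed = operator.sign_transaction(tx)
    # .rawTransaction on web3.py < 7
    tx_hash = w3.eth.send_raw_transaction(signed.raw_transaction)
    return w3.eth.wait_for_transaction_receipt(tx_hash)
```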
In short, it’s a smarter, faster, and audit-ready valuation and tokenization platform—bringing together Elastic search, AI reasoning, and blockchain to redefine property intelligence.
How we built it
We built the platform by combining Elastic’s hybrid search capabilities, Google Cloud’s Vertex AI and Gemini models, and a blockchain tokenization layer to automate property valuation and real-world asset tokenization—while keeping humans fully in control of final approvals.
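In outline, the pipeline looks like the sketch below, with a human approval gate between the AI draft and the chain; every function here is a stub standing in for a real service call, and the names are ours for illustration only:

```python
from dataclasses import dataclass

@dataclass
class Draft:
    property_id: str
    valuation: float
    rationale: str

def hybrid_search(property_id):             # Elastic hybrid search layer
    return {"comparables": ["..."], "documents": ["..."]}

def generate_draft_valuation(evidence):     # Vertex AI / Gemini reasoning layer
    return Draft("prop-001", 190_000.0, "based on 2 comparables")

def valuer_approves(draft: Draft) -> bool:  # human-in-the-loop gate
    return input(f"Approve {draft.valuation}? [y/N] ").lower() == "y"

def mint_property_token(draft: Draft):      # blockchain tokenization layer
    print(f"minting token for {draft.property_id}")

def run_pipeline(property_id: str):
    evidence = hybrid_search(property_id)
    draft = generate_draft_valuation(evidence)
    if valuer_approves(draft):              # nothing is minted without approval
        mint_property_token(draft)

run_pipeline("prop-001")
```

The key design choice is that the minting step is unreachable without an explicit human approval, which is also what makes the audit trail meaningful.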
Challenges we ran into
**Standardization:** During testing, property records came from multiple sources (GIS systems, valuers, land registries, PDFs, Excel files) with inconsistent formats and missing data, which sent us back to the drawing board.
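To give a flavour of the problem, here is a minimal sketch of the kind of normalization layer this forced on us; the canonical schema and per-source field mappings are illustrative, not our actual ones:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PropertyRecord:
    parcel_id: str
    address: Optional[str]
    size_sqm: Optional[float]
    last_sale_price: Optional[float]

# Each source spells the same fields differently (assumed example mappings).
FIELD_MAPS = {
    "land_registry": {"ParcelNo": "parcel_id", "Addr": "address"},
    "gis_export":    {"parcel": "parcel_id", "area_m2": "size_sqm"},
    "valuer_excel":  {"Parcel ID": "parcel_id", "Sale Price": "last_sale_price"},
}

def normalize(raw: dict, source: str) -> PropertyRecord:
    """Map a raw record into the canonical schema, tolerating missing fields."""
    mapping = FIELD_MAPS[source]
    fields = {canon: raw.get(src) for src, canon in mapping.items()}
    return PropertyRecord(
        parcel_id=str(fields.get("parcel_id", "")),
        address=fields.get("address"),
        size_sqm=float(fields["size_sqm"]) if fields.get("size_sqm") else None,
        last_sale_price=(
            float(fields["last_sale_price"]) if fields.get("last_sale_price") else None
        ),
    )
```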
**Magnitude of work:** This project is bigger than we had anticipated, and the effort has been immense, even with input from an engineer with over a decade of experience. The TokenSight Elastic application is a large system, and we have had multi-layered integration challenges: working with Web3, automating the minting process, and integrating Google Vertex AI are some of the challenges we are still facing. We spent so much time testing and debugging that we decided to build and showcase a prototype before the time allocated for this project ran out.
**Hybrid search is not plug and play:** Combining keyword-based Elasticsearch queries with dense vector embeddings from Gemini isn't straightforward.
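Much of the friction is simply getting Gemini-family embeddings into an Elasticsearch `dense_vector` field with matching dimensions; a sketch under assumed names (the `text-embedding-004` model, a 768-dimension `description_embedding` field):

```python
import vertexai
from vertexai.language_models import TextEmbeddingModel
from elasticsearch import Elasticsearch

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholder
embedder = TextEmbeddingModel.from_pretrained("text-embedding-004")  # 768 dims
es = Elasticsearch("https://localhost:9200", api_key="...")  # placeholder

# The mapping dimension must match the embedding model exactly, or kNN fails.
es.indices.create(  # run once per index
    index="properties",
    mappings={
        "properties": {
            "description": {"type": "text"},
            "description_embedding": {
                "type": "dense_vector", "dims": 768, "similarity": "cosine",
            },
        }
    },
)

doc = {"description": "3-bed bungalow on a 0.25 acre plot"}
doc["description_embedding"] = embedder.get_embeddings([doc["description"]])[0].values
es.index(index="properties", document=doc)
```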
**Valuation standards in AI reasoning:** The AI sometimes ignored certain standards; as stated earlier, we believe we have a long way to go to polish this area.
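One mitigation we are exploring is a deterministic post-check on the AI draft before it reaches a valuer; a hedged sketch, where the required sections are purely illustrative rather than quoted from any particular standard:

```python
# Sections a draft valuation must cover before it is surfaced to a valuer.
# Illustrative only; a real list would follow IVS/RICS-style requirements.
REQUIRED_SECTIONS = [
    "basis of value",
    "valuation date",
    "comparable evidence",
    "assumptions",
]

def check_draft(draft_text: str) -> list[str]:
    """Return the required sections the AI draft failed to mention."""
    lowered = draft_text.lower()
    return [s for s in REQUIRED_SECTIONS if s not in lowered]

missing = check_draft("Valuation date: 2025-01-10. Comparable evidence: ...")
if missing:
    print("Draft rejected; missing:", missing)  # regenerate or escalate
```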
**Tokenization and minting automation is not a walk in the park:** Yet we managed to work out a prototype for clarity.
**Time:** Within the given period, the project we chose would require more time and several senior engineers.
**Onboarding engineers who understand tokenization:** We reached out to several people, but almost none of them had clarity on the tokenization and minting of RWAs.
Accomplishments that we're proud of
The whole setup: the clarity of our architecture is solid, and we managed to build a simple working prototype so we can clearly showcase our idea.
What we learned
This project taught us that AI can accelerate valuation and tokenization, but real success comes from:
- Clean data
- Human oversight
- Legal and technical interoperability
- Transparent and explainable AI
We also took note that:
- There is a lot of fine-tuning needed in our system
- We need more good data for modelling the AI
- We need more engineers
What's next for TokenSight Elastic
Growth: onboarding more senior engineers and VCs, because of the market opportunity.

Market opportunity (per McKinsey):
- Global real estate tokenization market projected to surpass $16 trillion by 2030.
- Valuation and property data services currently worth over $3 billion annually.
- Major demand for AI-driven compliance and audit tools in fintech and banking.
Our targeted users:
- Professional Valuers – Automate data collection and generate draft valuations faster.
- Financial Institutions – Use audit-ready valuation data for lending and compliance.
- Developers & Property Firms – Tokenize verified properties for fractional investment.
- Governments & Regulators – Get transparent, traceable asset intelligence.
