Inspiration
Our inspiration came from a shared, universal frustration: the crisis of confidence in online shopping. We've all been paralyzed by choice, scrolling through hundreds of biased reviews, sponsored videos, and questionable star ratings. It's impossible to know what to trust. We wanted to build a tool that would fight this information overload—a single source of truth that could instantly give a smart, unbiased verdict on any product, so a user could finally buy with complete confidence. We didn't just want to build a review scraper; we wanted to build Veracity.
What it does
Veracity is an AI-powered decision engine. A user pastes a URL to any e-commerce product, and Veracity gets to work. In seconds, it delivers a comprehensive analysis dashboard that includes:
A master "Veracity Score" based on overall user sentiment.
A summary of Key Strengths and Common Complaints, extracted directly from the most opinionated and relevant sentences in user reviews.
A "Longevity Report" that specifically analyzes reviews for comments on the product's long-term durability and potential defects.
A "Video Review Digest" which finds, analyzes, and embeds the top YouTube reviews for the product.
A final "Value Verdict," where an AI analyst gives a qualitative opinion on whether the product is worth its current price, based on the sentiment data.
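The "Veracity Score" boils many reviews down to a single number. As an illustrative sketch only (the interface and weighting below are ours, not the shipped code): the Natural Language API reports a sentiment score in [-1, 1] plus a magnitude per document, and those can be combined into a weighted average mapped onto a 0-100 scale:

```typescript
// Illustrative sketch: combine per-review sentiment into a 0-100 score.
// Assumes each review carries a sentiment score in [-1, 1] and a
// magnitude (emotional strength), as the Google Cloud Natural Language
// API returns them. The weighting scheme here is a hypothetical choice.
interface ReviewSentiment {
  score: number;      // -1 (negative) .. 1 (positive)
  magnitude: number;  // >= 0; used as a weight so strong opinions count more
}

function veracityScore(reviews: ReviewSentiment[]): number {
  if (reviews.length === 0) return 50; // neutral default with no data
  // Floor the weight so near-zero-magnitude reviews still contribute.
  const weightOf = (r: ReviewSentiment) => Math.max(r.magnitude, 0.1);
  const totalWeight = reviews.reduce((sum, r) => sum + weightOf(r), 0);
  const weighted = reviews.reduce((sum, r) => sum + r.score * weightOf(r), 0);
  // Map the weighted mean sentiment from [-1, 1] onto [0, 100].
  return Math.round((weighted / totalWeight + 1) * 50);
}
```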
How we built it
Veracity was forged on a modern, multi-platform serverless stack, chosen for speed and scalability.
Frontend: The user interface was rapidly developed with Bolt.new using React and Tailwind CSS, and deployed globally on Netlify.
Backend: The entire analysis engine is a serverless Supabase Edge Function, which provided a stable, modern Deno/TypeScript environment.
Data Scraping: We used ScrapingBee for robust, initial scraping of the product page, and the YouTube Data API to find relevant video reviews. A dedicated transcript library was used to process video content.
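For the product-page scrape, ScrapingBee exposes a plain HTTP API: you call its v1 endpoint with your API key and the target URL as query parameters, and `render_js` asks it to execute the page's JavaScript before returning the HTML. A hedged sketch of how the request URL might be assembled (the helper name is ours, not ScrapingBee's):

```typescript
// Hypothetical helper: build a ScrapingBee v1 request URL.
// api_key, url, and render_js are real ScrapingBee query parameters;
// the function itself is an illustrative wrapper, not our exact code.
function scrapingBeeUrl(apiKey: string, targetUrl: string, renderJs = true): string {
  const endpoint = new URL("https://app.scrapingbee.com/api/v1/");
  endpoint.searchParams.set("api_key", apiKey);
  endpoint.searchParams.set("url", targetUrl);
  endpoint.searchParams.set("render_js", String(renderJs));
  return endpoint.toString();
}
```

The serverless function can then simply `fetch` this URL and parse the returned HTML for review text.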
AI & NLP: All text analysis—including sentiment scoring, thematic analysis, and qualitative summaries—was powered by the Google Cloud Natural Language API and Gemini.
Database: We used Supabase PostgreSQL for intelligent caching of results to ensure repeat analyses are instantaneous.
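Caching only pays off if the same product always maps to the same row, so product URLs need normalizing before lookup, since tracking parameters vary between visits. A sketch of the kind of normalization involved (the parameter list is illustrative, not our exact code):

```typescript
// Illustrative sketch: normalize a product URL into a stable cache key
// so the same product, arriving with different tracking parameters,
// hits the same cached row in Postgres.
const TRACKING_PARAMS = ["utm_source", "utm_medium", "utm_campaign", "ref", "tag"];

function cacheKey(productUrl: string): string {
  const url = new URL(productUrl);
  for (const param of TRACKING_PARAMS) url.searchParams.delete(param);
  url.hash = ""; // fragments never change the product
  // Strip a trailing slash so /item/123 and /item/123/ collide as intended.
  return url.toString().replace(/\/$/, "");
}
```

The Edge Function checks Postgres for this key first and only runs the full scrape-and-analyze pipeline on a miss, which is what makes repeat analyses feel instantaneous.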
Challenges we ran into
Our journey was a trial by fire. Our biggest challenge was a series of catastrophic, cascading infrastructure failures. Our initial backend, built on a different serverless platform, was plagued by deep-seated dependency conflicts and module system incompatibilities (the dreaded CJS vs. ESM war), leading to constant, unsolvable build failures.
This forced a high-stakes, mid-hackathon pivot, where we migrated and rewrote the entire backend for Supabase Edge Functions. Once there, we faced and conquered a series of complex CORS security issues and API authentication bugs. Finally, the greatest challenge was one of quality—evolving the AI logic from a simple script that returned useless keywords into a sophisticated, multi-step engine that could deliver genuinely intelligent and meaningful insights.
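The CORS issues above come down to one browser mechanic: before the real POST, the browser sends an OPTIONS preflight, and the Edge Function must answer both requests with the right `Access-Control-*` headers or the frontend's fetch fails. A minimal sketch of the idea (header values are illustrative, not our production config):

```typescript
// Illustrative sketch of CORS handling for a serverless function.
// A preflight (OPTIONS) request gets an empty 204 with CORS headers;
// real requests get the same headers plus a JSON content type.
const corsHeaders: Record<string, string> = {
  "Access-Control-Allow-Origin": "*", // or lock to the Netlify site origin
  "Access-Control-Allow-Methods": "POST, OPTIONS",
  "Access-Control-Allow-Headers": "authorization, content-type",
};

function responseHeaders(method: string): { status: number; headers: Record<string, string> } {
  if (method === "OPTIONS") {
    return { status: 204, headers: corsHeaders };
  }
  return { status: 200, headers: { ...corsHeaders, "Content-Type": "application/json" } };
}
```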
Accomplishments that we're proud of
We are most proud of our resilience. Faced with total backend failure with only a few days left, we successfully executed a high-pressure pivot to a completely new platform and architecture without compromising our vision.
We are also incredibly proud of the final product's depth. We didn't just stop at text analysis. We successfully integrated a multi-modal system that also analyzes video content, providing a level of insight far beyond a standard review aggregator. Building a stable, multi-cloud application that uses five different APIs to deliver a single, coherent result in seconds is an accomplishment we will carry with us long after this hackathon.
What we learned
This project was an intense crash course in modern cloud development. Our key takeaways were:
Platform Stability is Non-Negotiable: A brilliant idea is worthless on a brittle foundation. Choosing a stable, predictable development environment like Supabase's was the decision that saved our project.
Use Industrial-Grade Tools: Don't reinvent the wheel, especially under pressure. Relying on a professional service like ScrapingBee was infinitely better than trying to maintain our own fragile scraper.
The Final 10% is What Matters: The difference between a simple script and a "magical" product is in the final, grueling hours of refining the quality of the output. Pushing for true thematic analysis instead of settling for keyword extraction made all the difference.
What's next for Veracity - End of Buyer's remorse
The version we built for this hackathon is just the beginning. Our vision for Veracity is to become the indispensable companion for every online shopper. The roadmap includes:
Expanding Data Sources: Integrating discussions from Reddit, Twitter, and dedicated product forums to create an even more comprehensive "Veracity Score."
Price History & Deal Alerts: Tracking a product's price over time to tell users not only what to buy, but when to buy it.
User Accounts & Personalization: Allowing users to save their analysis history, track products, and get alerts.
Browser Extension: A seamless extension that provides a Veracity analysis directly on any e-commerce page, making confident shopping just a single click away.
Brick-and-mortar Adaptation: We envision Veracity as the perfect companion to in-store shopping as well. Just scan the barcode of the product you are about to buy, and Veracity will scour the internet to give you a detailed public sentiment analysis before you reach the register.
