🧠 Veriscope — Truth in Real Time
(a play on “periscope”) - Transcribes, Flags, Verifies
Hear it. Check it. Know it. A real-time Chrome extension that listens to any video and instantly shows supporting evidence for what’s being said — powered by AWS Bedrock, Amazon Transcribe, and Exa AI.
💡 Inspiration
In a world flooded with information — and misinformation — truth often lags behind speech. During live debates, news coverage, or roundtables, claims fly by faster than anyone can verify them.
We asked ourselves:
“What if truth could keep up with speech?”
That question became Veriscope — a tool that helps people see the evidence behind statements in real time, without bias or delay.
🔍 What it does
Veriscope is a Chrome extension that performs live fact aggregation for any online video.
While you watch, Veriscope:
- Listens to the audio in real time
- Transcribes speech using Amazon Transcribe
- Extracts factual claims using Claude 3 Haiku (AWS Bedrock)
- Finds supporting sources using Exa Search API
- Summarizes the evidence clearly and neutrally
On screen, a sleek sidebar displays each claim, supporting links, and a short AI-generated summary — updating dynamically as people speak.
🧠 Veriscope doesn’t decide what’s true or false. It simply shows you the evidence so you can decide.
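Each sidebar entry boils down to a simple record: the claim, its supporting links, and a short summary. A minimal sketch of that payload shape in Python — the field names and message format here are our illustration, not the project's actual schema:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class ClaimCard:
    claim: str                                    # factual statement heard in the audio
    sources: list = field(default_factory=list)   # supporting URLs from Exa
    summary: str = ""                             # neutral AI-written evidence summary

def to_sidebar_message(card: ClaimCard) -> str:
    # Serialize one card as the JSON message pushed to the sidebar over the WebSocket.
    return json.dumps({"type": "claim", **asdict(card)})
```

Keeping the message a flat JSON object makes it trivial for the extension to append new cards as they stream in.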
⚙️ How we built it
Frontend:
- Chrome extension that captures tab audio and streams it live via WebSocket
- Real-time sidebar that displays claims, summaries, and sources
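Before streaming, captured Web Audio samples (Float32, in the range −1.0 to 1.0) have to become raw PCM16 bytes. The extension does this in JavaScript; the equivalent conversion, sketched here in Python with only the standard library (the function name is ours):

```python
import struct

def float32_to_pcm16(samples):
    """Clamp float samples to [-1.0, 1.0] and pack them as
    little-endian signed 16-bit PCM, ready for the WebSocket."""
    clipped = [max(-1.0, min(1.0, s)) for s in samples]
    return struct.pack("<%dh" % len(clipped),
                       *(int(s * 32767) for s in clipped))
```

Clipping before scaling avoids integer overflow on loud input, which would otherwise wrap into loud artifacts on the transcription side.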
Backend (FastAPI):
- Amazon Transcribe Streaming → converts audio to live text
- AWS Bedrock (Claude 3 Haiku) → extracts factual claims & writes evidence summaries
- Exa API (exa_py) → retrieves relevant web sources
- WebSocket stream → pushes results back to the frontend continuously
Everything runs asynchronously for near real-time feedback — usually within 2–3 seconds of someone speaking.
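The chain above can be sketched as an asyncio pipeline. The stage functions below are stand-ins for the real Transcribe, Bedrock, and Exa calls (all names and return shapes are our assumptions), but the control flow — transcript chunk → claims → sources → summary → push — mirrors the description:

```python
import asyncio

# Stubbed stages standing in for the Transcribe, Bedrock, and Exa calls.
async def transcribe_chunks():
    for text in ["GDP grew 3% last year.", "Turnout hit a record high."]:
        yield text

async def extract_claims(text):
    return [text]                                 # first Bedrock (Claude) pass

async def find_sources(claim):
    return ["https://example.com/evidence"]       # Exa search stand-in

async def summarize(claim, sources):
    return f"{len(sources)} source(s) found."     # second Bedrock pass

async def pipeline(push):
    # Fan each transcript chunk through extraction, retrieval, and
    # summarization, pushing every finished card to the client.
    async for text in transcribe_chunks():
        for claim in await extract_claims(text):
            sources = await find_sources(claim)
            await push({"claim": claim,
                        "sources": sources,
                        "summary": await summarize(claim, sources)})
```

Because every stage awaits I/O rather than blocking, new transcript chunks keep flowing while earlier claims are still being searched and summarized.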
🚧 Challenges we ran into
- 🎙️ Audio streaming: Keeping latency low while handling PCM16 audio over WebSockets
- 🧠 Claim extraction: Prompt-engineering Claude to output structured JSON without hallucinations
- ⚡ Pipeline orchestration: Chaining AWS Transcribe → Bedrock → Exa → Bedrock again efficiently
- 🧩 Summarization balance: Summarizing sources neutrally without assigning “truth” scores
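The claim-extraction challenge in particular comes down to parsing: even a well-prompted model sometimes wraps its JSON in markdown fences or prose. One defensive approach (a sketch, not the project's actual parser) is to locate the JSON array, parse it, and drop anything malformed:

```python
import json
import re

def parse_claims(model_output: str) -> list:
    """Pull a JSON array of claim strings out of an LLM reply,
    tolerating markdown code fences and stray prose around it."""
    match = re.search(r"\[.*\]", model_output, re.DOTALL)
    if not match:
        return []
    try:
        claims = json.loads(match.group(0))
    except json.JSONDecodeError:
        return []
    # Keep only non-empty string claims; discard anything else.
    return [c for c in claims if isinstance(c, str) and c.strip()]
```

Failing closed (returning an empty list) keeps one garbled model reply from stalling the live stream.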
🏆 Accomplishments that we're proud of
- Built a fully working live audio fact-checker in just four hours 🔥
- Integrated three complex APIs (AWS Bedrock, Transcribe, and Exa) seamlessly
- Achieved low-latency real-time response (~2 seconds)
- Designed a clean Chrome UI that feels intuitive and non-intrusive
- Created an approach that promotes transparency over judgment
📚 What we learned
- How to build streaming AI pipelines using AWS services
- The art of prompt chaining and structured output for real-time systems
- How Exa can make large language models more grounded by retrieving live, verifiable content
- The importance of UX design in helping users trust AI outputs
🚀 What’s next for Veriscope
- 🎧 Speaker identification: Attribute claims to specific participants
- 🧮 Credibility scoring: Rank sources by reliability
- 🗂️ Post-event summaries: Generate full recap fact-check reports
- 🌐 Cross-platform support: Extend to podcasts, livestreams, and mobile apps
- 💬 Community feedback: Let users rate and flag claims for transparency
🧩 Tech Stack
AWS Bedrock (Claude 3 Haiku) · Amazon Transcribe · Exa API · FastAPI + WebSockets · Chrome Extension (JavaScript) · Python
🎯 Tagline
“We don’t decide what’s true — we show you the evidence.”
Built With
- amazon-web-services
- bedrock
- claude
- exa
- fastapi


