Inspiration

Health misinformation spreads faster than verified medical advice, especially through social media and messaging platforms. False cures and misleading claims often cause panic, delayed treatment, and loss of trust in healthcare systems. We were inspired to build Health Fact Guardian to create a system where medical truth is not only accessible, but verifiable, explainable, and protected against manipulation using AI and blockchain.

What it does

Health Fact Guardian is an AI-powered, blockchain-backed platform that verifies health-related claims in real time. Users can paste any health rumor and receive a clear verdict (True, False, Misleading, or Unverified) along with a severity level, an explanation, and trusted references.

If a claim already exists in the on-chain registry, the verdict is returned instantly. If no verified record exists, a self-hosted LLM served via Ollama performs medical reasoning over trusted evidence patterns to assess plausibility and risk, so users still receive guidance rather than an inconclusive answer.
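The two-tier lookup described above can be sketched as follows. This is a minimal illustration, not the production code: the names `registry`, `verify_claim`, and `llm_assess` are hypothetical stand-ins for the on-chain HealthFactRegistry lookup and the Ollama reasoning step.

```python
from typing import Callable

# Hypothetical stand-in for the on-chain registry: claim-hash -> verified record.
registry: dict[str, dict] = {}

def verify_claim(claim_hash: str, llm_assess: Callable[[str], dict]) -> dict:
    """Return the verified on-chain record if one exists; otherwise fall back
    to the LLM for a conservative plausibility assessment."""
    record = registry.get(claim_hash)
    if record is not None:
        # Instant path: the claim was already verified and published on-chain.
        return {**record, "source": "on-chain"}
    # Fallback path: no verified record, so ask the model for guidance.
    return {**llm_assess(claim_hash), "source": "llm"}
```

The key design point is that the on-chain answer always takes precedence: the model is only consulted when no authoritative record exists.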

Authorized healthcare organizations can publish verified facts that are permanently recorded on the blockchain.

How we built it

We built Health Fact Guardian using a React frontend and a FastAPI backend. Incoming health claims are canonicalized and hashed using SHA-256, then checked against a HealthFactRegistry smart contract deployed on the Somnia Testnet.
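The canonicalize-and-hash step might look like the sketch below. The exact normalization rules (lowercasing, stripping punctuation, collapsing whitespace) are an assumption for illustration; the point is that trivially different phrasings of the same claim map to the same SHA-256 registry key.

```python
import hashlib
import re

def canonicalize(claim: str) -> str:
    """Normalize a claim so superficial variations map to the same key."""
    text = claim.lower()
    text = re.sub(r"[^\w\s]", "", text)       # drop punctuation
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    return text

def claim_hash(claim: str) -> str:
    """SHA-256 digest of the canonical form, used as the on-chain lookup key."""
    return hashlib.sha256(canonicalize(claim).encode("utf-8")).hexdigest()
```

Note that exact-hash lookup only catches near-identical wording; genuinely paraphrased claims need semantic matching, which is discussed under challenges below.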

For claims not found on-chain, we use a self-hosted LLM served via Ollama to:

Analyze medical validity

Classify misinformation severity

Generate explainable reasoning

Suggest authoritative sources
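A request to Ollama's `/api/generate` endpoint covering the four tasks above could be assembled like this. The prompt wording, the `llama3` model name, and the response field names (mirroring the Truth Verdict Card) are illustrative assumptions; `format: "json"` and `stream: False` are real Ollama request options.

```python
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_verdict_request(claim: str, model: str = "llama3") -> dict:
    """Build a request body asking the model for a strict-JSON verdict."""
    prompt = (
        "You are a cautious medical fact-checker. Assess the claim below.\n"
        'Respond ONLY with JSON: {"verdict": "True|False|Misleading|Unverified", '
        '"severity": "low|medium|high", "explanation": "...", "sources": ["..."]}\n'
        "If evidence is insufficient, answer Unverified rather than guessing.\n\n"
        f"Claim: {claim}"
    )
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,                   # one JSON reply, not a token stream
        "format": "json",                  # constrain output to valid JSON
        "options": {"temperature": 0.1},   # keep reasoning conservative
    }
```

Low temperature plus an explicit "answer Unverified rather than guessing" instruction is one way to bias the model toward the conservative, non-harmful outputs described under challenges below.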

Healthcare organizations authenticate via secure login and MetaMask wallet integration to publish verified medical facts directly on-chain.

Challenges we ran into

Designing a medical reasoning pipeline that is accurate, explainable, and safe

Ensuring the LLM's outputs remain conservative and non-harmful

Handling paraphrased and semantically similar health claims

Keeping blockchain interactions fast and user-friendly

Mapping real-world healthcare authority trust into decentralized systems

Accomplishments that we're proud of

Successfully combined AI medical reasoning with blockchain immutability

Built an offline-capable verification system using a self-hosted LLM served via Ollama

Created a tamper-proof public registry of verified health facts

Designed a clear and user-friendly Truth Verdict Card

Eliminated dependency on external proprietary APIs

What we learned

We learned that AI alone cannot be trusted without accountability, and blockchain alone cannot reason. Combining a self-hosted LLM for intelligence with blockchain for trust creates a powerful system for public health. We also learned the importance of explainability when dealing with medical information.

What's next for Health Fact Guardian

We plan to expand multilingual support, improve medical reasoning models, integrate WhatsApp and social media ingestion, onboard global health authorities, and add deepfake medical content detection. Our long-term goal is to establish Health Fact Guardian as a decentralized global standard for health truth verification.

Built With

React, FastAPI, Ollama, MetaMask, SHA-256, HealthFactRegistry smart contract on the Somnia Testnet
