Inspiration

Having lived in Dubai my entire life, I’ve been surrounded by Muslims and have always had a deep respect for their faith and community. But when I was personally curious about Islam, I realized how hard it was to find reliable information online. Most people don’t have access to mentors or scholars, and online spaces are flooded with misinformation — from misquoted verses to out-of-context rulings.

I also saw this gap affect my friends. I watched Muslim friends travel abroad for university, only to find themselves in a new environment where they felt disconnected. They struggled with consistency in their prayers or had new doubts, but they didn't have their usual community or a trusted scholar to turn to for guidance. They didn't know who to ask or how to get back on track, and online forums felt impersonal and unsafe.

I built this project to solve both problems: to make it easier for anyone exploring Islam to verify what they read, and to give Muslims a private, reliable space to find answers and connect with guidance. Instead of falling into endless debates or feeling isolated, users can get accurate knowledge and real support — ensuring truth and understanding always come first.


What it does

Noor is a two-part ecosystem designed to combat misinformation about Islam and provide genuine, human-guided understanding.

The Noor Browser Extension

A proactive defense against misinformation. The extension monitors the web pages and videos a user engages with, detecting common misconceptions, misquotes, or out-of-context claims about Islam.

When it finds one, a simple, non-intrusive bubble appears on the page, providing:

  • A clear correction
  • Proper context
  • Links to verified sources
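As a rough sketch of how the extension's content script might flag a claim and surface a correction (the pattern list, correction text, and placeholder URL below are illustrative assumptions, not the project's actual detection logic):

```javascript
// Hypothetical, simplified sketch of the extension's detection pass.
// A real deployment would use a vetted, scholar-reviewed pattern database.
const SAMPLE_PATTERNS = [
  {
    match: /jihad\s+means\s+holy\s+war/i,
    correction:
      '"Jihad" literally means "struggle"; "holy war" is a common mistranslation.',
    source: "https://example.org/verified-source", // placeholder URL
  },
];

// Scan a block of page text and return any flagged claims,
// each paired with its correction and a verified source link.
function detectMisconceptions(text, patterns = SAMPLE_PATTERNS) {
  return patterns
    .filter((p) => p.match.test(text))
    .map((p) => ({ correction: p.correction, source: p.source }));
}

// In the extension, each hit would be rendered as a non-intrusive bubble:
// detectMisconceptions(document.body.innerText)
//   .forEach((hit) => showBubble(hit.correction, hit.source));
```

The pure-function shape keeps the scan testable outside the browser; only the final bubble rendering touches the DOM.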

The Noor Chat & Mentor Platform

A private, AI-powered guide built for authentic understanding. The chat interface runs locally on a user’s device using the Ollama framework, fine-tuned on verified Islamic data sources.

  • Privacy-first: Because the model runs locally, all conversations remain private.
  • AI with humility: When a question requires personal or interpretive guidance, the AI does not guess. It instead offers to connect the user with a verified human mentor.
  • Anonymous mentorship: Users are connected securely and anonymously with a verified mentor who shares their background or context.
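One way the "AI with humility" handoff could work is a lightweight router that checks whether a question calls for personal or interpretive guidance before the model answers. The keyword heuristic below is a stand-in assumption; the real system would presumably use a trained classifier:

```javascript
// Hypothetical routing sketch: decide whether a question is answered by the
// local model or deferred to a verified human mentor.
const INTERPRETIVE_CUES = [
  /should\s+i\b/i, // personal decisions
  /my\s+(situation|family|marriage)/i, // individual circumstances
  /is\s+it\s+permissible\s+for\s+me/i, // personal rulings
];

// Returns a routing decision rather than guessing at interpretive questions.
function routeQuestion(question) {
  const needsMentor = INTERPRETIVE_CUES.some((cue) => cue.test(question));
  return needsMentor
    ? { route: "mentor", offer: "This looks like it needs personal guidance. Connect with a verified mentor?" }
    : { route: "local-model" };
}
```

Factual questions fall through to the local model; anything personal triggers the mentor offer instead of a generated answer.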

How we built it

Our stack was designed with privacy, accuracy, and scalability in mind.

  • Core AI: Built with the Ollama framework for local LLM inference.
  • Model fine-tuning: Trained on a curated dataset of authentic Islamic sources — including the Qur’an, Sahih Hadith collections, classical tafsir, and verified scholarly explanations.
  • Frontend: Built with React and Tailwind CSS for a lightweight, responsive interface.
  • Extension: Developed with JavaScript, HTML, and CSS, leveraging browser APIs for DOM content analysis and real-time misconception detection.
  • Backend: Firebase handles anonymous mentor-matching and secure session management.
  • Local integration: The web client communicates directly with the local Ollama instance through an API layer, ensuring data never leaves the user’s machine.
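The local-only flow in the last bullet can be sketched as a plain request against Ollama's default local HTTP endpoint (port 11434). The model tag here is a placeholder, since the project's actual fine-tuned model name isn't stated:

```javascript
// Sketch of the client-to-local-Ollama path. Nothing here leaves the machine:
// the request targets Ollama's default local endpoint.
const OLLAMA_URL = "http://localhost:11434/api/generate";

// Build a request for Ollama's /api/generate endpoint.
// "noor-model" is a placeholder tag, not the project's real model name.
function buildOllamaRequest(prompt, model = "noor-model") {
  return {
    url: OLLAMA_URL,
    body: { model, prompt, stream: false },
  };
}

// Usage in the web client (requires a running Ollama instance):
// const { url, body } = buildOllamaRequest("What are the five pillars of Islam?");
// const res = await fetch(url, { method: "POST", body: JSON.stringify(body) });
// const { response } = await res.json();
```

Keeping the request builder separate from the `fetch` call makes the privacy-critical part (where the data goes) easy to audit and test.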

Challenges we ran into

  • Data integrity: Vetting and structuring a dataset of accurate Islamic information was a major undertaking that required extensive review and validation.
  • Misconception detection: Building logic that can identify nuanced misinformation across varied media without false positives was complex.
  • Local optimization: Running Ollama efficiently across devices while keeping the extension lightweight and responsive required careful optimization.

Accomplishments that we're proud of

  • Building a proactive system that corrects misinformation in real time, instead of waiting for user queries.
  • Creating a privacy-first architecture that runs entirely locally, proving that powerful AI can coexist with full user data protection.
  • Designing a human-AI-human loop, where the AI knows its limits and seamlessly defers to real mentors for nuanced guidance.

What we learned

  • The scale of misinformation — and the hunger for authentic knowledge — is far greater than expected.
  • Fine-tuning an LLM for faith-based content requires deep curation and respect for context; accuracy depends more on the quality of data than its volume.
  • The most valuable insight: technology is not the full solution. The real innovation lies in bridging AI assistance with real human empathy and mentorship.

What's next for Noor

  • Expanding the knowledge base: Partnering with more scholars and institutions to continually refine and expand the dataset.
  • Growing the mentor network: Building infrastructure for mentor onboarding, verification, and community development.
  • Multi-language support: Extending Noor’s reach with Arabic, Urdu, and Indonesian language models.
  • Deeper integrations: Expanding the extension to work seamlessly with social media platforms, online forums, and video sites.
