Inspiration

While working on my earlier project Mood2Mail, I realized how often communication can carry hidden bias — not just in tone, but in the subtle language choices we make every day. I wanted to build a tool that does more than just analyze sentiment — something that actively flags microaggressions, gender-coded language, and tone, while suggesting inclusive rewrites. Inspired by SDG 5 (Gender Equality), BiasX-Ray aims to be your real-time lens into inclusive language.

What it does

BiasX-Ray is a real-time bias detector for team chats. As users type messages, it:

  • Detects tone using an ML model (e.g., friendly, anxious, aggressive).
  • Flags gender-coded words and microaggressions.
  • Suggests inclusive rewrites.
  • Promotes non-binary inclusive communication without being preachy.
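The gender-coded-word flagging and rewrite suggestions can be sketched with simple lexicon matching. The word list and rewrites below are hypothetical stand-ins for illustration; the project's actual lexicons and logic differ.

```python
import re

# Hypothetical gender-coded terms mapped to inclusive rewrites (illustrative only).
GENDER_CODED = {
    "bossy": "decisive",
    "manpower": "workforce",
    "chairman": "chairperson",
    "you guys": "everyone",
}

def flag_bias(message: str) -> list[dict]:
    """Return flags with character spans and suggested inclusive rewrites."""
    flags = []
    for term, rewrite in GENDER_CODED.items():
        # Whole-word, case-insensitive match so "chairman" doesn't hit "chairmanship" oddly.
        for match in re.finditer(rf"\b{re.escape(term)}\b", message, re.IGNORECASE):
            flags.append({
                "term": match.group(0),
                "span": match.span(),
                "suggestion": rewrite,
            })
    return flags

print(flag_bias("We need more manpower; she's being bossy."))
```

The spans let the frontend highlight the flagged phrase in place while offering the rewrite alongside it.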

How I built it

  1. Frontend: HTML, CSS, and JavaScript — interactive interface, live highlighting, and visual feedback.
  2. Backend: Flask-based API using ML model (TF-IDF + Naive Bayes) for tone detection.
  3. Bias Detection: Pure Python logic for gender-coded terms, pronouns, job titles, and soft microaggressions.
  4. Artwork: Custom logo and illustrations reflecting all genders holding hands.
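The tone-detection piece (step 2) can be sketched as a standard scikit-learn pipeline. The toy training messages below are invented for illustration; the real model is trained on a proper labeled corpus.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy labeled data (hypothetical) covering three of the supported tones.
messages = [
    "Great work everyone, love it!",
    "Thanks so much, happy to help!",
    "I'm not sure this will be ready in time...",
    "I'm worried we might miss the deadline.",
    "This is completely unacceptable, fix it now.",
    "Why is this still broken?!",
]
tones = ["friendly", "friendly", "anxious", "anxious", "aggressive", "aggressive"]

# TF-IDF turns each message into a weighted bag-of-words vector;
# Multinomial Naive Bayes then classifies that vector into a tone label.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(messages, tones)

print(model.predict(["So happy with the progress, thank you!"]))
```

Wrapping both steps in one pipeline means the Flask API only has to call `model.predict` on the raw message text.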

Challenges I ran into

  1. Finding the line between biased and neutral phrasing — context matters.
  2. Avoiding false positives without under-detecting.
  3. Designing logic that works for non-binary and gender-diverse users.
  4. Merging ML tone detection with logic-based bias spotting smoothly.
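One way to merge the ML tone detector with the rule-based bias spotting (challenge 4) is a single Flask endpoint that runs both analyzers and returns one response. The analyzers below are trivial stand-ins so the sketch is self-contained; in the project they would be the TF-IDF + Naive Bayes model and the gender-coded-term logic, and the `/analyze` route name is an assumption.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in analyzers for illustration only.
def detect_tone(text: str) -> str:
    return "aggressive" if "!" in text else "neutral"

def flag_bias(text: str) -> list:
    if "manpower" in text.lower():
        return [{"term": "manpower", "suggestion": "workforce"}]
    return []

@app.route("/analyze", methods=["POST"])
def analyze():
    text = request.get_json().get("text", "")
    # Both signals come back in one payload, so the frontend can update
    # the tone badge and the inline highlights in a single round trip.
    return jsonify({"tone": detect_tone(text), "flags": flag_bias(text)})

if __name__ == "__main__":
    app.run(debug=True)
```

Keeping the two analyzers as separate functions behind one endpoint is what lets each evolve independently without changing the frontend contract.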

Accomplishments that I am proud of

  • Built a fully working prototype with real-time bias feedback.
  • Integrated tone detection from a previous project (Mood2Mail) — reused, improved, and expanded!
  • Created a visually appealing and inclusive design.
  • Developed meaningful logic that actually teaches better phrasing.

What I learned

  • Language inclusivity is more than removing “he/she” — it's nuanced.
  • Real-time analysis can stay smooth without sacrificing speed or UX.
  • Ethical tech requires both sensitivity and structure — it’s not just a logic problem.

What's next for BiasX-Ray: Real-Time Bias Detector

  1. Add more tones and emotional cues.
  2. Build a Chrome extension for Slack/Discord/Teams.
  3. Add a feedback button to let users mark incorrect flags.
  4. Expand detection beyond gender: race-coded terms, ableist language, etc.
  5. Add team analytics dashboards to reflect communication trends.