Inspiration
The idea for unbias came from the observation that women are consistently judged negatively for displaying the same leadership behaviors that earn men praise. Assertive men may be labeled "confident leaders," while women may be called "bossy" or "aggressive." We dug deeper and found that these labels affect women's outcomes not only in the workplace but in banking as well.
You may ask, how does this workplace bias lead to banking discrimination? It is actually one of the roots of the problem. Banks rely on income and credit scores to make lending decisions, but these metrics reflect years of unfair workplace bias. Women are not risky or untrustworthy borrowers. Yet biased performance reviews limit their earning potential and erode their confidence as leaders.
Unbias interrupts this cycle. We address banking discrimination at its source instead of at the bank. Our goal is to lift the weight of systemic banking discrimination against women the moment biased language enters their performance reviews.
What unbias does
Unbias is a bias detection platform that analyzes performance reviews and suggests meaningful feedback. Users with an unbias account can upload PDFs of their performance reviews, and our AI-powered analysis scans for descriptors such as "aggressive," "emotional," or "difficult," terms that disproportionately appear in women's evaluations. The platform highlights these terms directly in the text and provides financial projections of how such language could affect the employee's earning potential.
Key features include PDF upload and parsing, a financial impact calculator, and secure user accounts backed by a MongoDB Atlas database, making the site functional and secure for anyone to use.
How we built it
unbias has three main components: a React frontend for the user experience, a Node.js backend for processing and API management, and a MongoDB database for secure data storage.
1. Frontend (React)
- Built PDF upload interface with drag-and-drop functionality
- Created document viewer that highlights flagged biased terms in real-time
- Designed interactive charts showing financial impact projections over time
- Implemented responsive UI for mobile and desktop
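The highlighting step above can be sketched as a pure function that splits review text into plain and flagged segments, which the React viewer would then render (flagged segments wrapped in a `<mark>` element, for instance). The term list and function names here are illustrative, not the project's actual code.

```javascript
// Illustrative subset of the flagged-term list.
const FLAGGED_TERMS = ["aggressive", "emotional", "difficult", "bossy"];

// Split text into ordered segments so the viewer can highlight
// flagged terms in place while leaving the rest untouched.
function segmentText(text, terms = FLAGGED_TERMS) {
  // One regex matching any flagged term as a whole word, case-insensitive.
  const pattern = new RegExp(`\\b(${terms.join("|")})\\b`, "gi");
  const segments = [];
  let last = 0;
  for (const match of text.matchAll(pattern)) {
    if (match.index > last) {
      segments.push({ text: text.slice(last, match.index), flagged: false });
    }
    segments.push({ text: match[0], flagged: true });
    last = match.index + match[0].length;
  }
  if (last < text.length) {
    segments.push({ text: text.slice(last), flagged: false });
  }
  return segments;
}
```

Keeping segmentation separate from rendering means the same function can serve both the live viewer and any export or reporting feature.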
2. Backend (Node.js)
- Set up RESTful API endpoints for authentication, file upload, and analysis
- Integrated MongoDB for secure user data and document storage
- Built PDF text extraction pipeline using pdf-parse library
- Connected natural language processing bias detection to financial impact calculator
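The last step above, wiring detection output into the financial calculator, can be sketched as a pure handler that takes text already extracted from the PDF and returns both the flagged terms and a penalty estimate. The term list, function name, and the one-percent-per-term rule are assumptions for illustration; the framework and pdf-parse extraction are deliberately kept outside this sketch.

```javascript
// Illustrative subset of the bias-term database.
const BIAS_TERMS = ["aggressive", "emotional", "difficult"];

// Analyze extracted review text: count flagged terms and derive a
// penalty estimate to feed into the financial impact calculator.
function analyzeReview(extractedText) {
  const found = [];
  for (const term of BIAS_TERMS) {
    const re = new RegExp(`\\b${term}\\b`, "gi");
    const count = (extractedText.match(re) || []).length;
    if (count > 0) found.push({ term, count });
  }
  // Assumed rule: each distinct flagged term adds a 1% earnings
  // penalty, capped at 10%.
  const penaltyPercent = Math.min(found.length * 1.0, 10);
  return { flagged: found, penaltyPercent };
}
```

In the real backend this function would sit behind the analysis endpoint, downstream of pdf-parse and upstream of the projection charts.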
3. Bias Detection (AI/NLP)
- Created database of ~200 gendered terms from workplace bias research
- Built pattern-matching algorithm with context awareness to reduce false positives
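One simple form of the context awareness described above is to skip a term when it modifies a non-person noun, so "aggressive deadline" is not flagged while "she is aggressive" is. The word lists below are illustrative, not the project's actual rules.

```javascript
// Illustrative term and context lists.
const TERMS = ["aggressive", "emotional", "difficult"];
const NON_PERSON_NOUNS = ["deadline", "timeline", "market", "strategy", "goal"];

// Flag a term only when the following word does not indicate it
// describes a thing rather than a person, reducing false positives.
function findBiasedTerms(text) {
  const words = text.toLowerCase().match(/[a-z']+/g) || [];
  const hits = [];
  words.forEach((word, i) => {
    if (!TERMS.includes(word)) return;
    const next = words[i + 1];
    // "aggressive strategy" is about the strategy, not the employee.
    if (next && NON_PERSON_NOUNS.includes(next)) return;
    hits.push(word);
  });
  return hits;
}
```

A one-word lookahead is crude, but it shows the shape of the rule; a fuller version would inspect a wider context window.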
4. Financial Modeling
A user enters their salary assumptions and either (a) analyzes sample text to estimate a penalty percentage or (b) enters one manually; the app then calls the compensation API to simulate the adjusted salary band and displays the result.
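The simulation behind those projections can be sketched as a compound-growth comparison: the same starting salary is grown with and without the penalty applied to each annual raise, and the gap is the lost earnings. The growth model here (flat annual raise, penalty applied to the raise rate) is an assumption for illustration, not the project's exact formula.

```javascript
// Compare cumulative earnings with and without a bias penalty
// applied to the annual raise rate.
function projectEarnings({ salary, annualRaise, years, penaltyPercent }) {
  const penalizedRaise = annualRaise * (1 - penaltyPercent / 100);
  let base = salary;
  let biased = salary;
  let totalBase = 0;
  let totalBiased = 0;
  for (let y = 0; y < years; y++) {
    totalBase += base;
    totalBiased += biased;
    base *= 1 + annualRaise;       // unpenalized trajectory
    biased *= 1 + penalizedRaise;  // trajectory with biased reviews
  }
  return {
    totalBase: Math.round(totalBase),
    totalBiased: Math.round(totalBiased),
    lostEarnings: Math.round(totalBase - totalBiased),
  };
}
```

The per-year values of `base` and `biased` are what the frontend's interactive charts would plot over time.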
5. Security
- Hashed all passwords in database storage
- Used JWT tokens for secure session management
- Stored sensitive credentials in `.env` files
- Validated all user input
6. Deployment
- Vercel for hosting the backend, deployed from a GitHub repository
- MongoDB Atlas for cloud database hosting
- Environment variables configured for production security
Design Insights and What We Learned
Actionable feedback matters. Users don't just want to know bias exists; they want to see where it is and how much it costs them. Visual highlighting and financial projections made the tool actually useful.
Privacy is critical. When handling sensitive workplace documents, security-first design (password hashing, JWT, encrypted storage) builds essential user trust.