Inspiration
Even in this day and age, despite decades of effort toward gender equity, pay gaps persist between people in the same role, at the same company, with the same skills and years of experience. We aim to close those gaps with FairPay: a platform where anonymous salary contributions feed summary statistics for the greater good of everyone.
What it does
By guaranteeing anonymity, we let users across roles and companies share their pay, building a dataset that gives students, employees, and people considering a career switch a realistic picture of compensation.
How we built it
We built FairPay using a React frontend with a JSON-based data layer for storing and serving salary submissions during the hackathon. Authentication is handled via Google OAuth for simplicity and identity verification. LinkedIn verification is integrated to validate that salary submissions come from real professionals. The Stories feed uses a community upvote/downvote system, and salary statistics (mean, median, middle 50%) are computed dynamically from the submitted dataset.
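As an illustration of the statistics step, here is a minimal sketch of computing the mean, median, and middle 50% from submitted salaries. The `Submission` shape and function names are hypothetical, not FairPay's actual schema:

```typescript
// Hypothetical shape of one salary submission in the JSON data layer.
interface Submission {
  role: string;
  company: string;
  salary: number;
}

// Mean, median, and middle-50% range (25th–75th percentile) of submissions.
// Assumes at least one submission.
function summarize(subs: Submission[]): { mean: number; median: number; p25: number; p75: number } {
  const sorted = subs.map(s => s.salary).sort((a, b) => a - b);
  const mean = sorted.reduce((sum, x) => sum + x, 0) / sorted.length;
  // Quantile via linear interpolation between the closest ranks.
  const quantile = (q: number): number => {
    const pos = (sorted.length - 1) * q;
    const lo = Math.floor(pos);
    const hi = Math.ceil(pos);
    return sorted[lo] + (sorted[hi] - sorted[lo]) * (pos - lo);
  };
  return { mean, median: quantile(0.5), p25: quantile(0.25), p75: quantile(0.75) };
}
```

Recomputing these dynamically on each request keeps the displayed statistics in sync with the latest submissions without a separate aggregation job.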
Challenges we ran into
One of our biggest challenges was data integrity — how do you build a trustworthy salary dataset from scratch with no users yet? We solved this with LinkedIn verification as a gatekeeping layer. We also had to think carefully about the game-theory problem: companies or bad actors could intentionally submit low salaries to drag down the average and suppress job seekers' negotiating power. We don't have a perfect solution yet, but we're aware of it and designing around it. On the technical side, structuring the JSON data model to support filtering by gender, role, and background cleanly took iteration.
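One way to structure the JSON model for clean filtering is a flat record per submission plus a generic matcher; the field names below are illustrative assumptions, not the actual FairPay schema:

```typescript
// Hypothetical flat record shape; one object per anonymous submission.
interface SalaryRecord {
  role: string;
  gender: string;
  background: string; // e.g. degree or prior industry
  salary: number;
}

// Keep records that match every provided criterion; omitted fields match all.
function filterRecords(
  records: SalaryRecord[],
  criteria: Partial<Omit<SalaryRecord, "salary">>
): SalaryRecord[] {
  return records.filter(r =>
    Object.entries(criteria).every(([key, value]) => r[key as keyof SalaryRecord] === value)
  );
}
```

A flat shape like this keeps every dimension (gender, role, background) filterable with one function instead of nesting the JSON by any single dimension.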
Accomplishments that we're proud of
We're proud of building something with a real social mission — not just a fun hack, but a tool that addresses a genuine, documented injustice. We're proud of the data model: using the Central Limit Theorem as the backbone of our statistics approach means the platform gets more accurate as more people use it, not less. And we're proud of keeping the UX dead simple — one Google login, no friction, because barriers to contribution hurt the data.
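The statistical intuition behind "more users, more accuracy" can be stated precisely. For $n$ independent salary submissions $X_1, \dots, X_n$ with mean $\mu$ and variance $\sigma^2$, the Central Limit Theorem gives:

```latex
\sqrt{n}\,\left(\bar{X}_n - \mu\right) \xrightarrow{d} \mathcal{N}\!\left(0, \sigma^2\right),
\qquad
\mathrm{SE}\!\left(\bar{X}_n\right) = \frac{\sigma}{\sqrt{n}}
```

so the uncertainty of the reported mean shrinks proportionally to $1/\sqrt{n}$ as the dataset grows.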
What we learned
We learned that designing for trust is harder than designing for features. Every product decision — why Google login, why LinkedIn verification, why anonymous submissions — had to be justified in terms of "does this make the data more trustworthy?" We also learned how powerful summary statistics are for surfacing inequality that anecdotes alone can't prove.
What's next for FairPay
Beyond migrating to a proper HTTPS backend and database, we want to expand LinkedIn verification to cross-reference job titles and companies for richer data validation. We plan to add statistical outlier detection — flagging salaries that are statistically anomalous for a given role, which could surface discrimination patterns. Longer term, we want to partner with student organizations and DEI groups on university campuses to seed the dataset, and explore whether our anonymized aggregate data could be shared with policymakers advocating for pay transparency legislation.
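One simple baseline for the planned outlier detection is Tukey's fences (values more than 1.5×IQR beyond the quartiles), applied per role. This is a sketch of that heuristic, not FairPay's implemented method:

```typescript
// Flag salaries outside Tukey's fences (1.5 × IQR beyond the quartiles)
// for a single role's salary list. Assumes a non-empty input.
function flagOutliers(salaries: number[]): number[] {
  const sorted = [...salaries].sort((a, b) => a - b);
  // Quantile via linear interpolation between closest ranks.
  const q = (p: number): number => {
    const pos = (sorted.length - 1) * p;
    const lo = Math.floor(pos), hi = Math.ceil(pos);
    return sorted[lo] + (sorted[hi] - sorted[lo]) * (pos - lo);
  };
  const iqr = q(0.75) - q(0.25);
  const low = q(0.25) - 1.5 * iqr;
  const high = q(0.75) + 1.5 * iqr;
  return salaries.filter(s => s < low || s > high);
}
```

Flagged values could be held for review rather than silently dropped, which also helps against the bad-actor problem of deliberately submitted lowball salaries.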