Inspiration
Many of the digital services we use on a day-to-day basis come with Terms and Conditions (T&Cs). However, these documents are usually walls of text filled with legal jargon, making it nearly impossible for the average user to understand what they are actually agreeing to.
FairTerms is a Chrome extension that reads the Terms & Conditions or Privacy Policy of the current tab and transforms it into a short, to-the-point summary with a “Potential Risks” section, so users can see what they are really agreeing to.
Features
**One-Click Summarisation**: With a single click, the extension captures the text from the active page. Users never need to copy and paste anything or scroll through endless legalese, which keeps FairTerms simple to use.
**On-Device AI Summarisation**: FairTerms uses WebLLM to run a small large language model directly in the browser. All policy text stays local to the user’s machine, so no sensitive information is sent online.
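Conceptually, the on-device flow looks like the sketch below. This is a minimal sketch assuming WebLLM's chat-completions-style engine API; the structural `ChatEngine` interface and helper names are illustrative, not the extension's actual code.

```typescript
// Structural type matching the slice of a WebLLM-style engine we rely on
// (engine.chat.completions.create). In the real extension the engine would
// come from WebLLM's CreateMLCEngine; here it is an injected dependency so
// the flow can be exercised without a browser or WebGPU.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

export interface ChatEngine {
  chat: {
    completions: {
      create(req: { messages: ChatMessage[] }): Promise<{
        choices: { message: { content: string | null } }[];
      }>;
    };
  };
}

// Build the prompt sent to the local model for one excerpt of policy text.
export function buildSummaryMessages(excerpt: string): ChatMessage[] {
  return [
    {
      role: "system",
      content:
        "You summarise Terms & Conditions into short, plain-English bullet points.",
    },
    {
      role: "user",
      content: `Summarise the key points of this excerpt:\n\n${excerpt}`,
    },
  ];
}

export async function summariseExcerpt(
  engine: ChatEngine,
  excerpt: string,
): Promise<string> {
  // All inference happens inside the browser: the policy text is only ever
  // passed to the locally running model, never to a remote server.
  const reply = await engine.chat.completions.create({
    messages: buildSummaryMessages(excerpt),
  });
  return reply.choices[0].message.content ?? "";
}
```

In the extension itself, `engine` would be the object returned by WebLLM's `CreateMLCEngine(...)`, whose model-loading progress callback also feeds the popup's progress bar.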
**Live Progress Feedback**: While the model is processing, the popup displays a progress bar that updates in real time, so the user always knows how far along the summarisation is.
**Markdown-Formatted Summaries**: FairTerms outputs the summary in Markdown, rendered directly inside the popup, which keeps the result structured and readable.
**Focused Summarisation Pipeline**: Instead of generating one generic summary, FairTerms runs a multi-stage pipeline. First, the text is broken into manageable chunks. The LLM then summarises each chunk and highlights its key points. The chunk summaries are combined and deduplicated into a clear set of 5 to 7 distinct points. Finally, a “risk analysis” pass flags potential red flags. This pipeline ensures clarity, completeness, and a focus on user-relevant risks.
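The chunk-and-merge stages can be sketched as two pure functions. This is a simplified sketch: the chunk size, point limit, and function names are illustrative rather than the production values.

```typescript
// Split long policy text into chunks that fit the model's context window,
// preferring paragraph boundaries so sentences aren't cut mid-way.
// (A single paragraph longer than maxChars is kept whole in this sketch.)
export function chunkText(text: string, maxChars = 4000): string[] {
  const paragraphs = text.split(/\n\s*\n/);
  const chunks: string[] = [];
  let current = "";
  for (const p of paragraphs) {
    if (current && current.length + p.length + 2 > maxChars) {
      chunks.push(current);
      current = p;
    } else {
      current = current ? `${current}\n\n${p}` : p;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}

// Merge per-chunk bullet points, dropping near-duplicates by comparing a
// normalised form, and keep at most `limit` distinct points (5-7 in FairTerms).
export function mergePoints(perChunk: string[][], limit = 7): string[] {
  const seen = new Set<string>();
  const merged: string[] = [];
  for (const point of perChunk.flat()) {
    const key = point.toLowerCase().replace(/[^a-z0-9 ]/g, "").trim();
    if (key && !seen.has(key)) {
      seen.add(key);
      merged.push(point);
    }
  }
  return merged.slice(0, limit);
}
```

The merged points then go through the final risk-analysis pass, which is another LLM call over this much smaller, already-condensed text.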
**Resilient Background State**: By default, a Chrome extension popup loses its state when it closes. FairTerms instead keeps its state in the background service worker, so even if the popup is closed while summarisation is in progress, reopening it shows the current progress and the final summary without restarting the run.
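The pattern can be sketched as a small store living in the service worker. This is a minimal sketch assuming Chrome's MV3 message-passing model; the `MessageBus` interface mirrors just the slice of `chrome.runtime` messaging we need, so the logic runs without a browser, and all names are illustrative.

```typescript
// State the popup needs to restore itself on reopen.
type SummaryState =
  | { phase: "idle" }
  | { phase: "running"; progress: number } // 0-1, drives the popup's bar
  | { phase: "done"; summaryMarkdown: string };

// Abstraction over chrome.runtime.sendMessage so the store is testable.
interface MessageBus {
  send(msg: { type: "STATE"; state: SummaryState }): Promise<void>;
}

// Lives in the background service worker, which outlives the popup.
export class SummaryStore {
  private state: SummaryState = { phase: "idle" };
  constructor(private bus: MessageBus) {}

  // The popup requests this snapshot on open (e.g. via a GET_STATE message),
  // so a mid-run reopen picks up exactly where the pipeline is.
  snapshot(): SummaryState {
    return this.state;
  }

  set(next: SummaryState) {
    this.state = next;
    // Fire-and-forget broadcast: swallow the error raised when no popup
    // is open to receive the update.
    this.bus.send({ type: "STATE", state: next }).catch(() => {});
  }
}
```

In the extension, `bus.send` would wrap `chrome.runtime.sendMessage`, and the service worker's `onMessage` listener would answer the popup's snapshot request with `store.snapshot()`.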
**Smart Content Extraction**: The content script extracts the relevant text and ignores irrelevant elements such as navigation bars and scripts, keeping the input to the LLM concise and clean.
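The extraction can be sketched as a DOM text walk. This is a minimal sketch using standard browser APIs (`TreeWalker`); the skip-list of boilerplate containers is illustrative, not the extension's exact rules, and it only runs in a page context, not under Node.

```typescript
// Tags whose text content is boilerplate rather than policy text.
const SKIP_TAGS = new Set(["SCRIPT", "STYLE", "NOSCRIPT", "NAV", "HEADER", "FOOTER", "ASIDE"]);

// Collect visible text from the page, skipping navigation chrome and
// whitespace-only nodes, so the LLM only sees the policy itself.
export function extractPolicyText(root: Node = document.body): string {
  const walker = document.createTreeWalker(root, NodeFilter.SHOW_TEXT, {
    acceptNode(node) {
      const parent = (node as Text).parentElement;
      // Reject text inside skipped tags or any nested boilerplate container.
      if (!parent || SKIP_TAGS.has(parent.tagName) || parent.closest("nav, header, footer, aside")) {
        return NodeFilter.FILTER_REJECT;
      }
      return node.textContent?.trim()
        ? NodeFilter.FILTER_ACCEPT
        : NodeFilter.FILTER_REJECT;
    },
  });
  const parts: string[] = [];
  for (let n = walker.nextNode(); n; n = walker.nextNode()) {
    parts.push(n.textContent!.trim());
  }
  return parts.join("\n");
}
```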
How we built it
We started by experimenting with Chrome extensions and trying out various frameworks to see what would fit the project and our current skill level, while also researching in-browser LLM technologies. We settled on React + TypeScript for the UI, since they are industry standards that integrate well with modern extension architecture. For the summarisation engine, we chose WebLLM: its well-documented API and variety of models gave us freedom of choice, and it let us run models directly in the browser without needing a server.
After deciding on the tech stack, we split the work into:
- **Frontend/UI**: Designing the popup interface, progress bar, and rendering the Markdown summaries.
- **Extension Infrastructure**: Handling background state, content scripts, and message passing between the popup and service worker.
- **AI Integration**: Building the pipeline to chunk T&Cs, run inference with WebLLM, and merge chunk summaries into final results.
Challenges we ran into
- **Chrome extension constraints**: Extensions reset state by default when a popup closes, so implementing background state persistence was tricky.
- **Model performance vs. latency**: Running an LLM inside the browser meant balancing speed and quality. Smaller models were fast but sometimes too shallow, while larger models risked stalling.
Accomplishments that we're proud of
- Building a fully functional Chrome extension within 72 hours that actually summarises real T&Cs.
- Implementing a resilient state system so that summaries aren’t lost when users close the popup.
- Creating a clear, structured summarisation pipeline that outputs not just summaries but also highlights potential risks for the user.
- Successfully integrating WebLLM for on-device AI, ensuring user privacy.
What we learned
- How to build and structure a Chrome extension with React + TypeScript.
- How to integrate WebLLM and manage inference fully in-browser.
- The importance of designing for the user journey: clear progress feedback, structured summaries, and risk highlights improved usability dramatically.
- How to collaborate effectively under time constraints, splitting tasks clearly and quickly debugging blockers.
What's next for FairTerms
- **More models & languages**: Supporting multilingual summarisation and giving users options for model size vs. speed.
- **Browser support**: Expanding beyond Chrome to Firefox and Edge.
- **Customisable summaries**: Letting users choose detail level (short highlights vs. in-depth explanations).
- **Mobile integration**: Extending FairTerms to mobile browsers where T&Cs are even harder to read.
Built With
- react
- typescript
- vite
- webllm