Inspiration
The inspiration came from the need for a more intelligent, integrated, and completely offline way to retain information from the web. Traditional flashcard apps require manual card entry and depend on the cloud, both of which are barriers to learning. We wanted to blend the power of an on-device Large Language Model (LLM) with the proven effectiveness of spaced repetition to create a seamless, zero-cost, low-friction tool. The goal was to build a 'digital memory assistant' that lives right in the browser, using only the Chrome Built-in AI for intelligence.
What it does
The "Chrome-Remember-LLM" is a Chrome extension designed to fully automate the process of creating and managing spaced-repetition study material directly from any webpage, running entirely on the user's device.
Key functionalities include:
- On-Device Content Acquisition: The extension grabs the text content of the current webpage and sends it to the local LLM.
- AI-Powered Fact Creation (Offline): The Chrome Built-in AI processes the acquired webpage content, automatically structuring it into useful hint and description pairs for study.
- LLM-Based Adaptive Testing: When a user reviews a fact, the local LLM is used to determine the accuracy of the user's reply, creating an intelligent and adaptive grading system that runs completely offline.
- Local Spaced-Repetition Engine: A client-side algorithm calculates the optimal review schedule, which is dynamically visualized using the Vis.js Timeline library.
- Private Data Storage: All user data and the learning timeline are stored securely and privately in the extension's local browser storage.
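The local spaced-repetition engine above can be sketched as a small SM-2-style update function. This is an illustrative sketch, not the extension's actual code: `myUpdateSchedule` and the card field names are hypothetical, and the grade scale (0 = forgot, 5 = perfect recall) follows classic SM-2.

```javascript
// Sketch of an SM-2-style spaced-repetition update (illustrative;
// myUpdateSchedule and the card fields are hypothetical names).
// grade: 0 (forgot) .. 5 (perfect recall), as in classic SM-2.
function myUpdateSchedule(card, grade) {
  let { eFactor = 2.5, repetition = 0, intervalDays = 0 } = card;

  if (grade < 3) {
    // Failed recall: restart the repetition sequence.
    repetition = 0;
    intervalDays = 1;
  } else {
    repetition += 1;
    if (repetition === 1) intervalDays = 1;
    else if (repetition === 2) intervalDays = 6;
    else intervalDays = Math.round(intervalDays * eFactor);
  }

  // Classic SM-2 ease-factor (E-Factor) adjustment, clamped at 1.3.
  eFactor = Math.max(
    1.3,
    eFactor + (0.1 - (5 - grade) * (0.08 + (5 - grade) * 0.02))
  );

  const nextReview = Date.now() + intervalDays * 24 * 60 * 60 * 1000;
  return { ...card, eFactor, repetition, intervalDays, nextReview };
}
```

The returned `nextReview` timestamp is what would feed the Vis.js timeline items.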
How we built it
The extension was built on a standard Chrome Extension architecture, prioritizing clarity, simplicity, and local execution:
- Simple Client-Side Code: The extension is built with basic HTML, minimal inline CSS, and simple, vanilla JavaScript for the UI and logic.
- Chrome Built-in AI Prompt API: This is the core engine. It handles all complex tasks: data extraction from the grabbed page content, summarization, hint/description generation, and the crucial step of evaluating the user's recall, all performed on-device by the local LLM. No external serverless function or API endpoint is required.
- Local Storage & vis-timeline.min.js: The scheduling logic and all user data are managed and stored locally within the extension's browser storage. The vis-timeline.min.js library provides the elegant, visual representation of the memory timeline.
- Coding Style Compliance: The JavaScript uses async/await for cleaner promise handling and descriptive camelCase variable and function names starting with "my" (e.g., myProcessPageText(), myMemoryTimeline), making the code transparent for students.
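The Prompt API flow described above can be sketched as follows. This is an illustrative sketch only: the Prompt API surface varies across Chrome versions (here assumed to expose a global `LanguageModel` with `create()` and `prompt()`), and `myCreateFactPairs`/`myParseFactPairs` are hypothetical helper names, not the extension's actual code.

```javascript
// Sketch: ask the on-device LLM for hint/description pairs as JSON.
// (Browser-only; assumes the Chrome Built-in AI Prompt API is available
// as a global LanguageModel object — an assumption, not a guarantee.)
async function myCreateFactPairs(pageText) {
  const mySession = await LanguageModel.create();
  const myReply = await mySession.prompt(
    "From the text below, extract the key facts as a JSON array: " +
    '[{"hint": "...", "description": "..."}]\n\n' + pageText
  );
  return myParseFactPairs(myReply);
}

// Pull the first JSON array out of a possibly chatty model reply and
// keep only well-formed hint/description pairs.
function myParseFactPairs(rawReply) {
  const myMatch = rawReply.match(/\[[\s\S]*\]/);
  if (!myMatch) return [];
  try {
    return JSON.parse(myMatch[0]).filter(
      (pair) =>
        typeof pair.hint === "string" && typeof pair.description === "string"
    );
  } catch {
    return []; // Unparseable output: fall back to an empty fact list.
  }
}
```

Defensive parsing like `myParseFactPairs` matters because the model may wrap its JSON in extra prose.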
Challenges we ran into
- Prompt Engineering for Structured Output: The greatest challenge was engineering a robust prompt to consistently force the local Chrome Built-in AI LLM to output both the structured fact pairs (Hint/Description) and a reliable assessment (Correct/Incorrect) of the user's recall, all in a format the simple JavaScript could parse.
- Accurate Fact Extraction (Zero-Shot): Getting the LLM to consistently and accurately extract only the most critical facts from a potentially cluttered webpage in a single, zero-shot call without manual filtering.
- Local Execution Constraints: Ensuring the entire memory and learning loop (Content Grab, Extraction, Scheduling, Review, and LLM Assessment) performs quickly and smoothly while relying entirely on the on-device AI and local browser resources.
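One piece of the structured-output challenge, turning the LLM's free-text verdict into something the scheduler can use, can be sketched as a small normalizer. This is a hypothetical helper (`myExtractVerdict` is an illustrative name), assuming the prompt asks the model to answer "Correct" or "Incorrect".

```javascript
// Sketch: normalize the local LLM's grading reply into a boolean
// (illustrative; assumes the prompt requests "Correct"/"Incorrect").
function myExtractVerdict(rawReply) {
  const myText = rawReply.trim().toLowerCase();
  // Check "incorrect" first, since it contains "correct" as a substring.
  if (myText.includes("incorrect")) return false;
  if (myText.includes("correct")) return true;
  return null; // Ambiguous reply: caller should re-prompt or ask the user.
}
```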
Accomplishments that we're proud of
- Fully Offline, LLM-Driven Learning Pipeline: We successfully built a tool that handles the entire study pipeline—from content acquisition to adaptive scheduling—using only the local Chrome Built-in AI. This provides unparalleled privacy and zero operating cost.
- Adaptive Learning Loop: Implementing the dynamic feedback loop where the local LLM's judgment instantly recalibrates the user's learning timeline is a major technical achievement that makes the learning process truly adaptive.
- Simplicity as a Feature: Successfully maintaining clean, simple JavaScript code while tackling such a complex problem allows the extension to serve as an ideal educational example for students learning how to integrate on-device AI.
What we learned
We gained significant expertise in prompt engineering for local LLMs, specifically how to structure complex, multi-stage requests (extraction, summarization, and assessment) into a single call that yields clean, parseable data. We also deepened our understanding of the spaced repetition algorithms and how to implement their principles (like the E-Factor) within a live, interactive, and client-side-only browser environment.
What's next for Chrome-Remember-LLM
- Local Data Management UI: Enhancing the side panel with better tools to view, edit, and categorize the locally stored flashcards.
- Visualizing Progress: Leveraging the Vis.js timeline to show the user's actual memory growth and retention rate in a visually compelling way.
- Content Script Refinement: Improving the content script's ability to extract text only from the most relevant sections of a page (e.g., <article> and <main>) to feed the local LLM cleaner data.
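The extraction preference described above could be sketched as follows. This is a simplified string-based illustration (`myPickRelevantHtml` and `myHtmlToText` are hypothetical names); a real content script would query the live DOM instead, e.g. via document.querySelector("article, main").

```javascript
// Sketch: prefer <article>, then <main>, then <body> when choosing
// which part of a page to send to the local LLM (illustrative only;
// a real content script would walk the live DOM, not raw HTML strings).
function myPickRelevantHtml(pageHtml) {
  for (const myTag of ["article", "main", "body"]) {
    const myMatch = pageHtml.match(
      new RegExp("<" + myTag + "[^>]*>([\\s\\S]*?)</" + myTag + ">", "i")
    );
    if (myMatch) return myMatch[1];
  }
  return pageHtml; // No recognizable container: use everything.
}

// Strip tags and collapse whitespace to get plain text for the LLM.
function myHtmlToText(htmlFragment) {
  return htmlFragment.replace(/<[^>]+>/g, " ").replace(/\s+/g, " ").trim();
}
```

Feeding the LLM only the main content keeps navigation menus and footers from polluting the generated fact pairs.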
Built With
- built-in-ai
- chrome
- html
- javascript
- llm
- remember
