Welcome to Simplext: It's simple text
Whether you highlight a passage or want the full article, our extension shows up in the context menu.
The red box contains a summary of the full article, as requested through the extension.
The red box restates the highlighted passage in simpler terms (aimed at a late middle-school / early high-school reading level).
This is the response our chatbot gives (in gray), using the article as context and interpreting our question (in blue).
Inspiration
We were reading a lot of papers for our classes and realized how difficult and hard to interpret some technical resources can be. Considering this also from the perspective of people who are just learning English, or of young readers, we wanted to build an easy-to-use application that makes understanding online resources easier.
What it does
Simplext is a Chrome extension that parses a page and lets the user request simplifications and summarizations of selected text, or of the entire page, through the context menu. There is also an AI chatbot that takes questions about the page and tries to answer them. After each query, we highlight the parts of the page related to the AI's response, so the user can see the context behind it.
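The highlighting step can be sketched as a simple relevance match between the model's response and the page's text blocks. This is a toy Python illustration under our own assumptions (the real extension works on DOM nodes in JavaScript, and `find_related_blocks` is a hypothetical helper, not the project's actual code):

```python
import re

def find_related_blocks(page_blocks, response, top_k=2):
    """Rank plain-text page blocks by word overlap with the AI response.

    A stand-in for the highlighting step: score each block by the
    fraction of its words that also appear in the response, and return
    the indices of the top_k best matches.
    """
    resp_words = set(re.findall(r"[a-z']+", response.lower()))
    scored = []
    for i, block in enumerate(page_blocks):
        words = set(re.findall(r"[a-z']+", block.lower()))
        overlap = len(words & resp_words) / (len(words) or 1)
        scored.append((overlap, i))
    scored.sort(reverse=True)  # highest overlap first
    return [i for _, i in scored[:top_k]]
```

The returned indices map back to the insertion locations the extension recorded when it extracted the page, which is what makes the in-page highlighting possible.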
How we built it
We use the Chrome extension API to extract the page's text and the insertion location of each block of text, then send that to our backend, which is built with Flask and hosted on Google Cloud. The backend preprocesses the page and extracts the key text. This is sent, along with the user's query (simplification, summarization, or Q&A), to our model to generate a response. The model is built on OpenAI's GPT-3 Davinci model with additional prompt engineering to enable few-shot learning, especially for the simplification queries. Once the model generates a response, we send it back to the Chrome extension, which displays it in the appropriate format.
Challenges we ran into
The main challenge we ran into was the limit on the number of tokens we could pass to the model per prompt, which bit us mainly because few-shot examples take up part of the budget. To fix it, we had to find ways to compress the text into fewer tokens to prevent overflow while preserving its meaning.
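The shape of the fix can be sketched as a budget guard: GPT-3 era models shared a fixed token limit between the prompt and the completion, so the page text had to be cut to fit. This uses a crude ~4-characters-per-token heuristic and keeps the start and end of the text; it is an illustration of the idea, not the project's actual compression:

```python
def fit_to_budget(text, max_tokens=2000, chars_per_token=4):
    """Trim text to a rough token budget.

    Approximates tokens as ~4 characters each (a common rule of thumb
    for English); if the text is over budget, keep the beginning and
    the end and drop the middle, since key context often lives there.
    """
    max_chars = max_tokens * chars_per_token
    if len(text) <= max_chars:
        return text
    half = max_chars // 2
    return text[:half] + " ... " + text[-half:]
```

A real implementation would count tokens with the model's actual tokenizer rather than a character heuristic, but the overflow-prevention logic is the same.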
Accomplishments that we're proud of
We are most proud of getting the full pipeline, from opening a page to displaying results, fully functional, as well as integrating an attractive UI into our extension.
What we learned
We learned how to create interactive Chrome extensions and how to use large language models effectively, especially through prompt engineering.
What's next for Simplext
We think that Simplext could fine-tune the model with more data and better examples. Additionally, we could add more features to aid understanding, like direct word synonyms.