TO RUN:

  1. Clone the server-side GitHub repository and run it on your local machine
  2. Clone the client-side GitHub repository (the Chrome Extension) to interact with the application
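A minimal run recipe might look like the following; the repository URL, entry-point filename, and port are placeholders, not the actual repo layout:

```shell
# Hypothetical commands -- repo name, entry point, and port are illustrative.
git clone https://github.com/<user>/what-server.git
cd what-server
pip install falcon nltk   # backend dependencies named in "How I built it"
python app.py             # start the Falcon API locally

# In a second terminal, expose the local API via ngrok port-forwarding:
ngrok http 8000
```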

Inspiration

In Montreal, struggling to learn French is a common anglophone experience, further hampered by limited time and resources. 'What?' was developed with this problem in mind, and aims to deliver a frictionless French language learning experience for English speakers.

What it does

'What?' operates as a Google Chrome Extension which runs on every webpage the user visits, seamlessly translating select pieces of English onscreen text (both words and sentences) to French, and inserting the French text (now highlighted) into the webpage. The user can then hover over the highlighted French text to reveal the English translation.

The user can exercise a high degree of control over the type of content they'd like to see by using the personalization options in the Chrome Extension popup menu. Users can self-select their desired difficulty level, resulting in more difficult pieces of text being selected for translation. Users can also select how often they'd like to see translated text. Lastly, users can choose which parts of language they'd most like to focus on by selecting any combination of nouns, adjectives, verbs and adverbs for translation.
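A sketch of how those popup preferences could drive word selection on the backend; the field names, function name, and difficulty scale here are hypothetical, not taken from the actual codebase:

```python
# Hypothetical preference model and selection filter -- names and the
# 1-5 difficulty scale are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Preferences:
    difficulty: int = 2            # 1 (easy) .. 5 (hard)
    frequency: float = 0.3         # fraction of eligible words to translate
    word_types: set = field(default_factory=lambda: {"noun", "verb"})

def select_for_translation(words, prefs):
    """Keep words matching the chosen parts of speech and difficulty cap,
    then thin the survivors out to the requested translation frequency."""
    eligible = [w for w, word_type, difficulty in words
                if word_type in prefs.word_types
                and difficulty <= prefs.difficulty]
    if prefs.frequency <= 0:
        return []
    step = max(1, round(1 / prefs.frequency))
    return eligible[::step]

words = [("house", "noun", 1), ("run", "verb", 1),
         ("ephemeral", "adjective", 5), ("quickly", "adverb", 2)]
print(select_for_translation(words, Preferences(difficulty=2, frequency=1.0)))
# -> ['house', 'run']
```

The same filter generalizes to any combination of the four word types the popup exposes, with `frequency` acting as a simple sampling rate over the eligible words.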

How I built it

Built a Python API with Falcon, run on a local machine and exposed via port-forwarding with ngrok. Built separate API routes for translating entire sentences, individual words, and any word a user highlights on the web page. Parsed a large database of root words (i.e., dictionary entries) and their relative difficulties. Used nltk (Natural Language Toolkit) for NLP analysis of the main body of text: tokenization to break the text into individual words or sentences; part-of-speech tagging to classify individual words as nouns, verbs, adjectives, or adverbs; and lemmatization to strip suffixes and reduce each word to its root (e.g. 'jumping' -> 'jump') so it could be matched against the dataset of root words and their difficulties. Integrated these analyses into selection logic that intelligently chooses words from the main body of text to translate, according to the user-selected word types (noun, verb, adjective, adverb), word difficulty, and translation frequency.

Challenges I ran into

Backend - Confirming that user-submitted preferences were properly reflected in the backend selection logic, and correctly integrating that logic with the supporting English-language database and frequency data.

Built With

Python, Falcon, nltk, ngrok, Google Chrome Extension APIs
