Quizlet offers many ways to study flash cards, and we decided to explore auditory learning with them.

What it does

Choose and tag Quizlet sets with keywords in a web application, then ask Alexa to read you one side of a card and quiz you on which of two choices is the other side.

How we built it

We divided into two groups: one group built the web app, and one person handled the Alexa integration. The app group was responsible for the UI and for uploading flash card sets to Firebase in a format the Alexa side could consume. The Alexa side was responsible for interpreting voice inputs and using them to access the different sets stored on Firebase. It took a lot of reading through documentation, especially on the Alexa side.
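To illustrate the hand-off, here is a minimal sketch of how a set stored as term/definition pairs could be turned into a two-choice audio question. The field names and the helper itself are our own for illustration, not the project's actual code:

```python
import random

def build_question(cards, rng=random):
    """Pick one card and build a two-choice question from a set of
    {"term": ..., "definition": ...} dicts (hypothetical format)."""
    card = rng.choice(cards)
    # Distractor: a definition taken from any other card in the set.
    distractor = rng.choice([c["definition"] for c in cards if c is not card])
    choices = [card["definition"], distractor]
    rng.shuffle(choices)
    prompt = (
        f"What matches '{card['term']}'? "
        f"Option one: {choices[0]}. Option two: {choices[1]}."
    )
    return prompt, card["definition"]

cards = [
    {"term": "gato", "definition": "cat"},
    {"term": "perro", "definition": "dog"},
    {"term": "pájaro", "definition": "bird"},
]
prompt, answer = build_question(cards)
```

The Alexa side would speak `prompt` and compare the user's reply against `answer`.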

Challenges we ran into

Most of our challenges involved correctly interfacing with Alexa, but we also ran into issues formatting the data sent to Firebase.
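The formatting issues mostly came down to agreeing on a shared schema between the two halves. A sketch of the kind of normalization we mean, using hypothetical keys and layout (the project's actual schema differed):

```python
import json

def to_firebase_payload(set_id, title, keywords, cards):
    """Normalize a card set into a plain JSON-serializable dict, the kind of
    shape Firebase's REST API accepts (keys and layout here are hypothetical)."""
    return {
        "sets": {
            set_id: {
                "title": title,
                "keywords": list(keywords),
                # Store cards as simple term/definition pairs so the Alexa
                # side can iterate over them without extra parsing.
                "cards": [{"term": t, "definition": d} for t, d in cards],
            }
        }
    }

payload = to_firebase_payload(
    "spanish-101", "Spanish Basics", ["spanish", "vocab"],
    [("gato", "cat"), ("perro", "dog")],
)
body = json.dumps(payload)  # the JSON body that would be written to the database
```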

Accomplishments that we're proud of

Our hack has two independently designed parts that work together seamlessly.

What we learned

One of the biggest things we learned was how to interface with a variety of APIs, and that even large companies offer very accessible developer tools that anyone can use.

Getting the webapp

For whatever reason, Devpost thinks the link is invalid, so we added it here.

What's next for Quizlexa

We don't know! Perhaps the ability to work through a whole set of cards more seamlessly, without having to worry about repeated questions. We're also excited to see whether Amazon adds dynamic slots to their Alexa API.
