Loquela is an iOS app that helps users learn the foreign-language vocabulary they are most likely to encounter in real life. Instead of drilling a set of obscure, rarely used Spanish terms (like the ones I memorize in my high school Spanish class), Loquela helps travelers and students studying abroad practice vocabulary on the objects they actually see in front of them every day.

What it does

Loquela detects and recognizes objects placed in front of the camera in real time. It then translates each object's name into a language selected by the user and displays the text in Augmented Reality on the phone screen. Loquela also generates a set of flash cards from the translated words, so users can study them later to further build their vocabulary.
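The flash-card step described above can be sketched as a small data model: each detected object label is paired with its translation, duplicates are dropped, and the result becomes a study deck. This is a minimal sketch under assumed names (`FlashCard`, `buildDeck`, and the translation dictionary are hypothetical, not Loquela's actual code):

```swift
// Hypothetical sketch of Loquela's flash-card generation step.
// A FlashCard pairs a detected object's name with its translation.
struct FlashCard: Equatable {
    let term: String        // word in the source language, e.g. "cup"
    let translation: String // word in the user's selected language
}

// Build a deck from detected labels, skipping duplicates and
// labels that have no entry in the translation dictionary.
func buildDeck(detectedLabels: [String],
               translations: [String: String]) -> [FlashCard] {
    var seen = Set<String>()
    var deck: [FlashCard] = []
    for label in detectedLabels {
        guard !seen.contains(label),
              let translated = translations[label] else { continue }
        seen.insert(label)
        deck.append(FlashCard(term: label, translation: translated))
    }
    return deck
}
```

For example, detecting "cup" twice and "chair" once would produce a two-card deck, one card per unique object.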

How I built it

I built Loquela in Swift. For object detection and recognition, I used Inception v3, a pre-trained image-classification model that Apple distributes in Core ML format. For the AR features, I used Apple's ARKit. The rest (such as the flash card features) is built with standard UIKit components like collection views and table views.
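Image classifiers like Inception v3 return a ranked list of labels with confidence scores, so a practical step before translating and displaying a label is to filter out low-confidence guesses. Below is a minimal sketch of that filtering with hypothetical names (`Prediction`, `bestLabel`); in a real Core ML + Vision pipeline the results would arrive as `VNClassificationObservation` values, which this plain struct stands in for:

```swift
// Hypothetical sketch of filtering classifier output before display.
// Each prediction is a label plus the model's confidence (0...1).
struct Prediction {
    let label: String
    let confidence: Double
}

// Return the top label only if the model is confident enough;
// otherwise return nil so the app shows nothing instead of a bad guess.
func bestLabel(from predictions: [Prediction],
               threshold: Double = 0.5) -> String? {
    guard let top = predictions.max(by: { $0.confidence < $1.confidence }),
          top.confidence >= threshold else { return nil }
    return top.label
}
```

Thresholding like this trades coverage for accuracy: the AR label only appears once the model is reasonably sure what it is looking at.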

Challenges I ran into

Combining the ARKit and Core ML frameworks was tricky, and I stayed up coding my project for the entire hackathon.

Accomplishments that I'm proud of


What I learned

How to use machine learning in Swift with Core ML, and a stronger grasp of the standard UIKit components.

What's next for Loquela

Shifting to a web platform, and adding more ways to learn vocabulary beyond the flash cards feature.
