ALIS: Augmented Language Immersion System
яблоко, 苹果, pomme, manzana, apple.
Languages are difficult to learn from the isolated confines of a textbook. No amount of study can replace true immersion. However, it is often difficult to find an immersive environment locally and impractical to travel globally. Augmented reality offers an opportunity to overlay an immersive environment onto our everyday world.
Using Google Cardboard, an inexpensive virtual reality device, we created the Augmented Language Immersion System (ALIS), an immersive environment where users can learn the words for the objects around them. Looking through the lens of Google Cardboard, ALIS uses MetaMind's image-classification machine learning framework to identify the objects in view, then uses the Microsoft Translator API to name those objects in any supported language. ALIS turns the world into an immersive learning experience, so you are never limited by your surroundings.
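The core loop described above — grab a camera frame, classify it, translate the label, draw it over the view — can be sketched as follows. This is a minimal, self-contained sketch, not the app's actual code: `classify` and `translate` are hypothetical stand-ins for the MetaMind and Microsoft Translator HTTP calls (the real APIs require keys and network access), and the tiny in-memory dictionary exists only so the example runs on its own.

```java
import java.util.Map;

/** Sketch of the ALIS pipeline: camera frame -> object label -> translated label. */
public class AlisPipeline {

    /** Stand-in for the MetaMind image-classification call.
     *  The real app would POST the camera frame to the vision API and
     *  parse the top-1 label out of the JSON response. */
    static String classify(byte[] frame) {
        return "apple"; // hypothetical stub: always "recognizes" an apple
    }

    /** Stand-in for the Microsoft Translator call.
     *  The real app would hit the Translate endpoint with a target
     *  language code; here a tiny in-memory dictionary is used instead. */
    static String translate(String label, String targetLang) {
        Map<String, Map<String, String>> dict = Map.of(
                "fr", Map.of("apple", "pomme"),
                "es", Map.of("apple", "manzana"));
        // Fall back to the English label if no translation is known.
        return dict.getOrDefault(targetLang, Map.of()).getOrDefault(label, label);
    }

    /** One iteration of the render loop: produce the text to overlay
     *  on the camera pass-through for the current frame. */
    static String labelFrame(byte[] frame, String targetLang) {
        String english = classify(frame);
        return english + " -> " + translate(english, targetLang);
    }

    public static void main(String[] args) {
        byte[] frame = new byte[0]; // placeholder for a real camera frame
        System.out.println(labelFrame(frame, "fr")); // apple -> pomme
    }
}
```

In the real app this loop runs per frame inside the Cardboard renderer, with the translated label drawn over the stereo camera pass-through.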
Built with:
- MetaMind Image Classification - https://www.metamind.io/vision/general
- Microsoft Translator API - http://www.microsoft.com/translator/api.aspx
- Google Cardboard - https://developers.google.com/cardboard/
- Android - https://developer.android.com
- OpenCV - http://opencv.org
Team:
- Brent Schlotfeldt
- Kate Tolstaya
- Leah Xu
Created at Bitcamp 2015, the University of Maryland, College Park's hackathon.
This project used Google's cardboard-java (https://github.com/googlesamples/cardboard-java) and Sveder's CardboardPassthrough (https://github.com/Sveder/CardboardPassthrough) as the base for implementing the 3D camera pass-through functionality.