What is recognEYES?

An iPhone app that engages children who are in the early stages of learning to speak and read with the world around them. When the child taps an object in the camera view, recognEYES names what's in front of them, both aloud and on screen.
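
The flow below is a minimal sketch of that tap-to-narrate interaction, not the project's actual code: the view controller layout, outlet names, and the `recognizeObject` placeholder are all assumptions, with the speech output handled by AVSpeechSynthesizer.

```swift
import UIKit
import ARKit
import AVFoundation

// Sketch: tap the camera view, grab the current ARKit frame, recognize the
// object, then show and speak the result. Assumes storyboard-connected outlets.
class CameraViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!
    @IBOutlet var wordLabel: UILabel!
    private let synthesizer = AVSpeechSynthesizer()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Tap anywhere in the camera view to identify the object in front of the child.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        sceneView.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ sender: UITapGestureRecognizer) {
        // Grab the current camera frame from the ARKit session.
        guard let frame = sceneView.session.currentFrame else { return }
        recognizeObject(in: frame.capturedImage) { [weak self] word in
            DispatchQueue.main.async {
                // Visual feedback: show the word on screen.
                self?.wordLabel.text = word
                // Audio feedback: say the word aloud.
                let utterance = AVSpeechUtterance(string: word)
                utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
                self?.synthesizer.speak(utterance)
            }
        }
    }

    // Hypothetical placeholder for the image-recognition call
    // (see the sketch under "What we used").
    private func recognizeObject(in pixelBuffer: CVPixelBuffer,
                                 completion: @escaping (String) -> Void) {
        completion("apple")   // stand-in result
    }
}
```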

Purpose

To facilitate the early stages of learning in an engaging and effective way. recognEYES builds on what is already familiar to the child, their immediate surroundings, and engages multiple senses for a richer learning experience. It also helps children whose parents are occupied with work, or are not yet fluent in the target language, build essential language foundations at this crucial age.

What we used

We built the app in Swift with Xcode and Apple's ARKit, and used the Computer Vision API and Translator API from Microsoft Cognitive Services.
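
As an illustration of how the recognition step can work, here is a minimal sketch of calling the Computer Vision "analyze" REST endpoint from Swift. The region, API version, key handling, and response parsing are assumptions based on the public Cognitive Services documentation, not the project's actual networking code.

```swift
import Foundation

// Send a JPEG of the tapped camera frame to the Computer Vision "analyze"
// endpoint and return the caption text from the response.
let visionKey = "<your-computer-vision-key>"   // placeholder
let visionURL = URL(string:
    "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze?visualFeatures=Description")!

func describeImage(_ jpegData: Data, completion: @escaping (String?) -> Void) {
    var request = URLRequest(url: visionURL)
    request.httpMethod = "POST"
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
    request.setValue(visionKey, forHTTPHeaderField: "Ocp-Apim-Subscription-Key")
    request.httpBody = jpegData   // JPEG bytes of the tapped camera frame

    URLSession.shared.dataTask(with: request) { data, _, _ in
        // Expected shape: {"description": {"captions": [{"text": "..."}]}, ...}
        guard let data = data,
              let object = try? JSONSerialization.jsonObject(with: data),
              let json = object as? [String: Any],
              let description = json["description"] as? [String: Any],
              let captions = description["captions"] as? [[String: Any]],
              let caption = captions.first?["text"] as? String
        else { completion(nil); return }
        completion(caption)   // e.g. "a red apple on a table"
    }.resume()
}
```

A single tag from the response could drive word mode, while the full caption could drive phrase mode.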

Features

  • Word and phrase modes for different levels of learning.
  • Multiple target-language options to support bilingual and immigrant households, as well as children who want to learn another language (see the sketch after this list).
  • Audio feedback to add another layer of learning and to help with pronunciation.
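
The sketch below shows how the target-language and audio-feedback features can fit together: translate the recognized word with the Translator API, then read it aloud in that language. The Translator Text API v3.0 endpoint, JSON shapes, and the `speak` helper are assumptions based on the public documentation; the app's real language handling may differ.

```swift
import AVFoundation
import Foundation

let translatorKey = "<your-translator-key>"   // placeholder
let speech = AVSpeechSynthesizer()

// Translate a word into the chosen target language via the Translator API.
func translate(_ word: String, to language: String, completion: @escaping (String?) -> Void) {
    var components = URLComponents(string: "https://api.cognitive.microsofttranslator.com/translate")!
    components.queryItems = [
        URLQueryItem(name: "api-version", value: "3.0"),
        URLQueryItem(name: "to", value: language)      // e.g. "es" for Spanish
    ]
    var request = URLRequest(url: components.url!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.setValue(translatorKey, forHTTPHeaderField: "Ocp-Apim-Subscription-Key")
    request.httpBody = try? JSONSerialization.data(withJSONObject: [["Text": word]])

    URLSession.shared.dataTask(with: request) { data, _, _ in
        // Expected shape: [{"translations": [{"text": "...", "to": "es"}]}]
        guard let data = data,
              let object = try? JSONSerialization.jsonObject(with: data),
              let results = object as? [[String: Any]],
              let translations = results.first?["translations"] as? [[String: Any]],
              let translated = translations.first?["text"] as? String
        else { completion(nil); return }
        completion(translated)
    }.resume()
}

// Audio feedback: a BCP-47 voice (e.g. "es-MX") models native pronunciation.
func speak(_ word: String, languageCode: String) {
    let utterance = AVSpeechUtterance(string: word)
    utterance.voice = AVSpeechSynthesisVoice(language: languageCode)
    speech.speak(utterance)
}

// Usage, e.g. from the tap handler: translate a word to Spanish and say it aloud.
func narrate(_ englishWord: String) {
    translate(englishWord, to: "es") { translated in
        guard let translated = translated else { return }
        DispatchQueue.main.async { speak(translated, languageCode: "es-MX") }
    }
}
```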

Developers

Brian Li, Christine Wang, Barbara Xiong, and Daniel Zhou, a team of students at Duke University.

Built With

  • Swift and Xcode
  • Apple ARKit
  • Microsoft Cognitive Services (Computer Vision API and Translator API)