Inspiration

EmojiScope was mainly inspired by futuristic augmented reality showcases, which often provide the user with more specialized information than they are able to see with the naked eye.

What it does

EmojiScope allows the user to look through their phone's digital viewfinder and view the world as though everything were represented by emojis.

How we built it

We built upon TensorFlow's Detect Demo for Android so that objects could be recognized through the phone's camera and correctly labeled. Once an object was detected and named, we overlaid the matching emoji from Twitter's "Twemoji" set on top of it, scaling each emoji to obscure its object as much as possible.
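The overlay step boils down to mapping a detector label to a Twemoji sprite and scaling that sprite to cover the detection's bounding box. Below is a minimal plain-Java sketch of that logic; the label set, the `EMOJI` map, and the `coverScale` helper are illustrative assumptions, not code from the actual app, and the real version would draw onto an Android `Canvas` instead of printing.

```java
import java.util.Map;

public class EmojiOverlay {
    // Hypothetical mapping from detector labels to Twemoji codepoints.
    static final Map<String, String> EMOJI = Map.of(
            "dog", "1f436",
            "car", "1f697",
            "person", "1f9d1");

    // Twemoji assets are square. Return the factor that scales a sprite of
    // side `spriteSize` so it covers the longer side of the bounding box
    // (and therefore obscures the object as much as possible).
    static float coverScale(float left, float top, float right, float bottom,
                            float spriteSize) {
        float w = right - left;
        float h = bottom - top;
        return Math.max(w, h) / spriteSize;
    }

    public static void main(String[] args) {
        // A 72 px sprite over a 144x96 px detection box is scaled 2x.
        System.out.println(EMOJI.get("dog"));          // "1f436"
        System.out.println(coverScale(0, 0, 144, 96, 72)); // 2.0
    }
}
```

Scaling to the longer box side means the emoji overflows the shorter side slightly, which is the desired behavior when the goal is hiding the object rather than fitting inside it.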

Challenges we ran into

Interfacing with TensorFlow correctly and efficiently, handling batch image conversion properly, and wrangling unruly version control.

Accomplishments that we're proud of

We were able to overlay information on top of reality in meaningful quantities, proving that augmented reality is a technology that can be fully realized and employed without specialized hardware.

What we learned

TensorFlow allows powerful visual and vocal analysis systems to be seamlessly integrated into many applications, and the augmented reality these systems provide has seemingly endless potential.

What's next for EmojiScope

EmojiScope may soon sport a viewfinder capable of recording or saving photos of the augmented reality it produces. On top of that, it may have uses in educational settings for toddlers or others who have trouble generalizing objects that they see in real life.
