What inspired you to build this app/project, and what does it do?
YouAnchor is an iPhone X-first app that lets you be your own avatar anchor in a stylized version of the real world. Use your face to puppeteer your personalized avatar. Star in your own production with a scenic view of wherever you’d like to share (powered by WRLD3D). Live stream it when you’re ready!
Choose from different camera filters to set the scene and mood. From a stylized black-and-white sketch to oil paints and pastel cities to giant Times Square LEDs - or back to the real world and more.
Be careful - in ZOMBIE and MONSTER filter modes, real zombies and monsters will show up on the buildings where you tap. (But, don’t worry, you get finger powers to zap them away!)
Of course, it wouldn’t be your world if you couldn’t put what you want in it. Touch the Polygon (Dodecahedron) icon to go from Location Search to Google Poly Mode, where you can search thousands of 3D models to add to your scene (and upload your own)!
Initially I started with something darker: a tad sci-fi, reminiscent of Ingress meets that video game where the characters used maps by projecting a holographic city. I mocked up an ARKit world-facing flickering hologram AR city, and then I looked at the contest prize categories of “Most Useful” and “Most Immersive”. It didn’t seem very useful, other than to impress your friends and make real life seem like science fiction. It also didn’t seem creative enough, nor did it spread creativity. Indeed, it seemed the epitome of indoctrination.
I started playing with a citizen journalism idea and mixed that with my work on real-time face tracking to puppeteer avatars. For a stylistic look, I went with cartoonish sprites that required their own animation system, rather than just mapping the blend shape modifiers onto a 3D model. While the app currently only works with the iPhone X due to its face tracking features, I kept the parameters minimal so that I could later adapt this to work with any other iPhone through OpenCV/DLIB.
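The sprite-driven approach can be sketched as a pure mapping from face-tracking blend-shape coefficients (reported in the 0.0–1.0 range, as ARKit does) to discrete sprite frames. The coefficient keys follow ARKit's blend-shape naming, but the thresholds, frame names, and structure below are illustrative assumptions, not the app's actual implementation:

```swift
// Sketch: map blend-shape coefficients (0.0...1.0) to discrete sprite
// frames for a cartoon avatar. Keys follow ARKit's blend-shape names
// (jawOpen, eyeBlinkLeft, eyeBlinkRight); thresholds and frame names
// are illustrative assumptions.

enum MouthFrame { case closed, half, open }
enum EyeFrame { case open, closed }

struct AvatarFrame: Equatable {
    let mouth: MouthFrame
    let leftEye: EyeFrame
    let rightEye: EyeFrame
}

func avatarFrame(from coefficients: [String: Float]) -> AvatarFrame {
    let jawOpen = coefficients["jawOpen"] ?? 0
    let blinkLeft = coefficients["eyeBlinkLeft"] ?? 0
    let blinkRight = coefficients["eyeBlinkRight"] ?? 0

    // Quantize the continuous jaw coefficient into three mouth sprites.
    let mouth: MouthFrame
    switch jawOpen {
    case ..<0.15: mouth = .closed
    case ..<0.5:  mouth = .half
    default:      mouth = .open
    }

    // Eyes are binary: a blink coefficient past the threshold swaps
    // in the closed-eye sprite.
    return AvatarFrame(
        mouth: mouth,
        leftEye: blinkLeft > 0.6 ? .closed : .open,
        rightEye: blinkRight > 0.6 ? .closed : .open
    )
}
```

Keeping the mapping down to a handful of coefficients like this is what makes the later OpenCV/DLIB port plausible: any tracker that can estimate mouth openness and blinks can drive the same sprites.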
I also wanted to include video footage of “the real world”. Unfortunately, ARKit currently does not maintain its tracking when switching between the front and rear cameras. This meant that every time the user switched from “talking as their avatar” to “filming the real world”, they’d have to re-track.
This app is another evolution in a series of AR camera / “pro-sumer” content creation apps I’ve been building for the last year. Each one has a different mix and focus.
The name was initially “Anchor Yourself” and then “Anchor You”. YouAnchor.cam turned out to be available - and it reflected the nature of the app a lot more… because it’s about making videos (and live streaming) with YOU, the anchor!
- WRLD3D :)
- Google Maps Geocoding API (to turn an address into lat/lng)
- Unity 2017.3.0f3 via RealityScript Framework
- ARKit Face Tracking (iPhone X only)
- Google Poly SDK (3D model repo!)
- ReplayKit (live streaming, iOS framework)
- (Coming soon) OpenCV + DLIB for other phones and Android
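As a sketch of the address-to-lat/lng step above, the Google Maps Geocoding API takes a free-form address and an API key as query parameters and returns coordinates as JSON. The function and type names below are hypothetical; the endpoint and parameter names follow the public Geocoding API, and the API key is a placeholder:

```swift
import Foundation

// Sketch: build a Google Maps Geocoding API request URL that resolves a
// free-form address to lat/lng. URLComponents percent-encodes the query
// for us; "apiKey" is a placeholder for a real key.
func geocodeURL(address: String, apiKey: String) -> URL? {
    var components = URLComponents(string: "https://maps.googleapis.com/maps/api/geocode/json")
    components?.queryItems = [
        URLQueryItem(name: "address", value: address),
        URLQueryItem(name: "key", value: apiKey)
    ]
    return components?.url
}

// The response carries coordinates at results[0].geometry.location,
// which a minimal Codable model can decode:
struct GeocodeResponse: Codable {
    struct Result: Codable {
        struct Geometry: Codable {
            struct Location: Codable {
                let lat: Double
                let lng: Double
            }
            let location: Location
        }
        let geometry: Geometry
    }
    let results: [Result]
}
```

The decoded lat/lng is what a WRLD3D-style map view would then be centered on.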
Art - open source and/or used with permission
- Avatar Art: Avataaars by Pablo Stanley
- Cat Glasses by Yosun Chang (me, my glasses for my avatar)
- MONSTER filter 3D model: “Octopus” by Poly by Google (Creative Commons CC-BY): https://poly.google.com/view/9-b6-yqr...
- ZOMBIE filter 3D model: “Santa zombie” by Nouri Mohamed (Creative Commons CC-BY): https://poly.google.com/view/1kZBKDq_...
See my livestream from a kitty cafe in San Francisco! https://www.youtube.com/watch?v=awv1etRmz7k