Inspiration

Augmented reality (AR) is a new technology that is quickly gaining attention. There are few existing AR projects, and forums are full of developers asking how to start one, especially with live video. Most of these questions go unanswered, so we decided to build our own AR application that incorporates live streaming into the virtual environment. We hope to bring AR to video conferencing and video chatting.

What it does

Our application is a proof of concept that renders a live streamed video feed as a texture while removing the feed's background to isolate the person in front of the camera. On the receiver's Android device, the application detects horizontal planes in the camera image; when the receiver taps one on the touchscreen, the sender's video texture is placed at that spot in the receiver's field of view, in a holographic style.
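
For the texture half of that pipeline, Unity's built-in VideoPlayer component can decode a network stream into a RenderTexture that a material then samples. The following is a minimal sketch under that assumption, not our exact code: the URL is a placeholder, and Unity's support for true live protocols such as HLS varies by platform.

    using UnityEngine;
    using UnityEngine.Video;

    // Renders a network video stream into a RenderTexture so that any
    // material (e.g. the quad later placed in AR) can sample it.
    public class StreamToTexture : MonoBehaviour
    {
        public string streamUrl = "https://example.com/stream.mp4"; // placeholder URL
        public Renderer targetRenderer;                             // quad showing the feed

        void Start()
        {
            var player = gameObject.AddComponent<VideoPlayer>();
            player.source = VideoSource.Url;
            player.url = streamUrl;

            // Draw the decoded frames into a RenderTexture...
            var texture = new RenderTexture(1280, 720, 0);
            player.renderMode = VideoRenderMode.RenderTexture;
            player.targetTexture = texture;

            // ...and let the quad's material sample that texture.
            targetRenderer.material.mainTexture = texture;

            // Route the stream's audio through an AudioSource on this object.
            var audio = gameObject.AddComponent<AudioSource>();
            player.audioOutputMode = VideoAudioOutputMode.AudioSource;
            player.SetTargetAudioSource(0, audio);

            player.Play();
        }
    }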

How we built it

We built this application largely in Unity, using C# and ARCore. We set up a live stream and linked it to our AR application, which takes the video feed and renders it, including sound, as a texture on a specified object (2D or 3D). We applied this texture as a plane's mesh material and used a chroma key shader to remove the green-screen background behind the sender (a configuration sketch follows below). On the receiver's phone, a C# script tracks the planes ARCore detects and listens for the user to tap one; when the receiver taps a plane, the application places the video-stream cutout where the tap landed (also sketched below). More than one stream can be placed, not just one, so the approach can scale to digital conferences.
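
The chroma key itself lives in the shader, which compares each pixel against a key color and discards near matches. A small C# helper can feed the shader its parameters; this is a hedged sketch, and the property names _KeyColor and _Threshold are hypothetical stand-ins for whatever the shader actually declares.

    using UnityEngine;

    // Configures a chroma-key material at runtime. The shader is assumed to
    // compare each pixel against _KeyColor and discard it when the difference
    // falls below _Threshold; both property names are hypothetical and must
    // match what the shader actually declares.
    public class ChromaKeySettings : MonoBehaviour
    {
        public Renderer videoQuad;                     // quad showing the video texture
        public Color keyColor = Color.green;           // green-screen background color
        [Range(0f, 1f)] public float threshold = 0.3f; // how close a pixel must be to be cut

        void Start()
        {
            var material = videoQuad.material;
            material.SetColor("_KeyColor", keyColor);
            material.SetFloat("_Threshold", threshold);
        }
    }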

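Plane tracking and tap-to-place follow the pattern of the HelloAR sample in Google's ARCore SDK for Unity. The sketch below assumes that SDK (the GoogleARCore namespace) and a hypothetical videoQuadPrefab carrying the chroma-keyed video material.

    using GoogleARCore;
    using UnityEngine;

    // Listens for a touch, raycasts against ARCore's detected planes, and
    // anchors a copy of the video quad where the ray hits. Multiple taps
    // place multiple streams.
    public class TapToPlaceStream : MonoBehaviour
    {
        public GameObject videoQuadPrefab; // quad with the chroma-keyed video material

        void Update()
        {
            // Wait for the first frame of a touch.
            if (Input.touchCount < 1) return;
            Touch touch = Input.GetTouch(0);
            if (touch.phase != TouchPhase.Began) return;

            // Raycast the touch against planes ARCore has detected.
            TrackableHit hit;
            TrackableHitFlags filter = TrackableHitFlags.PlaneWithinPolygon;
            if (Frame.Raycast(touch.position.x, touch.position.y, filter, out hit))
            {
                // Spawn the video quad at the hit pose and anchor it to the
                // plane so it stays put as tracking improves.
                GameObject quad = Instantiate(videoQuadPrefab, hit.Pose.position, hit.Pose.rotation);
                Anchor anchor = hit.Trackable.CreateAnchor(hit.Pose);
                quad.transform.parent = anchor.transform;
            }
        }
    }

Parenting the quad to an anchor, rather than leaving it free in world space, keeps it fixed to the plane as ARCore refines its understanding of the scene.
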
Challenges we ran into

One of our biggest challenges was editing the live video to remove the green screen. We had hoped to have time to add support for recognizing and removing a variety of backgrounds, but because of the green-screen difficulties we were unable to implement that feature during the event.

Accomplishments that we're proud of

All of us are new to Unity, and two-thirds of us are new to C#, so we had a lot to learn in 24 hours. We are especially proud of what we achieved in Unity, such as rendering live video on a 3D object and implementing plane tracking.

What we learned

We learned that the areas we assume will not be difficult (e.g., green screens) are sometimes the hardest pieces of the puzzle. We also gained invaluable experience with Unity and ARCore.

What's next for ARmeeto

We plan to upgrade the live stream so that a green screen is not needed, and then implement a 3D rendering of the sender rather than a 2D video stream. We would also like to add support for other platforms and two-way streaming, and to improve the overall user experience. We look forward to developing this application further!

Built With

Unity, C#, ARCore