Inspiration

Nowadays everyone likes to share everything, and one of the best platforms for sharing is Facebook. But on Facebook, a user cannot really share his or her gaming experience with everyone. After noticing this, I decided to build something for gamers that lets them share their experience with others and lets others enjoy the game too.

So I decided to use my background and knowledge in image and video processing to fix that issue by building an easy-to-use framework that enables developers to capture and record ARKit videos, photos, Live Photos, and GIFs; I called it ARVideoKit.

The amazing feedback from developers motivated me to build something even better using Facebook's technologies, and I'm calling it GameAR Stream.

What it does

GameAR Stream is an iOS framework built specifically to enable mobile developers to implement real-time AR content rendering and to let users share their augmented reality gaming experience on Facebook Live.

How I built it

Using the Facebook LoginKit, CoreKit, and ShareKit (AccountKit) SDKs together with the Real-Time Messaging Protocol (RTMP), I was able to connect to the Facebook Live API and push audio and rendered game scenes in real time.
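
To make that first step concrete, here is a minimal, hedged sketch of requesting an RTMP ingest URL from the Facebook Live API; the Graph API version, the live_videos endpoint, and the stream_url field are assumptions based on Facebook's public Live Video API, not GameAR Stream's actual code.

```swift
import Foundation

/// Asks the Facebook Live API for an RTMP ingest URL for the logged-in user.
/// The returned stream URL is what the RTMP session later publishes to.
func createLiveVideo(accessToken: String,
                     completion: @escaping (String?) -> Void) {
    // Assumed endpoint: POST /me/live_videos on the Graph API (version is an assumption).
    var components = URLComponents(string: "https://graph.facebook.com/v2.12/me/live_videos")!
    components.queryItems = [
        URLQueryItem(name: "status", value: "LIVE_NOW"),
        URLQueryItem(name: "title", value: "My AR game session"),
        URLQueryItem(name: "access_token", value: accessToken)
    ]

    var request = URLRequest(url: components.url!)
    request.httpMethod = "POST"

    URLSession.shared.dataTask(with: request) { data, _, _ in
        guard
            let data = data,
            let object = try? JSONSerialization.jsonObject(with: data),
            let json = object as? [String: Any],
            let streamURL = json["stream_url"] as? String  // e.g. an rtmp:// ingest URL
        else {
            completion(nil)
            return
        }
        completion(streamURL)  // handed to the RTMP session described below
    }.resume()
}
```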

Then I used the ARKit, SparkAR, AVFoundation, Metal, and CoreMedia frameworks to render the phone's camera stream together with the augmented reality components and game scenes in real time.
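
As a rough illustration of the rendering side (my own sketch, not the framework's code), the snippet below uses SceneKit's Metal-backed SCNRenderer to draw an ARSCNView's scene, camera background plus AR and game nodes, into an offscreen texture each frame; the class and property names are placeholders, and in practice the texture would be copied into a CVPixelBuffer for the encoder.

```swift
import ARKit
import SceneKit
import Metal

/// Offscreen renderer sketch: draws the ARSCNView's composited scene
/// (camera feed background + AR/game content) into a Metal texture.
final class SceneFrameRenderer {
    private let device = MTLCreateSystemDefaultDevice()!
    private lazy var commandQueue = device.makeCommandQueue()!
    private lazy var renderer = SCNRenderer(device: device, options: nil)
    private let texture: MTLTexture

    init(arView: ARSCNView, width: Int, height: Int) {
        let descriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm,
                                                                  width: width,
                                                                  height: height,
                                                                  mipmapped: false)
        descriptor.usage = [.renderTarget, .shaderRead]
        texture = device.makeTexture(descriptor: descriptor)!

        // Mirror the on-screen AR view: same scene graph and same camera node.
        renderer.scene = arView.scene
        renderer.pointOfView = arView.pointOfView
    }

    /// Renders the current scene state into the offscreen texture.
    func renderFrame(atTime time: TimeInterval) -> MTLTexture {
        let pass = MTLRenderPassDescriptor()
        pass.colorAttachments[0].texture = texture
        pass.colorAttachments[0].loadAction = .clear
        pass.colorAttachments[0].storeAction = .store

        let commandBuffer = commandQueue.makeCommandBuffer()!
        renderer.render(atTime: time,
                        viewport: CGRect(x: 0, y: 0,
                                         width: CGFloat(texture.width),
                                         height: CGFloat(texture.height)),
                        commandBuffer: commandBuffer,
                        passDescriptor: pass)
        commandBuffer.commit()
        commandBuffer.waitUntilCompleted()
        return texture  // wrapped in a CVPixelBuffer before being pushed to the stream
    }
}
```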

GameAR Stream is a Cocoa Touch framework written in Swift, which makes it very simple to integrate into any iOS application!

Challenges I ran into

I ran into many challenges while building this framework. Here are the two main ones and how I solved them:

Rendering 2D game & augmented reality scenes – solved by developing functionality that uses the Metal framework and a 3D renderer to render the 2D scenes as diffuse material contents.
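
A minimal sketch of how a 2D scene can be fed into the Metal-backed 3D renderer this way, assuming a SpriteKit game scene used as the diffuse contents of a SceneKit material (names and sizes are placeholders):

```swift
import SceneKit
import SpriteKit

/// Brings a 2D SpriteKit game scene into the 3D AR world by using it as
/// the diffuse contents of a material, so the 3D renderer draws the live
/// 2D content as a texture on ordinary geometry.
func makeGamePlane(from gameScene: SKScene) -> SCNNode {
    let material = SCNMaterial()
    material.diffuse.contents = gameScene   // the 2D scene renders as a live texture
    material.isDoubleSided = true

    let plane = SCNPlane(width: 0.4, height: 0.3)   // size in meters, arbitrary here
    plane.materials = [material]

    return SCNNode(geometry: plane)
}
```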

Creating an RTMP session and pushing rendered buffers to it – solved by developing functionality that uses AVFoundation, Foundation, and the rtmpdump C library (librtmp) to open an RTMP session and push the rendered buffers to it!
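
And a sketch of the RTMP side, assuming librtmp from rtmpdump is exposed to Swift through a bridging header or module map; the imported signatures depend on that bridge and on the library version, and the encoded audio/video buffers are assumed to already be packaged as FLV tags, which is the format RTMP_Write expects.

```swift
import Foundation
// Assumes librtmp (RTMP, RTMP_Alloc, RTMP_Write, ...) is bridged into Swift.

final class RTMPPublisher {
    private var rtmp: UnsafeMutablePointer<RTMP>?

    /// Opens a publishing connection to an rtmp:// ingest URL.
    /// (Error-path cleanup is omitted for brevity.)
    func connect(to url: String) -> Bool {
        guard let session = RTMP_Alloc() else { return false }
        RTMP_Init(session)

        // librtmp takes a mutable C string, so duplicate the Swift string.
        let cURL = strdup(url)
        defer { free(cURL) }
        guard RTMP_SetupURL(session, cURL) != 0 else { return false }

        RTMP_EnableWrite(session)                     // publish (push) rather than play
        guard RTMP_Connect(session, nil) != 0,
              RTMP_ConnectStream(session, 0) != 0 else { return false }

        rtmp = session
        return true
    }

    /// Pushes one FLV-tagged chunk of encoded audio or video to the session.
    func push(flvTag: Data) {
        guard let session = rtmp else { return }
        flvTag.withUnsafeBytes { raw in
            guard let base = raw.baseAddress else { return }
            _ = RTMP_Write(session,
                           base.assumingMemoryBound(to: CChar.self),
                           Int32(flvTag.count))
        }
    }

    func close() {
        guard let session = rtmp else { return }
        RTMP_Close(session)
        RTMP_Free(session)
        rtmp = nil
    }
}
```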

Accomplishments that I'm proud of

I'm mainly proud of figuring out how to create RTMP sessions and push rendered audio and video buffers in real time from a mobile device!

What I learned

While building this framework, I learned many new things about real-time messaging protocols and networking.

What's next for Facebook Gaming with GameAR Stream

I am looking to expand this kind of live AR sharing beyond gaming, for example to live-sharing augmented reality experiences of museums, monuments, and movies.

Built With

ARKit, SparkAR, AVFoundation, CoreMedia, Metal, Swift, RTMP (rtmpdump/librtmp), and the Facebook LoginKit, CoreKit, and ShareKit (AccountKit) SDKs
