Earlier this year, I started learning more about image and video processing, which later led me to start developing mobile software that uses computer vision techniques (image and video processing are subfields of CV) to place 2D and 3D graphics in real-world space through the camera, also known as augmented reality (AR).
Then in June of this year, Apple announced ARKit, an easy-to-use framework that enables iOS developers to place 3D and 2D objects in real-world space through the phone's camera. I decided to start using it, since the framework allowed me to be more productive as a developer.
But there was one big issue developers and I faced while using ARKit: there was no way to record videos with augmented reality components (i.e., to retrieve fully rendered frames/buffers). Developers resorted to alternatives such as screen recording and screenshots in order to capture media content with augmented reality components.
So I decided to use my background and knowledge in image and video processing to fix that issue by building an easy-to-use framework that enables developers to capture and record ARKit videos, photos, Live Photos, and GIFs; I called it ARVideoKit.
ARVideoKit has received a lot of attention, and hundreds of developers in the iOS community have started using it since I published it last week (November 5th, 2017).
The amazing feedback from developers motivated me to build something even better using Facebook technologies, and I'm calling it GameAR Stream.
What it does
GameAR Stream is an iOS framework built specifically to enable mobile developers to implement real-time AR content rendering and allow users to share their augmented reality gaming experience on Facebook Live.
Why it matters
Currently, Facebook is the best platform for daily live video streaming; however, as a mobile gamer and developer, I believe Facebook Live has even bigger potential as a platform that enables mobile gamers to share their gaming experience live!
How it helps developers
Augmented reality has been trending recently; yet we're still in the early stages of developing fully interactive augmented reality experiences. Furthermore, users LOVE sharing their funny experiences with augmented reality features (think face filters and trends like the AR hot dog).
The GameAR Stream framework will enable developers to easily implement RTMP-based Facebook Live streaming and render 2D and 3D scenes in real time.
This allows developers to take advantage of users' interest in sharing their augmented reality experiences, and to scale their games to reach more users!
The GameAR Stream framework does the following to take the complex process away from developers:
- Renders 2D & 3D augmented reality scenes in real-time.
- Renders 2D & 3D game scenes in real-time.
- Starts an audio session.
- Prepares an RTMP session.
- Connects to the developer's Facebook App ID.
- Connects to the user's Facebook account using Facebook's LoginKit.
- Begins a Live video stream to Facebook using Facebook's CoreKit and ShareKit (AccountKit).
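Since the framework isn't public yet, here's a rough sketch of how the steps above might look from a developer's perspective. All GameAR Stream type and method names below are hypothetical placeholders, not the final API:

```swift
import UIKit
import ARKit
import GameARStream // hypothetical import – the framework is not released yet

class GameViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!
    // Hypothetical streamer type that grabs rendered frames from the scene view.
    var streamer: GameARStreamer?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attach the streamer to the AR scene view so it can render
        // the camera feed + AR content and capture the audio session.
        streamer = GameARStreamer(sceneView: sceneView)
    }

    @IBAction func goLive(_ sender: UIButton) {
        // Hypothetical call: logs the user in with Facebook, prepares the
        // RTMP session, and starts pushing rendered audio & video buffers.
        streamer?.startLiveStream(facebookAppID: "YOUR_APP_ID") { error in
            if let error = error {
                print("Failed to go live: \(error)")
            }
        }
    }
}
```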
How I built it
Using Facebook's LoginKit, CoreKit, ShareKit (AccountKit), and the Real-Time Messaging Protocol (RTMP), I was able to connect to the Facebook Live API and push audio and rendered game scenes in real time.
Then I used the ARKit, AVFoundation, Metal, and CoreMedia frameworks to render the phone's camera stream together with the augmented reality components and game scenes in real time.
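At a high level, capturing fully rendered frames means rendering the scene off-screen with Metal and handing the result to the encoder that feeds the RTMP session. A simplified sketch of that idea using SceneKit's off-screen `SCNRenderer` (this is my own illustration of the approach, not GameAR Stream's actual internals; `sceneView` is assumed to be an existing `ARSCNView`):

```swift
import Metal
import SceneKit
import ARKit

// Off-screen renderer backed by the device's Metal GPU.
let device = MTLCreateSystemDefaultDevice()!
let commandQueue = device.makeCommandQueue()!
let renderer = SCNRenderer(device: device, options: nil)
renderer.scene = sceneView.scene // reuse the ARSCNView's scene

func renderFrame(at time: TimeInterval, into texture: MTLTexture) {
    // Describe the off-screen render target.
    let passDescriptor = MTLRenderPassDescriptor()
    passDescriptor.colorAttachments[0].texture = texture
    passDescriptor.colorAttachments[0].loadAction = .clear
    passDescriptor.colorAttachments[0].storeAction = .store

    let commandBuffer = commandQueue.makeCommandBuffer()!
    // Render the AR scene (camera feed + virtual content) into the texture.
    renderer.render(atTime: time,
                    viewport: CGRect(x: 0, y: 0, width: 1280, height: 720),
                    commandBuffer: commandBuffer,
                    passDescriptor: passDescriptor)
    commandBuffer.commit()
    // The texture's contents can then be copied into a CVPixelBuffer
    // and passed to the encoder that feeds the RTMP session.
}
```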
GameAR Stream is a Cocoa Touch framework written in Swift 4.0, which makes it very simple to integrate into any iOS application!
Challenges I ran into
I ran into many challenges while building this framework; here are the two main ones and how I solved them:
Rendering 2D game & augmented reality scenes – solved by developing functionality that uses the Metal framework and a 3D renderer to diffuse the 2D scenes' materials.
Creating an RTMP session and pushing rendered buffers to it – solved by developing functionality that uses AVFoundation, Foundation, and the rtmpdump C library to start an RTMP session and push the rendered buffers to it!
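For the RTMP side, the rtmpdump project's librtmp exposes a small C API that can be called from Swift through a bridging header. A minimal sketch of opening a publish session (the URL is a placeholder, error handling is omitted, and packaging encoded buffers into FLV tags is a separate step):

```swift
import Foundation
// Assumes librtmp (from rtmpdump) is linked and its header is
// exposed to Swift via a bridging header.

let rtmp = RTMP_Alloc()
RTMP_Init(rtmp)

// RTMP_SetupURL expects a mutable C string, so copy it first.
let url = strdup("rtmp://example.com/live/STREAM_KEY")
defer { free(url) }
_ = RTMP_SetupURL(rtmp, url)

RTMP_EnableWrite(rtmp)          // publish session, not playback
_ = RTMP_Connect(rtmp, nil)     // TCP connection + RTMP handshake
_ = RTMP_ConnectStream(rtmp, 0) // open the stream

// Encoded audio/video buffers (packaged as FLV tags) are then sent
// with RTMP_Write; the session ends with RTMP_Close and RTMP_Free.
```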
Accomplishments that I'm proud of
I'm mainly proud of figuring out how to create RTMP sessions and push rendered audio and video buffers in real time from a mobile device!
What I learned
While building this framework, I learned many new things about real-time messaging protocols and networking.
What's next for GameAR Stream
GameAR Stream is currently under development, and I'll be releasing the first public version of the framework in January 2018.
Furthermore, when I publish GameAR Stream, I will provide more features that enable developers to easily customize it and do additional image/video and audio processing (such as video and voice filters)!