Inspiration

So I guess there are a few different inspirations for the different moving parts:

The first inspiration was Ninja (controversial, I know). He used to have a little icon on his channel representing Victory Royales, which would be updated manually by his moderators, or by himself, whenever he got a win. I thought it was a cool idea: any new viewer could instantly see how many Victory Royales he had as soon as they started watching, giving them some context for the stream they'd missed so far.

The second inspiration was the fact that I'm a data scientist and love machine learning, especially convolutional neural networks (CNNs)! I've been meaning to do some neural network work in my own time for a long while, and this seemed like a perfect opportunity to apply some convolutional neural networks to Fortnite.

And my third inspiration is simply that I love Fortnite and watching Fortnite streams, so I was really excited about the opportunity to build something for it!

What it does

It's as simple as it sounds. The extension is a video overlay containing a little trophy icon on the right-hand side of the video. Every time the streamer gets a victory royale, the icon updates to show the number of victory royales achieved during the current stream.

I feel its main feature is that it works without having to install any third-party software. All you need to do is install the extension.

I was aware of a lot of the software that streamers could install onto their PC so that an extension could have access to game data. However, I believed that even this small step of installing some software would be enough to put some streamers off using the extension.

I'm pleased to say that the victory royale tracker works without any configuration or third-party software. A streamer simply has to have it installed and start their stream, and the tracker will work in the background. This is the part of the extension I'm most proud of.

Edit: My original system meant that a broadcaster had to start the tracker by visiting their own channel while live and logged in. Lots of console players using the extension didn't want to do this, so I have now implemented a system in which the tracker starts automatically within 12 minutes of a stream starting. You can still start it "manually" by visiting your own channel while you're live.
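As a rough sketch (not the exact code), the automatic start could be a simple polling loop against the Twitch Helix Get Streams endpoint - the start_tracker helper and the credentials here are placeholders:

    import time
    import requests

    HELIX_STREAMS = "https://api.twitch.tv/helix/streams"
    POLL_INTERVAL = 5 * 60  # polling every few minutes keeps us well inside "12 minutes"

    def poll_for_live_broadcasters(broadcaster_ids, client_id, app_token, start_tracker):
        """Start a tracker (E-EBS) for any installed broadcaster who has gone live."""
        tracked = set()
        while True:
            for channel_id in broadcaster_ids:
                resp = requests.get(
                    HELIX_STREAMS,
                    params={"user_id": channel_id},
                    headers={"Client-ID": client_id, "Authorization": f"Bearer {app_token}"},
                )
                live = bool(resp.json().get("data"))  # an empty list means the channel is offline
                if live and channel_id not in tracked:
                    start_tracker(channel_id)   # placeholder: launches an E-EBS for this channel
                    tracked.add(channel_id)
                elif not live:
                    tracked.discard(channel_id)
            time.sleep(POLL_INTERVAL)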

How I built it

There are three main components:

  • The front end - this includes video_overlay.html and some JavaScript to call the EBS
  • EBS - this is a simple Python Flask server
  • Extended EBS (E-EBS) - this is a Python program which runs PyTorch

I don't know if E-EBS is a very good name but it's the best I've got!!

I'll write a small bit about all of them.

Front-end

The front-end is as simple as it gets. The video_overlay.html contains only one element, which shows the trophy icon and the number of victory royales. This is linked to a JavaScript file called viewer.js. In it, we use the twitch.onAuthorized callback to do three things:

  1. Get the current number of victory royales from the configuration service if available

    JSON.parse(twitch.configuration.broadcaster.content).victory_royales

  2. Set up a PubSub listen

    twitch.listen('broadcast', (target, type, msg) => {...});

  3. Make a call to the EBS but only when the viewer is the broadcaster

    if (twitch.viewer.role === 'broadcaster') {
        $.ajax(requests.auth);
    }
    

The main goal when designing the front end was to minimise the traffic on the EBS because it's set up on a tiny AWS EC2 instance. Thankfully, Twitch provides both the configuration service and extension PubSub which allowed me to make very few calls to the EBS.

EBS

The EBS is a Flask server, written in Python 3, running on an AWS t2.micro EC2 instance, alongside an RDS PostgreSQL database. The EBS in this case has only one job, which relates to when a broadcaster views their own stream.

When a broadcaster views their stream, a call is made to the EBS. The EBS checks two things:

  1. That the broadcaster's stream is live
  2. Whether the current stream is already being tracked by an E-EBS

If the EBS queries the database and doesn't find the broadcaster's current stream, it starts a dockerised container (an E-EBS) which begins to "watch" and track the broadcaster's stream for victory royales.
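As a rough sketch (not the exact code), the endpoint could look something like this - the route name and the helpers stream_is_live, stream_is_tracked, and launch_e_ebs are placeholder names for the checks and the Batch job submission described above:

    import base64
    import jwt  # PyJWT
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    # The shared extension secret from the Twitch dev console is base64-encoded
    EXTENSION_SECRET = base64.b64decode("<extension secret>")

    @app.route("/auth", methods=["POST"])
    def broadcaster_auth():
        # The front end only calls this when the viewer is the broadcaster;
        # verify the extension JWT to make sure of it.
        token = request.headers["Authorization"].split(" ")[1]
        payload = jwt.decode(token, EXTENSION_SECRET, algorithms=["HS256"])
        if payload["role"] != "broadcaster":
            return jsonify(error="not the broadcaster"), 403

        channel_id = payload["channel_id"]
        if not stream_is_live(channel_id):          # 1. is the stream live?
            return jsonify(status="offline")
        if not stream_is_tracked(channel_id):       # 2. is an E-EBS already watching it?
            launch_e_ebs(channel_id)                #    if not, start one
        return jsonify(status="tracking")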

Extended EBS (E-EBS)

The key part to this is that one E-EBS can analyse one stream. Therefore, if there are 10 streamers live with the Victory Royale tracker installed, we have to start 10 E-EBSs to independently watch each stream.

The E-EBS is a fully dockerised container; it's the part of the system which takes frames from a broadcaster's stream and analyses them in real time using computer vision techniques. The process looks like this:

Get a frame from the stream -> Classify frame as victory royale or not using a PyTorch CNN -> If there's a victory royale, send an event through PubSub and also update the broadcaster configuration

We repeat this process until the broadcast has finished. When the broadcaster finishes streaming we update the broadcaster configuration accordingly.
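As a rough sketch (not the exact code), the loop looks roughly like this - grab_frame, stream_is_live, send_pubsub, and update_broadcaster_config are placeholder names for the pieces described above:

    import time
    import torch

    def track_stream(channel_id, model, transform):
        """Watch one broadcaster's stream and count victory royales."""
        model.eval()
        wins = 0
        while stream_is_live(channel_id):                   # placeholder: is the broadcast still going?
            frame = grab_frame(channel_id)                   # placeholder: pull the latest frame from the stream
            batch = transform(frame).unsqueeze(0)            # preprocess to a 1xCxHxW tensor
            with torch.no_grad():
                is_victory = model(batch).argmax(1).item() == 1
            if is_victory:
                wins += 1
                send_pubsub(channel_id, wins)                # placeholder: Extensions PubSub broadcast
                update_broadcaster_config(channel_id, wins)  # placeholder: configuration service update
                time.sleep(30)                               # avoid counting the same victory screen twice
        update_broadcaster_config(channel_id, wins)          # final update when the stream ends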

To train the PyTorch model, I collected ~3,000 images from Fortnite streams and manually classified them. There were approximately 1,500 images showing victory royales and 1,500 showing normal, non-victory-royale gameplay.

The model I eventually picked was MobileNetV2. The main reason for this, as I'll discuss in the challenges section, was its speed: it can classify a 720p image in ~3 seconds (and 1080p in ~5 seconds) on the architecture I'm using (which is CPU-based to keep costs low).
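For anyone curious, a minimal sketch of fine-tuning a two-class MobileNetV2 in PyTorch/torchvision looks roughly like this (the folder layout and hyperparameters are illustrative, not my exact values):

    import torch
    import torch.nn as nn
    from torchvision import datasets, models, transforms

    # MobileNetV2 expects 224x224, ImageNet-normalised input
    transform = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])

    # Illustrative folder layout: frames/victory_royale and frames/gameplay
    dataset = datasets.ImageFolder("frames", transform=transform)
    loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

    model = models.mobilenet_v2(pretrained=True)
    model.classifier[1] = nn.Linear(model.last_channel, 2)  # two classes: win / no win

    criterion = nn.CrossEntropyLoss()
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)

    for epoch in range(5):
        for images, labels in loader:
            optimiser.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimiser.step()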

The architecture for this involves two AWS services:

  • ECR - this is where I host my Docker image for the E-EBS
  • Batch - when a process is started, Batch receives a command from the EBS to start the job (see the sketch below) - each job is specific to a broadcaster. It runs on an EC2 instance, in a Docker container based on the image hosted in ECR, until that broadcaster's stream is stopped.
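As a rough sketch (not the exact code), submitting one of these jobs from the EBS with boto3 looks like this - the queue and job definition names are placeholders:

    import boto3

    batch = boto3.client("batch")

    def launch_e_ebs(channel_id):
        """Submit one AWS Batch job (one E-EBS) to watch one broadcaster's stream."""
        return batch.submit_job(
            jobName=f"victory-royale-tracker-{channel_id}",
            jobQueue="e-ebs-queue",                # placeholder queue name
            jobDefinition="e-ebs-job-definition",  # points at the image hosted in ECR
            containerOverrides={
                "environment": [
                    # tell the container which broadcaster to watch
                    {"name": "CHANNEL_ID", "value": str(channel_id)},
                ],
            },
        )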

Challenges I ran into

I ran into a lot of small challenges. I'm only going to outline a couple for each part of the extension.

E-EBS

This was the easiest part because I have experience in machine learning with Python. The initial setup of a Python program which takes frames from a video/stream and analyses them with a neural network was quite easy. The main challenge came during some of my final testing.

I trained my initial model in TensorFlow. It was quite a big model, as it was designed to do a few things, not just track victory royales (I speak about my original plan in the What's Next section). As I was testing, I realised the tracker was missing some victory royales. When I looked in the logs of the E-EBS jobs, I could see that, mainly due to my tight AWS budget and hence small instances, it was taking ~18 seconds to analyse a frame. That meant the streamer would need to stay on the victory royale screen for at least 18 seconds to guarantee it would be tracked. This obviously wasn't feasible, so I moved to a different model, MobileNetV2 by Google, which is designed to give very good accuracy with little compute power. Given that I had to retrain anyway, I also decided to switch to PyTorch as I prefer it. I retrained my model and, on the same compute power, reduced the time to analyse a frame from ~18 seconds to about 3 seconds (for 720p).

EBS

This part was still relatively easy. I've written basic Flask apps before, so the initial setup was easy. One thing I'd never done, though, is put a Flask app into production. This is where I ran into most of the challenges:

  • Setting up a WSGI server (I used gunicorn)
  • Using a reverse proxy (nginx)
  • Making my server HTTPS compliant (This took sooooo long)
  • CORS (Cross-Origin Resource Sharing). To be honest, I still don't really know what this is, but it's no longer a problem (see the sketch below).

Thankfully, there's a lot of content on the internet which I was able to wade through and successfully get my EBS up!
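For the CORS bullet, the common Flask fix is the flask-cors package (I won't claim this is exactly what I ended up with, but it shows the idea) - it adds the Access-Control-Allow-Origin headers the browser needs when the front end, served from Twitch's CDN, calls the EBS:

    from flask import Flask
    from flask_cors import CORS

    app = Flask(__name__)
    # Allow cross-origin requests so the extension front end can call the EBS
    CORS(app)

    @app.route("/ping")
    def ping():
        return "pong"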

Front-end

With the front end, we are now in the land where I know absolutely nothing. I have only been a data scientist for two years, and in that time I have touched approximately zero JavaScript. Where shall I start...

  • JWTs - I had no idea what they were, how to decode them, what base64 was, etc. Thankfully the official website is good. But even then, I'd say my knowledge of them only became properly complete towards the end of the project (see the sketch after this list).
  • twitch.onAuthorized - I still don't really know what it does, but I now know, thanks to the Discord, that you should do basically everything inside it! (a GET request doesn't seem to work outside of it)
  • Packaging stuff to upload - I'd been making good use of the sample repos on the twitch-dev GitHub, so when I tried to use jQuery by pulling it from the jQuery CDN it didn't work (at least that's what I assume I was trying to do). I also had to package Font Awesome with my files. This sounds like a simple issue - it took me a long time to figure it all out!
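On the JWT point, a minimal PyJWT sketch of decoding a token without verification (just to peek at the payload) and with verification against the base64-decoded extension secret - the token and secret strings are placeholders:

    import base64
    import jwt  # PyJWT

    token = "<jwt from the Authorization header>"
    secret = base64.b64decode("<extension secret from the Twitch dev console>")

    # Peek at the payload without verifying the signature
    unverified = jwt.decode(token, options={"verify_signature": False}, algorithms=["HS256"])

    # Verify the signature properly (Twitch extension JWTs are HS256-signed)
    verified = jwt.decode(token, secret, algorithms=["HS256"])
    print(verified["channel_id"], verified["role"])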

Thankfully all the Twitch documentation is good, so I was able to pick up most things reasonably quickly. I'd also like to say that the Discord has been extremely helpful whenever I ran into problems. Everyone was friendly too - I never seemed to burden anyone with my stupid questions!

Accomplishments that I'm proud of & What I learned

I've put these two sections together because I think the thing I'm most proud of is how much I've learned. I've managed to build a product, from the front end all the way to a fairly complex backend which starts and stops servers on demand. I've dockerised everything for easy deployment. I've set up my own Docker repository on AWS. I've built a simple front end which interacts with a backend, making significant use of Twitch's configuration service and broadcaster PubSub. I've set up an HTTPS server. I can now encode and decode JWTs with and without verification. Most of this stuff I had never heard of before I started this project, and there were times when I really had to persevere because it seemed like I would never get things working. I also have a deep neural network which can classify Fortnite images as victory royales or not.

I'm also proud of the fact that I now have a product which can be used by the public! I've never really released a product to the public before :) (I've put some scrapers on GitHub but it's not the same)

What's next for Victory Royale Tracker

As time went on, I realised my complete lack of knowledge in JavaScript and web servers meant I'd need to walk before I could run. I therefore limited the scope to just tracking victory royales, which is the most basic version of the overarching goal: build something that analyses a stream in real time from its raw frames.

I have a prototype which can not only detect victory royales but also pick up how many kills the streamer has, how many people are left in the lobby, etc. I also made a model which can find the map within the image - my plan was to track which places a streamer visited while playing, using some more advanced computer vision techniques.

My absolute stretch goal, although I feel I would need more time and more compute power than I have, would be a sort of augmented reality system for the stream. Imagine a viewer hovering over the stream and friendlies/enemies being highlighted in real time with some information about their skin and how much it costs in the store, or objects being highlighted with an estimate of the materials gained if they were farmed. I believe this would definitely be possible, but you would need GPUs to process the frames fast enough to make it "real-time". That obviously starts to become expensive, but as far as viewer experience goes, I think it would be amazing!
