Evidence can help strengthen and clarify a case, but obtaining video proof is rarely straightforward. This is especially true for violent incidents, which can unfold in a matter of moments and leave long-lasting consequences. We believe people should have a way of recording incidents of their own, promoting accountability for unlawful acts without the threat of being identified.

What it does

Opening the AnonymEyes app lets users instantly stream low-resolution video anonymously from their mobile phone to the AnonymEyes website. There is no local copy of the video, only a remote one. The videos are also location-tagged and placed on a map. Rather than dialing EMS, this data can rapidly and discreetly alert emergency responders while recording valuable information for them at the same time.

How we built it

The two main interfaces to AnonymEyes are an Android app that allows the user to instantaneously record and upload a video stream, and a web application for the public to view uploaded videos.

The default Android video recorder only returns a video file after the 'Stop Record' button is pressed. To achieve a near-continuous live video stream, we needed to hack a way around the default API. Frames of the camera preview were captured at a fixed rate and sent via UDP packets to our Java processing server. On the Java server, the frames were compiled into an H.264 MP4 file for compatibility across all major web browsers. The video file was then uploaded to a file system shared between the Java server and a Ruby on Rails web server, both running on the same VM on Google Cloud Platform.
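The phone-side sender can be sketched as a small class that takes each preview frame as a byte array and forwards it over UDP. This is a minimal sketch, assuming frames arrive as byte arrays from the camera preview callback; the class and method names are hypothetical:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Sketch of the phone-side sender: camera preview frames arrive as byte
// arrays and are forwarded to the Java processing server over UDP.
public class FrameSender {
    private final DatagramSocket socket;
    private final InetAddress server;
    private final int port;

    public FrameSender(String host, int port) throws Exception {
        this.socket = new DatagramSocket();
        this.server = InetAddress.getByName(host);
        this.port = port;
    }

    // Called at a fixed rate with the latest camera preview frame.
    public void sendFrame(byte[] frame) throws Exception {
        DatagramPacket packet =
            new DatagramPacket(frame, frame.length, server, port);
        socket.send(packet);
    }

    public void close() {
        socket.close();
    }
}
```

UDP fits here because a dropped preview frame is acceptable, whereas TCP retransmission delays would stall the live stream.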

Two requests are made from the Java server to the Rails server to indicate a new stream. An initial request reports the anonymous user's location and plots a Google Maps marker. A final request is made once the video file is encoded, notifying the client to load the HTML5 video.
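The two-request handshake can be sketched as plain HTTP POSTs from the Java server. This is a hedged sketch only: the endpoint paths and parameter names below are hypothetical, not the actual Rails routes:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Sketch of the Java-server side of the two-request handshake with Rails.
public class RailsNotifier {
    private final String railsBase;

    public RailsNotifier(String railsBase) {
        this.railsBase = railsBase;
    }

    // Request 1: announce a new stream and its location so the
    // map marker can be plotted immediately.
    public int announceStream(String streamId, double lat, double lng)
            throws Exception {
        String body = "id=" + streamId + "&lat=" + lat + "&lng=" + lng;
        return post("/streams", body);
    }

    // Request 2: tell Rails the encoded MP4 is ready so the client
    // can load the HTML5 video.
    public int videoReady(String streamId) throws Exception {
        return post("/streams/" + streamId + "/ready", "");
    }

    private int post(String path, String body) throws Exception {
        HttpURLConnection conn =
            (HttpURLConnection) new URL(railsBase + path).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }
        int code = conn.getResponseCode();
        conn.disconnect();
        return code;
    }
}
```

Splitting the handshake in two means the map marker appears as soon as the stream starts, without waiting for encoding to finish.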

The front end, built with Angular.js, is dynamically updated by listening to a Firebase database.

Challenges we ran into

Hacking together a live video stream was a challenge in and of itself. Given the nature of our problem, it was imperative that video reach the remote server as soon as possible. We initially tried transmitting whole frames of the video via UDP to our Java server, but realized that most network routers do not deal well with packets of 1000+ bytes, so we had to create an algorithm to manually splice each frame into smaller vertical strips of data.
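The splicing idea can be sketched as cutting a frame's byte buffer into chunks that each fit comfortably in one UDP packet. This sketch simplifies the real approach: it splits contiguous byte ranges rather than true vertical image strips, and the 900-byte payload limit is an assumption, not our measured threshold:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch: split one frame into chunks small enough that no UDP packet
// exceeds a conservative payload size. (Simplified to contiguous byte
// ranges; the real version cut vertical strips of the image.)
public class FrameSplicer {
    public static List<byte[]> splice(byte[] frame, int maxPayload) {
        List<byte[]> strips = new ArrayList<>();
        for (int offset = 0; offset < frame.length; offset += maxPayload) {
            int end = Math.min(offset + maxPayload, frame.length);
            strips.add(Arrays.copyOfRange(frame, offset, end));
        }
        return strips;
    }
}
```

In practice each strip would also carry a header (frame number, strip index) so the server can reassemble frames even if packets arrive out of order.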

The second major challenge was getting a Rails websockets configuration working on Google Cloud Platform (GCP). Since Rails does not yet natively support websockets, the gem we used creates a standalone Thin server to route websocket activity. After much frustration, we concluded that GCP was routing websocket traffic to the HTTP server (we were getting an "unexpected response code: 200 OK" error, which is quite counterintuitive to begin with). Ultimately, we leveraged Firebase to simulate websockets/server-sent events: we send regular HTTP requests via the Firebase API and listen for changes in the Firebase database on the client side.
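Conceptually, the Firebase listener replaces the websocket with a change-notification channel. A hedged illustration of that pattern, using plain HTTP polling in place of the real Firebase client SDK (which pushes changes rather than polling):

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

// Sketch: approximate a realtime-database listener by polling an HTTP
// endpoint and firing a callback whenever the payload changes. The real
// app used the Firebase client SDK, which pushes changes instead.
public class ChangeListener {
    public static void poll(String url, int times, long intervalMs,
                            Consumer<String> onChange) throws Exception {
        String last = null;
        for (int i = 0; i < times; i++) {
            String current = fetch(url);
            if (!current.equals(last)) {
                onChange.accept(current);
                last = current;
            }
            Thread.sleep(intervalMs);
        }
    }

    private static String fetch(String url) throws Exception {
        HttpURLConnection conn =
            (HttpURLConnection) new URL(url).openConnection();
        try (InputStream in = conn.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }
}
```

The advantage of routing through Firebase is that all traffic is ordinary HTTP, which GCP's load balancing handles without the websocket-upgrade misrouting we hit.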

Accomplishments that we're proud of

We built an entire live video streaming protocol (everything but the actual nitty-gritty encoding). We were also able to interface a Java server, a Rails server, and mobile events without much difficulty in a short span of 36 hours.

What's next for AnonymEyes

We would like to make this service available for free to the general public.
