As AR develops and becomes more discreet, we are going to see it more and more in everyday life. We were excited by the social side of AR, and by how integrating social networks into real-life interactions brings the "social" side of social networking back. We wanted a way to identify people through facial tracking and recognition and bring up a known set of social characteristics to augment your interactions.
What it does
Once you've talked to a person for a certain amount of time, SocialEyes recognizes and stores their face on the Android app. It groups those faces by person, and you then have the ability to connect each face to a Facebook account. From then on, whenever you encounter that person, SocialEyes tracks and recognizes their face and brings up their information in a HUD environment. SocialEyes can even tell you the person's heart rate as you talk to them.
How I built it
We tuned Haar cascades trained for finding faces in OpenCV to locate the positions of heads in space from the view of our Meta Augmented Reality (AR) headset's camera. If we've never seen a face before, we send it to our back end, which can be accessed and edited using our companion Android app. From there, faces are sent to Azure, where they are grouped using Project Oxford. The groups can be tagged using our Android app and then linked to Facebook using the Facebook API. The next time we see the face, we send it to our back end, which forwards it to Azure for identification among the tagged face groups we've accumulated. The back end transmits information about the person, originally grabbed from the Facebook API, back to the computer running the headset, which displays it on the AR headset next to the identified person's head.
At the same time, we calculated subjects' heart rates from the AR headset's video feed alone using Eulerian video magnification, a technique originally developed at MIT's Computer Science and Artificial Intelligence Laboratory. We captured groups of pixels on people's foreheads, located using a combination of Haar cascades for the eyes and head, and then ran these pixel groups through a signal processing library (part of OpenMDAO) to calculate the subject's heart rate (after a small calibration period).
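The frequency-analysis step can be sketched as follows: average the forehead pixels in each frame into a single time series, then pick the dominant frequency in the physiologically plausible band (roughly 45 to 180 bpm). This is a simplified NumPy sketch of the idea, not the OpenMDAO-based pipeline we actually used, and `estimate_heart_rate` is a hypothetical helper name.

```python
import numpy as np

def estimate_heart_rate(forehead_means, fps):
    """Estimate beats per minute from a time series of mean forehead
    pixel intensities sampled at `fps` frames per second."""
    signal = np.asarray(forehead_means, dtype=float)
    signal = signal - signal.mean()            # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Keep only physiologically plausible frequencies: 0.75-3.0 Hz (45-180 bpm).
    band = (freqs >= 0.75) & (freqs <= 3.0)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0
```

The calibration period mentioned above corresponds to accumulating enough frames for the FFT to resolve the pulse frequency cleanly.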
Challenges I ran into
The scope of this project presented the most issues--we were essentially working on three different projects that all had to come together and function smoothly. Latency was easily the largest issue, and it cut across all three. We had to use computer vision to detect and track heads in the frame while simultaneously updating the UI and sending images off for external processing in Azure; concurrently, we had to compute heart rates through image processing. We were also working with a Meta kit that is still developer-only, so we had to work around limitations in things like field of view and resolution.
Accomplishments that I'm proud of
We're most proud of the sheer complexity of this project. There was so much to do on so many different platforms that we couldn't be sure anything would work together smoothly. Because it was so multifaceted, we had to work evenly as a team and strategically divide up tasks, so we're also proud of how well the team worked together.
What I learned
We learned a lot about Azure and the Facebook Graph API, both of which were instrumental in our project's success. We had to learn a lot in a very short amount of time, but both ended up working flawlessly with our product.
What's next for SocialEyes AR
What's most exciting about this project is the interaction between AR and humans. AR headsets can only augment reality so much without interacting with humans, and we think this is the next step in human-to-tech interaction. Five or ten years down the road, this is the kind of thing that will humanize Artificial Intelligence--the ability to identify and "know" a human being by face.