I came up with the idea because of my tutor at the University of Birmingham, who cannot remember faces. So I decided to build a mobile application for people with this kind of disease, such as Alzheimer's, who cannot remember which name goes with which face.
What it does
The mobile application lets the user build their own face database: they take a picture of a person's face, associate a name with it, and save it. They can then recognize a face by taking a picture of that person and getting the name back.
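The enrol-and-recognize flow described above can be sketched as a tiny in-memory store. This is only an illustrative sketch: the type and method names are hypothetical, and the real app persists records and gets face identifiers from a recognition service rather than plain strings.

```swift
import Foundation

// Hypothetical sketch of the enrol/recognize flow (not the app's actual code).
struct FaceRecord {
    let faceId: String   // identifier returned by a face-recognition service
    let name: String     // name the user typed in when enrolling
}

final class FaceDatabase {
    private var records: [String: FaceRecord] = [:]

    // Enrol: associate a recognized face identifier with a name.
    func enrol(faceId: String, name: String) {
        records[faceId] = FaceRecord(faceId: faceId, name: name)
    }

    // Recognize: look up the name for a face identifier, if known.
    func name(forFaceId faceId: String) -> String? {
        records[faceId]?.name
    }
}
```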
How I built it
I built it using Swift, Apple's latest programming language.
Challenges I ran into
Using the Microsoft Face API from Swift was quite challenging, since there are no tutorials in Swift, only in Objective-C.
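For anyone facing the same gap, calling the Face API from Swift mostly comes down to building the HTTP request by hand with `URLSession`. Below is a minimal sketch of constructing a detect request; the endpoint and subscription key are placeholders you would replace with your own Azure values, and it assumes the v1.0 `detect` route with a binary image body.

```swift
import Foundation

// Build a detect request for the Microsoft Face API (v1.0 route assumed).
// `endpoint` and `key` are placeholders from your own Azure resource.
func makeDetectRequest(imageData: Data, endpoint: String, key: String) -> URLRequest? {
    guard var components = URLComponents(string: "\(endpoint)/face/v1.0/detect") else {
        return nil
    }
    // Ask the service to return a face identifier for the detected face.
    components.queryItems = [URLQueryItem(name: "returnFaceId", value: "true")]
    guard let url = components.url else { return nil }

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    // Binary image upload rather than a JSON body with an image URL.
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
    request.setValue(key, forHTTPHeaderField: "Ocp-Apim-Subscription-Key")
    request.httpBody = imageData
    return request
}
```

The request would then be sent with `URLSession.shared.dataTask(with:)` and the JSON response parsed for the returned face identifier.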
Accomplishments that I'm proud of
I am happy that I have a working MVP and that it can, most of the time, recognize faces correctly.
What I learned
I learned a little bit about face recognition methods and how to use the Face API.
What's next for DrFace
The next step would be to build my own face recognition API, because I found that the Face API is not always accurate. After that I would start optimizing the mobile app and adding features to it, such as a user browser or speech recognition for entering the names of new images.