New York City is very crowded, yet finding a date is still a problem: there are plenty of lonely people despite the abundance of online dating apps. We're tired of fake profile photos buried under 20 filters. We're tired of texting for weeks without ever meeting in person. We think it's time to stop dating your phone and start meeting the real people around you. Our app gives you the superpower of seeing who around you is single, right there in the park, at college, at a concert, or in a Starbucks.
What it does
Our app trains and retrains a facial recognition model (via the Algorithmia API) on the fly: a user adds 10 selfies during sign-up, and from then on the app can recognize that person. The app reads the camera feed and detects faces with Core ML's Vision framework; if any faces are found, it sends them to our Algorithmia model for a prediction. If the prediction's confidence is high enough, the app displays the recognized single person's name and age above their face.
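The detection step above can be sketched with Vision. This is a minimal, hedged sketch assuming `frame` is a `CGImage` grabbed from the camera feed; the Algorithmia prediction call is only indicated in a comment, since its exact shape depends on the trained model's endpoint.

```swift
import Vision
import CoreGraphics

// Sketch of the face-detection step, assuming `frame` comes from the camera feed.
func detectFaces(in frame: CGImage) {
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard error == nil,
              let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is normalized (0–1) relative to the image.
            print("Face at \(face.boundingBox)")
            // Here we would crop the face region and send it to the
            // Algorithmia model for a name/age prediction (not shown).
        }
    }
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```

In the real app this runs per frame, so the crop-and-predict call should be throttled and performed off the main thread.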
How we built it
We used Swift to work with the Algorithmia API, Core ML's Vision framework to detect faces in the camera feed, and Photoshop/Illustrator for the UI/UX.
Challenges we ran into
Positioning a plane in iOS ARKit (vector transformation calculations).
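One way to do this kind of plane positioning is sketched below. It assumes a `simd_float4x4` world transform for a face (e.g. from an anchor) and is illustrative only; the 0.15 m offset and plane size are made-up values, not the app's actual numbers.

```swift
import ARKit
import SceneKit

// Sketch: place a label plane slightly above a face's world position,
// assuming `faceTransform` is the face's world transform.
func makeLabelNode(above faceTransform: simd_float4x4) -> SCNNode {
    let plane = SCNPlane(width: 0.12, height: 0.04)
    let node = SCNNode(geometry: plane)
    // The transform's fourth column holds the world-space translation.
    var position = simd_make_float3(faceTransform.columns.3)
    // Offset along the world up-axis so the plane floats above the head.
    position.y += 0.15
    node.simdPosition = position
    // A billboard constraint keeps the plane always facing the camera.
    node.constraints = [SCNBillboardConstraint()]
    return node
}
```

The tricky part we hit was exactly this: extracting a usable position from a 4×4 transform and offsetting it in the right coordinate space.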
Accomplishments that we're proud of
Great teamwork.
What we learned
How to train a facial recognition model. How Augmented Reality works. How important UI/UX is.
What's next for DatFace
Implement Augmented Reality (3D) instead of 2D. Add filtering (by gender, name, and languages spoken).
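The planned filtering could look something like this. A hypothetical sketch only: the `Profile` model and its fields are placeholders we made up for illustration, not the app's actual data model.

```swift
// Hypothetical profile model for the planned filtering feature.
struct Profile {
    let name: String
    let gender: String
    let languages: Set<String>
}

// Keep only profiles matching the optional criteria; nil means "don't filter".
func filterProfiles(_ profiles: [Profile],
                    gender: String? = nil,
                    language: String? = nil) -> [Profile] {
    profiles.filter { p in
        (gender == nil || p.gender == gender!) &&
        (language == nil || p.languages.contains(language!))
    }
}
```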