We were inspired to build this app after seeing the countless photos on our phones that were ruined by someone who turned away, frowned, or blinked at the wrong moment. Sadly, those shots failed to capture the emotion of the situation and created a very different, almost jarring mood when the pictures were viewed later.
What it does
It uses facial recognition to determine whether all of the people in an image are smiling, have their eyes open, and are facing the camera. It syncs every picture it takes with Firebase and uses cutting-edge APIs such as the Google Vision API and IBM Bluemix. We also have a web portal/information page where people can learn about our software or even log into Firebase to see the pictures they took.
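The per-photo check described above can be thought of as a simple rule over the attributes a face detector reports for each face. The sketch below is an illustration, not our actual implementation: it assumes per-face smile and eye-open probabilities plus a head-pan angle (similar to what the Android Mobile Vision Face API exposes via `getIsSmilingProbability`, `getIsLeftEyeOpenProbability`, and the Euler Y angle), and all thresholds are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Face:
    # Probabilities in [0, 1] and a head-pan angle in degrees,
    # modeled after what a mobile face detector typically reports.
    smile_prob: float
    left_eye_open_prob: float
    right_eye_open_prob: float
    pan_angle_deg: float  # 0 = facing the camera head-on

# Hypothetical thresholds; a real app would tune these empirically.
SMILE_MIN = 0.6
EYE_OPEN_MIN = 0.5
MAX_PAN_DEG = 20.0

def face_ok(face: Face) -> bool:
    """A single face passes if it is smiling, both eyes are open,
    and the head is turned no more than MAX_PAN_DEG from the camera."""
    return (face.smile_prob >= SMILE_MIN
            and face.left_eye_open_prob >= EYE_OPEN_MIN
            and face.right_eye_open_prob >= EYE_OPEN_MIN
            and abs(face.pan_angle_deg) <= MAX_PAN_DEG)

def photo_ok(faces: list[Face]) -> bool:
    """The photo is a keeper only if at least one face was detected
    and every detected face passes."""
    return len(faces) > 0 and all(face_ok(f) for f in faces)
```

For example, a frame where one person is smiling at the camera but another has their eyes shut would fail `photo_ok`, which is exactly the "one person ruins the shot" case the app is meant to catch.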
How we built it
We built it twice: once in Android Studio (Android) and once in Xcode (iOS). We used XML with native Android code in the Android Studio IDE, and Swift in the Xcode IDE for the iOS app. Additionally, we used the Atom text editor to create a responsive, fluid website.
Challenges we ran into
Streaming live preview frames from the camera took us a long time to get right. Connecting our website to Firebase was also difficult and time-consuming, as was saving Bitmap images both to the Android filesystem and to Firebase. Lastly, the actual facial recognition implementation was the part of the project that took the most time: we had to create our own camera app to relay information about the preview image, and to get data about the people in the frame and track them.
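One way to frame the live-preview challenge above: the app watches a stream of analyzed frames and should only fire the shutter once everyone in view looks good, ideally for a few consecutive frames so a single lucky detection doesn't trigger a capture. The sketch below is a hedged illustration of that gating logic, not our app's code; `frame_results` stands in for whatever the per-frame detector reports, and the streak length is a made-up parameter.

```python
def capture_index(frame_results, required_streak=3):
    """Given an iterable of booleans (True = every face in that frame
    passed the smile / eyes-open / facing-camera check), return the
    index of the frame at which the shutter should fire: the first
    frame that completes `required_streak` consecutive passing frames.
    Returns None if the stream ends before any such streak occurs."""
    streak = 0
    for i, passed in enumerate(frame_results):
        streak = streak + 1 if passed else 0
        if streak >= required_streak:
            return i
    return None
```

Requiring a short streak rather than a single passing frame is a common debouncing choice: it smooths over detector noise at the cost of a few frames of latency.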
Accomplishments that we're proud of
Two different implementations of the same core idea (one in Android and one in iOS), plus a web application that works in conjunction with both apps. We are also proud of our final face recognition algorithm and of building our own custom camera app.
What we learned
We learned teamwork and time management: as the clock wound down, we had to focus on our core idea and not get sidetracked by the other cool features we wanted to add.
What's next for Risus
Launching on the iOS App Store and the Android app store.