The skincare world can be opaque about the ingredients it uses, and it can be hard to figure out which ingredients agree with your skin and which don't. The names on ingredients lists are often hard to interpret, and searching for a single ingredient in a long list is tedious. We wanted to simplify this process with our iOS app.
What it does
Our app primarily uses MLVision and Firebase (installed via CocoaPods) to transcribe the text from a given image, mainly pictures of ingredients lists. It then stores the transcribed text in a database so that the next time someone takes a picture, it can cross-reference common ingredients.
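Once the text is transcribed, it has to be cleaned up before it can be stored and compared. A minimal sketch of that step in Swift (the function name and the exact normalization rules are our illustration, not the app's actual code):

```swift
import Foundation

// Hypothetical helper: turn raw OCR output from an ingredients label
// into a normalized list of ingredient names ready for the database.
func normalizeIngredients(from transcribedText: String) -> [String] {
    transcribedText
        // Labels usually separate ingredients with commas.
        .split(separator: ",")
        // Lowercase and trim whitespace/newlines introduced by OCR line breaks.
        .map { $0.trimmingCharacters(in: .whitespacesAndNewlines).lowercased() }
        // Drop empty fragments (e.g. from a trailing comma).
        .filter { !$0.isEmpty }
}

let ocrText = "Aqua, Glycerin,\nNiacinamide, "
print(normalizeIngredients(from: ocrText))
// ["aqua", "glycerin", "niacinamide"]
```

Normalizing up front means two photos of the same product taken under different lighting still produce identical ingredient keys in the database.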
How we built it
We built the app itself in Swift and used Firebase, with its various pods (e.g. MLVision) installed via CocoaPods. We then developed the app's UI and connected the app to our database.
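The CocoaPods setup looks roughly like the Podfile below; the target name and iOS version are assumptions, and the pod names are the Firebase ML Kit vision pods as they were published at the time:

```ruby
platform :ios, '12.0'

target 'Skinergy' do
  use_frameworks!

  # Firebase core plus the ML Kit vision pods for on-device text recognition.
  pod 'Firebase/Core'
  pod 'Firebase/MLVision'
  pod 'Firebase/MLVisionTextModel'
end
```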
Challenges we ran into
It took a long time to figure out how to use the MLVision parts of Firebase and to extract the text from the image itself. We also had a difficult time connecting the iOS app to Firebase.
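The text-extraction call we eventually got working looks roughly like this sketch against the FirebaseMLVision SDK; the image variable and completion handling are our illustration:

```swift
import UIKit
import FirebaseMLVision

// Sketch: run on-device text recognition over a photo of an ingredients label.
// `labelPhoto` is a hypothetical UIImage captured elsewhere in the app.
func transcribe(_ labelPhoto: UIImage) {
    let textRecognizer = Vision.vision().onDeviceTextRecognizer()
    let visionImage = VisionImage(image: labelPhoto)

    textRecognizer.process(visionImage) { result, error in
        guard error == nil, let result = result else {
            print("Text recognition failed: \(String(describing: error))")
            return
        }
        // `result.text` is the full transcribed string from the label.
        print(result.text)
    }
}
```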
Accomplishments that we're proud of
We are really proud that, although none of us had used any of these technologies before the start of the hackathon (Swift, Firebase, MLVision, CocoaPods, UI development), we built a fully working app that meets the goals of our project.
What we learned
We learned several new technologies and how to implement them properly.
What's next for Skinergy
We plan to make the cross-referencing algorithm smarter as more images are added.
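One way this could work, sketched in plain Swift (the type and threshold are our assumptions, not a committed design): track how often each ingredient appears across all scans, so the set of "common" ingredients sharpens as the database grows.

```swift
import Foundation

// Hypothetical sketch of the cross-referencing idea: as scanned
// ingredient lists accumulate, count how often each ingredient appears
// and surface the ones common to many products.
struct IngredientIndex {
    private var counts: [String: Int] = [:]
    private var scans = 0

    mutating func add(scan ingredients: [String]) {
        scans += 1
        for name in Set(ingredients) {            // count each ingredient once per scan
            counts[name, default: 0] += 1
        }
    }

    // Ingredients appearing in at least `threshold` of all scans so far.
    func commonIngredients(threshold: Double = 0.5) -> [String] {
        guard scans > 0 else { return [] }
        return counts
            .filter { Double($0.value) / Double(scans) >= threshold }
            .keys.sorted()
    }
}

var index = IngredientIndex()
index.add(scan: ["aqua", "glycerin", "niacinamide"])
index.add(scan: ["aqua", "glycerin", "parfum"])
index.add(scan: ["aqua", "tocopherol"])
print(index.commonIngredients())   // ["aqua", "glycerin"]
```

Because each new scan updates the counts, the notion of "common ingredients" automatically improves with more data, which is exactly the behavior we want.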