Inspiration
Many underclassmen never use up their entire meal plans, while many upperclassmen want to save money and get food for free through shared swipes. As VCU students, we have Carytown right around the corner, a melting pot of many different kinds of food. Sometimes you see a picture of gorgeous-looking food and think, "I want to know what this is, and where I can find it." Our app not only links college students with each other to share meal swipes, but also identifies different types of food, so users can find out where to get that food in their local area.
What it does
This app uses the user's current location to match them with people within a set radius who are willing to swipe them in. When a match is found, the user receives a push notification, making it easy to connect with the other person. The app can also read text from an image and store it in a database, so the user can keep track of where they have eaten in the past through their receipts. It can also identify different foods using the smartphone's camera, comparing them against a database with machine learning. Over time, the machine-learning component learns the user's tastes and eating habits and starts to offer food suggestions.
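The location matching can be thought of as a radius query against Firebase. Below is a minimal sketch assuming the GeoFire library for Firebase; the `SwipeMatcher` class and the `users_locations` path are illustrative names, not the app's actual code.

```java
import com.firebase.geofire.GeoFire;
import com.firebase.geofire.GeoLocation;
import com.firebase.geofire.GeoQuery;
import com.firebase.geofire.GeoQueryEventListener;
import com.google.firebase.database.DatabaseError;
import com.google.firebase.database.DatabaseReference;
import com.google.firebase.database.FirebaseDatabase;

public class SwipeMatcher {

    private final GeoFire geoFire;

    public SwipeMatcher() {
        // Locations of users who are willing to share a swipe (illustrative path).
        DatabaseReference ref = FirebaseDatabase.getInstance().getReference("users_locations");
        geoFire = new GeoFire(ref);
    }

    // Look for users willing to share a swipe within radiusKm of the current position.
    public void findNearbySwipers(double latitude, double longitude, double radiusKm) {
        GeoQuery query = geoFire.queryAtLocation(new GeoLocation(latitude, longitude), radiusKm);
        query.addGeoQueryEventListener(new GeoQueryEventListener() {
            @Override
            public void onKeyEntered(String userId, GeoLocation location) {
                // A potential match entered the radius; kick off the match/notification flow here.
            }

            @Override
            public void onKeyExited(String userId) { }

            @Override
            public void onKeyMoved(String userId, GeoLocation location) { }

            @Override
            public void onGeoQueryReady() { }

            @Override
            public void onGeoQueryError(DatabaseError error) { }
        });
    }
}
```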
Here is an example of the food identification:
John is walking down the street holding a hamburger. Adam walks by and sees the appetizing-looking food. Adam has no idea that the food is actually a hamburger, but really wants to find out what it is and where he can get one. Adam takes a picture of the hamburger with Swifey (the app), which identifies it as a "hamburger." Adam now knows he wants a hamburger and can be pointed to nearby restaurants that serve it.
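Under the hood, that identification step amounts to uploading the photo to a food-recognition service and reading back the predicted labels. The sketch below assumes OkHttp for the upload; the endpoint URL, form field name, and `FoodIdentifier` class are placeholders, since our integration was only modeled on a CalorieMama API example.

```java
import java.io.File;
import java.io.IOException;
import okhttp3.MediaType;
import okhttp3.MultipartBody;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.RequestBody;
import okhttp3.Response;

public class FoodIdentifier {

    // Placeholder endpoint; swap in the real food-recognition API URL.
    private static final String RECOGNITION_URL = "https://example.com/v1/foodrecognition";

    private final OkHttpClient client = new OkHttpClient();

    // Uploads the photo and returns the raw JSON response containing predicted food labels.
    public String identify(File photo) throws IOException {
        RequestBody image = RequestBody.create(MediaType.parse("image/jpeg"), photo);
        RequestBody body = new MultipartBody.Builder()
                .setType(MultipartBody.FORM)
                .addFormDataPart("media", photo.getName(), image)
                .build();
        Request request = new Request.Builder().url(RECOGNITION_URL).post(body).build();
        try (Response response = client.newCall(request).execute()) {
            return response.body().string();
        }
    }
}
```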
How I built it
We built the app in Android Studio and interfaced it with Firebase for hosting data. The machine learning for identifying food was inspired by an example on GitHub that uses the CalorieMama API. Optical character recognition (OCR) through the smartphone's back camera was used to identify text in images, and all recognized text could be stored in Firebase. User login information for the app was also hosted on Firebase.
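As a rough illustration of the OCR-to-Firebase path, the sketch below uses the Android Mobile Vision `TextRecognizer` to pull text out of a receipt photo and pushes the result to the Realtime Database; the `ReceiptScanner` class and the `receipts` path are illustrative, not the app's real schema.

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.util.SparseArray;
import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.text.TextBlock;
import com.google.android.gms.vision.text.TextRecognizer;
import com.google.firebase.database.FirebaseDatabase;

public class ReceiptScanner {

    // Runs text recognition on a photo of a receipt and returns the extracted text.
    public String extractText(Context context, Bitmap receiptPhoto) {
        TextRecognizer recognizer = new TextRecognizer.Builder(context).build();
        Frame frame = new Frame.Builder().setBitmap(receiptPhoto).build();
        SparseArray<TextBlock> blocks = recognizer.detect(frame);

        StringBuilder text = new StringBuilder();
        for (int i = 0; i < blocks.size(); i++) {
            text.append(blocks.valueAt(i).getValue()).append('\n');
        }
        recognizer.release();
        return text.toString();
    }

    // Saves the recognized text so the user's dining history can be looked up later.
    public void saveReceipt(String recognizedText) {
        FirebaseDatabase.getInstance()
                .getReference("receipts")
                .push()
                .setValue(recognizedText);
    }
}
```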
Challenges I ran into
Interfacing Firebase with Android Studio, programming OCR capability in Android Studio, geolocation queries in Firebase, navigation drawers, and implementing machine learning.
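As one concrete piece of the Firebase-with-Android-Studio interfacing, here is a minimal sketch of an email/password login using Firebase Authentication; the `LoginHelper` class and the callback bodies are illustrative only, not the app's actual code.

```java
import com.google.firebase.auth.FirebaseAuth;

public class LoginHelper {

    private final FirebaseAuth auth = FirebaseAuth.getInstance();

    // Signs the user in with the credentials stored in Firebase Authentication.
    public void signIn(String email, String password) {
        auth.signInWithEmailAndPassword(email, password)
                .addOnCompleteListener(task -> {
                    if (task.isSuccessful()) {
                        // Proceed to the matching screen.
                    } else {
                        // Surface the login error to the user.
                    }
                });
    }
}
```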
Accomplishments that I'm proud of
Implementing machine learning and OCR capabilities in Android Studio, and learning Firebase and getting it to communicate with Android Studio.
What I learned
System.out.println("Hello World.");
But actually, we learned how to map out and plan a mobile app, how to code its different aspects with machine learning and OCR, and how to work with databases.
What's next for Swifey
Due to insufficient time, we were not able to fully integrate all aspects of the app. Looking forward, we would like to have Swifey completely integrated and operational. The end goal is to make Swifey available for download on iOS and Android devices for college students. We would like to start with our own college (VCU) and then hopefully expand to other colleges as well.