Motivation
According to the World Health Organization, there are 3.3 million alcohol-related deaths per year. In the U.S. alone, six people die of alcohol poisoning every day. Alcohol poisoning is often caused by negligence or by not knowing how to safely pour a drink. Our team’s app, DrinkMate, aims to make drinking a safer and more enjoyable experience. People often don’t know how to properly make a mixed drink, and monitoring alcohol consumption is harder with mixed drinks than with canned drinks. DrinkMate guides users with augmented reality to construct mixed drinks precisely.
Usage
DrinkMate is designed to be easy to use and to quickly improve your cocktail game. Upon opening the app, users simply select a drink they would like to make and point their phone at any clear drinking glass. With a tap on the screen, DrinkMate detects the presence and volume of the cup in view of the camera and displays lines on the glass showing where to stop filling each ingredient. The user is free to move around (change the camera angle): as long as the phone stays pointed at the glass, the lines stay anchored to the surface of the glass no matter the angle. Because the phone can move freely, users can watch the augmented reality view of their drink while they pour, and DrinkMate tells them exactly when to stop pouring each ingredient. This lets users pour both responsibly and accurately.
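As a rough sketch (not DrinkMate’s actual code) of how this kind of tap-to-place AR anchoring can be wired up with ARKit and SceneKit — the class name and anchor name below are assumptions for illustration:

```swift
import ARKit
import UIKit

// Illustrative sketch only: ray-cast from the tapped screen point onto the scene
// and drop a world-fixed anchor, so a "fill line" stays on the glass as the camera moves.
class PourViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // Track the world and look for the horizontal surface the glass sits on.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        sceneView.session.run(config)

        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)

        // Ray-cast from the tapped screen point onto estimated scene geometry.
        guard let query = sceneView.raycastQuery(from: point,
                                                 allowing: .estimatedPlane,
                                                 alignment: .any),
              let hit = sceneView.session.raycast(query).first else { return }

        // Anchor a fill-level marker at the hit point. Because the anchor is
        // world-fixed, the rendered line stays on the glass from any angle.
        let anchor = ARAnchor(name: "fillLine", transform: hit.worldTransform)
        sceneView.session.add(anchor: anchor)
        // A renderer(_:nodeFor:) delegate method would attach the visible line
        // geometry (e.g. a thin SCNTorus or SCNPlane) to this anchor.
    }
}
```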
Methodology
DrinkMate uses the augmented reality capabilities of the iPhone to make mixed drinks safer and better tasting. Technically, DrinkMate is built in Swift for iOS and is supported by an Algolia drink database and a Python-based neural network that make its augmented reality features possible.
Challenges
We came into this project with big ambitions. We found some libraries that looked promising for what we wanted to do, but one we ultimately had to abandon was a neural network classifier for identifying inanimate objects, which we had hoped to adapt to recognize the cup being filled. The library was not as flexible as we had hoped, and running machine learning on a phone is difficult, so we fell back to a simpler solution. There were also very few datasets of cups and glasses available, so we had to train our own models and convert them to a format usable on iOS.
Accomplishments
Live object (cup) detection
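A minimal sketch of how live cup detection could look on iOS using Vision with a model converted to Core ML (the "CupDetector" model class and the "cup" label are assumptions for illustration, not our exact code):

```swift
import Vision
import CoreML
import CoreVideo

// Illustrative sketch: run a converted Core ML object-detection model over one
// camera frame with Vision and return the highest-confidence cup bounding box.
func detectCup(in pixelBuffer: CVPixelBuffer,
               completion: @escaping (CGRect?) -> Void) {
    // Wrap the converted Core ML model for use with the Vision framework.
    guard let coreMLModel = try? CupDetector(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(nil)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Object-detection models report results as VNRecognizedObjectObservation.
        let cups = (request.results as? [VNRecognizedObjectObservation] ?? [])
            .filter { $0.labels.first?.identifier == "cup" }
        // boundingBox is normalized to [0, 1] with the origin at the bottom-left.
        completion(cups.max(by: { $0.confidence < $1.confidence })?.boundingBox)
    }
    request.imageCropAndScaleOption = .scaleFill

    // Run the request on a single camera frame (call per frame for live detection).
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```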
Learning Outcomes
Members of our team learned how to:
- Use Python to control a Roomba
- Better use Flask to manage a server backend
- Build an Apple iOS app from scratch
- Apply new concepts in machine learning and augmented reality
Future work for DrinkMate
Given more time, we would like to implement a fully automatic object detection and tracking algorithm that is even more streamlined than the current version. We would also like to add a "drink swipe" feature that lets users randomize the drink they choose, and to show more drink statistics on screen.