Inspiration

Having the #1 dining on campus, we have so many options to choose from in each dining hall every day. Deciding what to eat, where to eat, and when to eat can be really frustrating. We all have certain preferences in our diet: foods we like and dislike, allergies, and more. Why not combine all this info with what's on the menu each day to give us personalized recommendations every morning and a better dining experience? All of our teammates spend a fixed amount of time every day checking the menus across dining halls to decide where to eat. With this newsletter, we believe we can make everyone's life a little easier!

What it does

Based on the user's preferences, our program delivers a daily newsletter to their email that uses GenAI to highlight which dining halls are serving foods they like and to recommend dishes they might enjoy. We also filter out allergens and respect dietary restrictions to make the newsletter even more personalized and safe!

How we built it

We use a SQLite database to keep track of our users' preferences and the dining hall menus for the day. In our database, we store every food option in all four major dining halls on campus, along with when each food is served, its dietary restrictions, and any allergens it contains. We scrape the menus from the dining halls using BeautifulSoup. When users sign up for our newsletter on our website, we also ask them to share their favorite foods, so we can personalize their newsletter using GPT-4o-Mini. On top of that, we filter each personalized newsletter according to the user's dietary restrictions. We use Flask for the backend, and a Python script we wrote sends the customized emails every morning. Lastly, the sign-up website is built with SvelteKit, and we used Figma for our UI design.
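To give a feel for how the menu database and allergen filtering fit together, here is a minimal Python sketch. The table and column names (`menu_items`, `dining_hall`, `allergens`, and so on) are illustrative placeholders, not our exact production schema:

```python
import sqlite3

# Toy in-memory database with the kind of fields we track per menu item.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE menu_items (
        dining_hall TEXT,
        meal        TEXT,   -- breakfast / lunch / dinner
        food        TEXT,
        allergens   TEXT    -- comma-separated, e.g. 'milk,gluten'
    )
""")
conn.executemany(
    "INSERT INTO menu_items VALUES (?, ?, ?, ?)",
    [
        ("Berkshire", "lunch",  "Margherita Pizza", "milk,gluten"),
        ("Worcester", "lunch",  "Grilled Chicken",  ""),
        ("Franklin",  "dinner", "Pad Thai",         "peanuts"),
    ],
)

def safe_options(allergies):
    """Return (hall, meal, food) rows containing none of the user's allergens."""
    rows = conn.execute("SELECT dining_hall, meal, food, allergens FROM menu_items")
    result = []
    for hall, meal, food, allergens in rows:
        item_allergens = {a for a in allergens.split(",") if a}
        if not item_allergens & set(allergies):
            result.append((hall, meal, food))
    return result

# A user allergic to peanuts never sees Pad Thai in their newsletter.
print(safe_options({"peanuts"}))
```

The rows that survive this filter are what we hand to the LLM step, which ranks them against the user's stated favorite foods before the email goes out.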

Challenges we ran into

The biggest challenge we faced was getting our backend to run on a server. To tackle this, we deployed our program on AWS EC2. We initially used Selenium to scrape data, but it became a problem when we moved to AWS, so we learned to use BeautifulSoup instead.
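As a rough illustration of the BeautifulSoup approach (the real dining pages have their own markup, so the HTML snippet and class names below are made up for the example, and in practice the page would be fetched over HTTP first):

```python
from bs4 import BeautifulSoup

# Placeholder HTML standing in for a fetched dining-hall menu page.
html = """
<div class="menu">
  <h3 class="meal">Lunch</h3>
  <ul>
    <li class="food">Grilled Chicken</li>
    <li class="food">Margherita Pizza</li>
  </ul>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
meal = soup.find("h3", class_="meal").get_text(strip=True)
foods = [li.get_text(strip=True) for li in soup.find_all("li", class_="food")]
print(meal, foods)
```

Unlike Selenium, this needs no browser or driver on the server, which is why it was so much easier to run on EC2.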

Accomplishments that we're proud of

Everything! We were clueless about 90% of this stuff and figured out almost all of it during the event.

What we learned

We learned how to deploy websites, SvelteKit (a whole new framework for us!), web scraping with BeautifulSoup, and how to use AWS.

What's next for YuMass

We are planning on launching the website to the public so everyone can use a personalized dining hall assistant. We are also planning to add personalized Grab-n-Go recommendations in the near future.
