Checking each dining hall website to find our favorite meals was inefficient, so we wanted one place that shows when our favorite meals are on the menu.

What it does

Plate Scraper scrapes the three UIowa dining hall webpages and stores the menu items in a database. A web application then pulls the data from the database and displays it. The app is interactive: users select which dining hall they want to see, and every available food item is shown.

How we built it

We used Selenium with ChromeDriver to parse the dining hall pages and SQLite to store the scraped data. Node.js with Express runs our web server, npm manages external libraries, and HTML/CSS/JavaScript handles front-end interactivity.
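The scraping step can be sketched like this in Python. This is a minimal illustration, not our exact code: the URL and the `.menu-item` CSS selector are placeholders, since the real dining hall pages use different markup. Running it requires `pip install selenium` and a local ChromeDriver.

```python
def scrape_menu(hall_url):
    """Sketch of a Selenium + ChromeDriver scrape of one dining hall page.

    The ".menu-item" selector is a placeholder; the actual pages use
    different markup. Selenium is imported lazily so the module loads
    even where Selenium/ChromeDriver are not installed.
    """
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get(hall_url)
        # Collect the visible text of every menu-item element on the page
        return [el.text for el in driver.find_elements(By.CSS_SELECTOR, ".menu-item")]
    finally:
        driver.quit()
```

The same function would be called once per dining hall URL, with the returned item names inserted into SQLite.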

Challenges we ran into

We had trouble fetching data from the database when trying to display items on the front end.
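One way to sidestep this kind of trouble is to have the database return name-addressable rows that serialize straight to JSON for the front end. A minimal sketch with Python's stdlib `sqlite3` (the table and column names here are illustrative, not our actual schema):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory DB for illustration
conn.row_factory = sqlite3.Row     # rows become addressable by column name
conn.execute("CREATE TABLE menu (hall TEXT, item TEXT)")
conn.execute("INSERT INTO menu VALUES ('Catlett', 'Tacos')")

# Convert each row to a dict so the result is directly JSON-serializable
rows = [dict(r) for r in conn.execute("SELECT hall, item FROM menu")]
payload = json.dumps(rows)
print(payload)  # → [{"hall": "Catlett", "item": "Tacos"}]
```

A server route can then hand `payload` to the front end, which renders it into a table.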

Accomplishments that we're proud of

Successfully learning the basics of web scraping and ultimately collecting data from multiple pages of a website.

What we learned

How to use Selenium and ChromeDriver, and how to display data in tables on the front end.

What's next for Plate Scraper

Being able to search foods in our database and mark favorites, then display which of your favorites are on the current day's menu.
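The planned search-and-favorites feature could be sketched as two SQL queries: a substring search over scraped items, and a join between a favorites table and the current day's menu. The table and column names below are assumptions for illustration, not our current schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory DB with sample data
conn.executescript("""
    CREATE TABLE menu (hall TEXT, menu_date TEXT, item TEXT);
    CREATE TABLE favorites (item TEXT);
    INSERT INTO menu VALUES ('Burge', '2024-01-15', 'Mac and Cheese');
    INSERT INTO menu VALUES ('Burge', '2024-01-15', 'Tacos');
    INSERT INTO favorites VALUES ('Tacos');
""")

def search_foods(term):
    # Substring search over every item we have scraped
    return [r[0] for r in conn.execute(
        "SELECT DISTINCT item FROM menu WHERE item LIKE ?", (f"%{term}%",))]

def favorites_on(date):
    # Favorites that appear on a given day's menu
    return [r[0] for r in conn.execute(
        "SELECT m.item FROM menu m JOIN favorites f ON m.item = f.item "
        "WHERE m.menu_date = ?", (date,))]

print(search_foods("Mac"))         # → ['Mac and Cheese']
print(favorites_on("2024-01-15"))  # → ['Tacos']
```

Passing today's date to `favorites_on` gives exactly the "your favorites on today's menu" view described above.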
