Chicken tenders are the best, and they're hard to find on campus. Finding out that you missed chicken tender day at a dining hall can ruin your whole week, but skimming through the dining app is slow and tedious.
What it does
mTendies looks for chicken tenders on campus for you. With the press of a button, it compiles and presents a list of the dining halls serving chicken tenders today.
How I built it
I used React Native for the app, and Node.js with Express, cheerio, and request to build a web scraper that searches mDining's website for chicken tenders.
Challenges I ran into
Web scraping is slow, and not something every user should have to wait for. Rather than writing the scraper directly into the app code, I had to write a separate web scraper that runs on a server and exposes an API for accessing the information. Loading time was also an issue for a while (as can be seen in the video posted), but I dramatically improved it by having the server store the scraped information and send it on each API request, rather than scraping on each API request.
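The cache-then-serve pattern described above can be sketched like this. `scrapeAllHalls` is a placeholder for the real cheerio scraper, and the names here are illustrative, not the actual server code:

```javascript
// The server scrapes on its own schedule; API requests only read the cache.
let cache = { updatedAt: null, halls: [] };

async function scrapeAllHalls() {
  // Placeholder: the real version fetches and parses each hall's menu page.
  return ['South Quad', 'Bursley'];
}

async function refreshCache() {
  cache = { updatedAt: Date.now(), halls: await scrapeAllHalls() };
}

// API handler: responds instantly from the cache, never scrapes inline.
function handleTendersRequest() {
  return cache.halls;
}

// Refresh periodically (e.g. every 30 minutes) instead of per request:
// setInterval(refreshCache, 30 * 60 * 1000);
```

In the Express version, `handleTendersRequest` would just be the body of a route handler, so every request is a cheap cache read instead of a multi-second scrape.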
Accomplishments that I'm proud of
I wasn't sure I would be able to figure out web scraping, but it turns out it's not too bad. I'm pretty proud of the (extremely minimal) API that I built for accessing the dining hall information. I'm also glad I could pull off a React Native app, since React Native seems to be pretty particular about the JavaScript you use.
What I learned
I learned a little bit about React Native development, including push buttons, alerts, multi-page apps, navigation, and state handling with asynchronous behavior. I also learned how to scrape the web from an external server and serve that information in a lightweight, fast way.
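The async state handling boils down to: show a loading state, call the API, then update state when the response arrives. A sketch of that pattern, with the fetch function injected so the logic is testable without a network (the function names and API shape are my own illustration, not the app's actual code):

```javascript
// Kick off the API call, then update component state when it resolves.
// fetchHalls wraps a fetch() against the scraper's API; setState is the
// React component's state setter.
async function loadTenderHalls(fetchHalls, setState) {
  setState({ loading: true, halls: [] });
  try {
    const halls = await fetchHalls();
    setState({ loading: false, halls });
  } catch (err) {
    setState({ loading: false, halls: [], error: err.message });
  }
}
```

In the React Native component, this runs on the button press, and the render switches between a spinner and the list based on `loading`.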
What's next for mTendies
In the future I would like mTendies to send a push notification at 8 am every day telling users exactly which dining halls have tenders that day, so they don't have to open the app at all. Implementing meal specifics (breakfast, lunch, or dinner) would be good, too. On top of that, if this were going to be a deployed app I would need server space to host my scraper; currently I'm just running a local server on my laptop.