Ever since the proliferation of the Internet in the late 1990s, Internet Service Providers (ISPs) have made it their mission to spread internet access across the globe. Even though an internet connection seems ordinary to many of us, there are still locations, particularly agricultural ones, that need a stronger, more reliable wireless connection to provide consumers with the best possible produce and agricultural services. We, the creators of AgriFind, believe in WaveDirect's mission statement of providing "Internet for Everyone." We understand its importance, and we wanted to present a directory in a clean and organized fashion for WaveDirect's convenience.
What it does
We built a directory on our website to assist in locating farms, greenhouses, and other agricultural businesses. It provides key information about each business and how to contact them, all contained on a clean dashboard with a simple, efficient user interface. It targets businesses in the region spanning Chatham-Kent through Windsor-Essex.
How we built it
First, we wrote a Python program that used the Google Places API to gather information about agricultural businesses in the given region. The API returned key details, including business names, addresses, contact info, types, and websites, in JSON format. We didn't know what a JSON file was before this, and we had to spend over an hour trying to figure it out. Thankfully, we worked out how to parse the data in code and adapted it into a first website built in HTML and CSS, which listed the businesses and their key details. But we were not satisfied with the results or the interface on that page, so we developed a directory on a Wix site for a cleaner design and improved style, using Wix's developer interface, Velo, and the tools it provides to get exactly what we wanted. This site's directory imports a CSV file that we converted from the JSON output of the initial Python program; that CSV file is uploaded to a Wix database that backs the directory.
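To make the JSON-to-CSV step concrete, here is a minimal sketch in Python. The sample response, the chosen field names, and the output filename are illustrative assumptions based on the public shape of a Places API Text Search response, not our exact code.

```python
import csv
import json

# A trimmed sample in the shape of a Google Places Text Search response
# (a real response contains many more fields per result).
sample_response = json.loads("""
{
  "status": "OK",
  "results": [
    {
      "name": "Example Greenhouse",
      "formatted_address": "123 County Rd, Chatham-Kent, ON",
      "types": ["florist", "point_of_interest", "establishment"]
    }
  ]
}
""")

def places_to_rows(response):
    """Flatten a Places-style response into simple dicts for the directory."""
    rows = []
    for place in response.get("results", []):
        rows.append({
            "name": place.get("name", ""),
            "address": place.get("formatted_address", ""),
            "types": ";".join(place.get("types", [])),
        })
    return rows

def write_csv(rows, path):
    """Write the flattened rows to a CSV file suitable for a Wix import."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "address", "types"])
        writer.writeheader()
        writer.writerows(rows)

rows = places_to_rows(sample_response)
write_csv(rows, "agrifind_directory.csv")
```

The same pattern extends to the other fields the API returns (phone numbers, websites), and the resulting CSV can be uploaded to the Wix database as described above.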
Challenges we ran into
Initially, we were interested in possibly integrating CockroachDB to store the information we gathered about businesses in the region, but we had difficulty installing its libraries and getting it set up properly. We also struggled while figuring out the Google Places API and its various uses: our API key was not functional at times, and we had to do quite a bit of troubleshooting to get our code using the API working reliably. The API also had its limits. It didn't give us some of the information we wanted, such as business emails, social media, and descriptions, so we had to go in and add this missing data manually. Storing the information from the web scraper in the database in an organized manner was also difficult.
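One way to keep the hand-entered data manageable is to merge it into the scraped rows with empty-string defaults, so every row ends up with the same columns. This is only a sketch of that idea; the business names, the `manual_extras` dict, and the field list are hypothetical.

```python
# Rows as returned by the scraper (only the fields the API gave us).
scraped = [
    {"name": "Example Greenhouse", "address": "123 County Rd, Chatham-Kent, ON"},
    {"name": "Sample Farm Market", "address": "456 Line 4, Windsor-Essex, ON"},
]

# Hand-collected extras, keyed by business name (hypothetical data).
manual_extras = {
    "Example Greenhouse": {
        "email": "info@example.com",
        "description": "Year-round produce",
    },
}

# Fields the API could not provide for us.
MISSING_FIELDS = ("email", "social_media", "description")

def merge_manual(scraped_rows, extras):
    """Fill in fields the API couldn't provide, defaulting to empty
    strings so every row has the same columns for the CSV export."""
    merged = []
    for row in scraped_rows:
        extra = extras.get(row["name"], {})
        filled = dict(row)
        for field in MISSING_FIELDS:
            filled[field] = extra.get(field, "")
        merged.append(filled)
    return merged

rows = merge_manual(scraped, manual_extras)
```

Keeping the manual data in one place like this makes it easy to re-run the scraper without losing the hand-entered fields.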
What we learned
This hackathon was an amazing educational experience for our team. We are high school juniors just beginning our journey into computer science, and through hard work and seemingly endless debugging, we made a finished product that we are truly proud of. We learned so many new things along the way. As a team, we had to figure out how to use the Google Places API, how to parse its JSON output, and how to initialize and use a database. We hope to apply this knowledge to future hackathons and projects.
What's next for AgriFind
- Seamless integration between the backend and front end. This means building the program that gathers business information and compiles it into the database directly into the main website itself. Right now the process is disconnected: the main website needs a CSV data file to be imported in order to display the directory. Ideally this would happen in real time.
- Allow the user to use the website not only for the Windsor and Chatham-Kent regions but for any region they input.
- As noted in the challenges section, we had to manually add some business data. In AgriFind's future, we hope to use a web scraper that can gather this data efficiently, without slow manual work. This would bring us closer to what we envision for this product.
- We also registered a domain name that is not yet linked to our actual website; it is for the Domain.com name challenge :)