As a team of four high school students going into grade 12, we've been anxious trying to figure out which university programs to apply to and whether we even have a chance of getting in. Most universities list their average mark requirements on their websites, but for some programs (e.g. Waterloo's Computer Science program), that listing isn't the most accurate: it gives the minimum average, not the range of averages the university actually accepts. That's why we developed UAP.
What it does
UAP's purpose is very simple - the user enters their chosen school and program, as well as their overall high school average. The data is sent to an external server, where our AI uses admission averages collected from past years (from CUDO: link) to estimate your chance of actually getting into the program. It then sends that percentage chance right back to the website!
That was the intended purpose, but we ran out of time before we could get the AI and web scraper working with the website.
How we built it
We used many different tools to build UAP. The website was written in HTML, with Bootstrap for styling, and Flask connected the site to the external server. All of the scripts (including the AI model we built with scikit-learn) were written in Python with a plethora of libraries.
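We never finished wiring these pieces together, but conceptually the pipeline is small. The sketch below shows the idea with Flask and scikit-learn; the `/predict` route, the single-feature model, and the placeholder training numbers are all illustrative assumptions, not our actual code.

```python
# Minimal sketch of the intended pipeline: a Flask endpoint that feeds the
# user's average into a scikit-learn model and returns a percentage chance.
# The training data below is a made-up placeholder, not real CUDO data.
from flask import Flask, request, jsonify
import numpy as np
from sklearn.linear_model import LogisticRegression

app = Flask(__name__)

# Placeholder history: applicant average (%) and whether they were admitted.
X_train = np.array([[78], [82], [85], [88], [90], [93], [96]])
y_train = np.array([0, 0, 0, 1, 1, 1, 1])

model = LogisticRegression()
model.fit(X_train, y_train)

@app.route("/predict", methods=["POST"])
def predict():
    average = float(request.form["average"])
    # predict_proba returns [[P(reject), P(accept)]]; keep the accept column.
    chance = model.predict_proba([[average]])[0][1]
    return jsonify({"chance": round(chance * 100, 1)})
```

A real version would look up the chosen school and program to pick the right historical data before predicting, but the request-in, probability-out shape stays the same.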
Challenges we ran into
The whole project was a challenge from the beginning, but a few particular issues took up the most time. Ideation was probably the biggest: for the first day we threw ideas around between workshops, and we didn't settle on one until the next morning. Beyond the initial hurdles, we ran into problems with the AI. We didn't have the best grasp of the difference between machine learning and deep learning, so we had to make some significant changes to the model we ultimately used. The web scraper we wrote to build our AI's dataset presented its own difficulties, but we sorted it out before the end. Ultimately, though, we weren't able to merge a few of the components, so the final product doesn't actually serve predictions from the website, even though it has the functionality to take in and work with the values.
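For context, the scraper we describe boils down to pulling a table of admission averages out of an HTML page. This is just a sketch of that idea with BeautifulSoup; the markup here is a stand-in, since the real CUDO pages have their own structure.

```python
# Sketch of a dataset-building scraper: extract {program: admission average}
# from an HTML table. SAMPLE_HTML is placeholder markup, not a real CUDO page.
from bs4 import BeautifulSoup

SAMPLE_HTML = """
<table>
  <tr><th>Program</th><th>Average (%)</th></tr>
  <tr><td>Computer Science</td><td>92.4</td></tr>
  <tr><td>Mechanical Engineering</td><td>89.1</td></tr>
</table>
"""

def scrape_averages(html):
    """Return a {program: average} dict from a simple two-column HTML table."""
    soup = BeautifulSoup(html, "html.parser")
    averages = {}
    for row in soup.find_all("tr")[1:]:  # skip the header row
        program, avg = [cell.get_text(strip=True) for cell in row.find_all("td")]
        averages[program] = float(avg)
    return averages

print(scrape_averages(SAMPLE_HTML))
# {'Computer Science': 92.4, 'Mechanical Engineering': 89.1}
```

Fetching the live pages (e.g. with `requests`) and handling each university's quirks is where the real difficulty came from.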
Accomplishments that we're proud of
Over everything else, we're proud that we were able to build and submit what we did within the time we had and the skill set we brought. We might not have a perfectly finished product, but we learned a ton more than we'd initially expected.
What we learned
This whole hackathon was a huge learning experience for our team. We went in with little knowledge of how to create our own machine learning models or how to build the front and back ends of a website, so getting to familiarize ourselves with this new set of tools was a wonderful bonus to the already amazing experience of Ignition Hacks. It was also great to get more clarity on the difference between machine learning and deep learning for future projects.
What's next for UAP - The University Acceptance Predictor
The absolute first thing to do would be combining the AI, the website and the web scraper so that the whole thing works as we originally intended. If we continue developing UAP, we'll also need a much bigger dataset: the one we used during the hackathon got us started, but it only covered a limited number of universities and relevant programs with average admission marks. Getting professional server-side hosting and potentially rebuilding the whole project with different tools (Node.js, etc.) are among the many other changes we'd make going forward.
All in all, thank you for the amazing competition! These last 36 hours wouldn't have been as thrilling without the efforts of all the staff, judges, mentors, sponsors and participants.
- The UAP development team