Inspiration

As our team made plans to go back home for fall break, we started booking tickets for various modes of transportation. In doing so, we came to the sobering realization of the carbon footprint our entire student body was creating as a result. The scope of this problem becomes immense once we consider everyone traveling home for every holiday season.

We observed that holiday travel creates an unnecessarily large carbon footprint, one that could be mitigated by accepting slightly longer travel durations. Hence, we came together to build a platform that helps users create the most ecological travel plan suited to their preferences.

What it does

EcoTravel helps users discover routes to their destination with the smallest possible carbon footprint. Our deterministic ranking algorithm generates a list of bus and train routes, prioritizing low emissions while keeping travel time as short as possible.
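The core idea, trading off emissions against travel time deterministically, can be sketched as a weighted sort. The `Route` fields and the default weight below are illustrative assumptions, not our production code:

```python
from dataclasses import dataclass

@dataclass
class Route:
    mode: str             # "bus" or "train"
    duration_hours: float
    carbon_kg: float      # estimated CO2 per passenger

def rank_routes(routes, carbon_weight=0.7):
    """Deterministic ranking: lower combined score ranks first.

    Both metrics are normalized to [0, 1] so the weight is meaningful,
    then routes are sorted by a carbon-first weighted sum.
    """
    max_c = max(r.carbon_kg for r in routes) or 1.0
    max_d = max(r.duration_hours for r in routes) or 1.0

    def score(r):
        return (carbon_weight * (r.carbon_kg / max_c)
                + (1 - carbon_weight) * (r.duration_hours / max_d))

    return sorted(routes, key=score)
```

With the carbon weight at 0.7, a slightly slower train with far lower emissions outranks a faster, dirtier bus, which is exactly the trade-off we want to surface.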

How we built it

To create a high-quality user experience, we used Velo by Wix, which gave us a stable yet highly interactive interface for users to seamlessly search for the optimal travel plan. Velo communicates with our middle layer, a Google Cloud Platform (GCP) Load Balancer, which routes requests to and from our backend. The backend is composed of the Query Service, the Fetch Service (running on GCP Compute Engine), and our ranker (running on GCP App Engine). The Query Service processes the user's input: the travel date, origin, and destination. Using this data, the Fetch Service scrapes various travel websites to retrieve details of the available tickets. The Load Balancer then communicates with our ranker through a Slug Manager, which pings the ranker to check whether it is still ranking travel options. Once the ranker has scored all possible routes, they are returned to the front-end for the user to review and filter (by price, duration, etc.).
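The Query Service's job is essentially input validation before anything hits the Fetch Service. A minimal sketch of that step, where the raw field names and checks are illustrative assumptions rather than our actual schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class TravelQuery:
    travel_date: date
    origin: str
    destination: str

def parse_query(raw: dict) -> TravelQuery:
    """Validate the user's search input before it reaches the Fetch Service."""
    travel_date = date.fromisoformat(raw["date"])  # e.g. "2025-11-26"
    origin = raw["start"].strip()
    destination = raw["end"].strip()
    if not origin or not destination:
        raise ValueError("origin and destination are required")
    if travel_date < date.today():
        raise ValueError("travel date must not be in the past")
    return TravelQuery(travel_date, origin, destination)
```

Rejecting bad input here keeps the scraper from wasting requests on searches that can never return tickets.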

Challenges we ran into

When we published our front-end on Velo and pointed our domain at it, the DNS changes took an extremely long time to propagate. Our progress was halted for an extended period while we waited to verify the live site.

Furthermore, scraping data from travel sites (Amtrak, Greyhound, etc.) proved difficult, as we had to convert everything we retrieved into a uniform format before processing it. Since none of these sites provide public APIs, we had to reverse engineer their private ones. In many cases, the sites used anti-bot protections that we needed to bypass. For instance, Amtrak employs a technique called “tarpitting”: once Akamai Bot Manager (Amtrak’s defense system) flags a request as coming from a bot, it holds the request open for around 30 seconds before timing out, drastically slowing down our scraper. This is far harder to deal with than a standard 403 Forbidden response, which can simply be retried within seconds.
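One generic defense against tarpitting is to impose our own client-side deadline, so a request being deliberately stalled is abandoned and retried instead of hanging for the full 30 seconds. A sketch of that pattern (the function names and timeouts are illustrative, not our exact scraper code):

```python
import concurrent.futures

def fetch_with_deadline(fetch_fn, timeout_s=5.0, retries=3):
    """Run a blocking fetch with a hard client-side deadline.

    A tarpitted request that would hang for ~30s is abandoned after
    `timeout_s` and retried (e.g. with a fresh session), so a single
    stalled response cannot slow the whole scraper to a crawl.
    """
    for attempt in range(retries):
        pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
        future = pool.submit(fetch_fn)
        try:
            return future.result(timeout=timeout_s)
        except concurrent.futures.TimeoutError:
            continue  # likely tarpitted; abandon and retry
        finally:
            pool.shutdown(wait=False)  # don't block on the stalled thread
    raise TimeoutError(f"gave up after {retries} attempts")
```

The abandoned worker thread still runs until the underlying socket gives up, but the scraper itself moves on immediately.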

To bypass Akamai Bot Manager (on Amtrak) and PerimeterX Bot Defender (on Skyscanner), we had to match every aspect of a real browser request: copying the cipher suites in the TLS ClientHello handshake, matching the order of the HTTP/2 pseudo-headers, and matching Chrome’s HTTP/2 SETTINGS values. This was a challenge, given that most networking libraries are not built to expose these low-level details. Additionally, for Akamai we had to submit “sensor data”, an encrypted payload containing various browser fields. Our method of bypassing PerimeterX was more unusual: we impersonated an internal Skyscanner API. Since the site does not block its own internal APIs, we passed PerimeterX protection without solving any payloads.
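To give a flavor of what “matching the browser” means, here is a sketch of the kind of Chrome-like values that have to be reproduced. The SETTINGS numbers below are representative values from public HTTP/2 fingerprinting research and vary by Chrome version, and the fingerprint string is a simplified, Akamai-style illustration; actually sending them requires a client that exposes these low-level knobs:

```python
# Chrome sends HTTP/2 pseudo-headers in this exact order; many libraries
# use a different order, which fingerprinting systems can detect.
CHROME_PSEUDO_HEADER_ORDER = [":method", ":authority", ":scheme", ":path"]

# Representative Chrome HTTP/2 SETTINGS frame values (version-dependent).
CHROME_H2_SETTINGS = {
    "SETTINGS_HEADER_TABLE_SIZE": 65536,
    "SETTINGS_MAX_CONCURRENT_STREAMS": 1000,
    "SETTINGS_INITIAL_WINDOW_SIZE": 6291456,
    "SETTINGS_MAX_HEADER_LIST_SIZE": 262144,
}

def h2_fingerprint(settings, pseudo_order):
    """Build a simplified, Akamai-style HTTP/2 fingerprint: settings|order."""
    ids = {
        "SETTINGS_HEADER_TABLE_SIZE": 1,
        "SETTINGS_MAX_CONCURRENT_STREAMS": 3,
        "SETTINGS_INITIAL_WINDOW_SIZE": 4,
        "SETTINGS_MAX_HEADER_LIST_SIZE": 6,
    }
    settings_part = ",".join(f"{ids[k]}:{v}" for k, v in settings.items())
    order_part = ",".join(h[1] for h in pseudo_order)  # "m,a,s,p"
    return f"{settings_part}|{order_part}"
```

If any one of these values deviates from what real Chrome sends, the resulting fingerprint no longer matches a known browser and the request gets flagged.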

Accomplishments that we're proud of

Without any prior knowledge of Velo, we were able to create an elegant yet simple user interface and seamlessly integrate it with our back-end hosted on Google Cloud Platform.

We were able to successfully scrape travel sites and bypass their anti-scraping protections. We bypassed Akamai Bot Manager and PerimeterX Bot Defender, which are hundred-million-dollar products devoted solely to preventing scraping of websites.

What we learned

As first-time hackathon participants, we faced a steep learning curve collaborating with other programmers via GitHub. We learned how to cover for each other's weaknesses and how to debug code as a group more efficiently.

From a technical standpoint, we learned how to integrate various APIs, layers, and servers to create a full stack web application.

What's next for EcoTravel

After establishing a user management system, our first step will be to become registered partners with the travel services we aggregate so we can earn commissions. With that in place, we can incentivize users to choose more eco-friendly modes of transportation by awarding them EcoPoints, which can be redeemed for discounts on future bookings made through our platform.

Even though our platform’s website is already mobile-friendly, we aim to build a dedicated app to make the platform more accessible to users on the move.
