Inspiration

Creating a robots.txt file by hand is tedious, especially if the person creating the file does not remember all the endpoints of their website. Automating this process saves a significant amount of time and energy.

What it does

Generates a robots.txt file from the command line, either by crawling the website of interest to discover its endpoints automatically or by taking a list of endpoints as manual user input.
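The two modes can be sketched roughly as follows. This is an illustrative sketch, not the package's actual API: the class and function names (`LinkCollector`, `render_robots_txt`) and the example paths are all hypothetical.

```python
# Hypothetical sketch of the two modes: collect same-site endpoint
# paths from a page, then render them into robots.txt rules.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkCollector(HTMLParser):
    """Collects same-site paths from anchor tags in an HTML page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.paths = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value:
                target = urljoin(self.base_url, value)
                # Keep only links that stay on the same host.
                if urlparse(target).netloc == urlparse(self.base_url).netloc:
                    self.paths.add(urlparse(target).path or "/")


def render_robots_txt(disallowed_paths, user_agent="*"):
    """Render a robots.txt body that disallows the given paths."""
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Disallow: {path}" for path in sorted(disallowed_paths)]
    return "\n".join(lines) + "\n"


# Crawl mode: feed fetched HTML into the collector...
page = '<a href="/admin">Admin</a> <a href="https://other.site/x">ext</a>'
collector = LinkCollector("https://example.com/")
collector.feed(page)

# ...or manual mode: supply paths directly.
manual = {"/private"}
print(render_robots_txt(collector.paths | manual))
```

A real crawler would also fetch pages over HTTP and recurse through discovered links; the sketch only shows the parse-and-render core.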

How we built it

We wrote the project in Python, with each team member taking on a different role to collectively produce the final result.

Challenges we ran into

We had to change the design several times because we realized that some of our initial ideas were beyond the scope of the project.

Accomplishments that we're proud of

We published a Python package to the PyPI test server (TestPyPI), so it can be installed with pip. We also distributed the work well across the team.
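Installing from TestPyPI looks something like the following; the package name `robottxt` is an assumption based on the project name and may differ from what was actually published.

```shell
# Install from the PyPI test server (package name assumed).
pip install --index-url https://test.pypi.org/simple/ robottxt
```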

What we learned

We learned how to work together on a challenging yet motivating problem, how to plan our time in order to finish by a deadline, and how to change plans along the way when certain ideas proved infeasible.

What's next for Robottxt

An improved user interface and full user control over the allow/disallow state of each endpoint.

Built With

Python
