Inspiration
Research groups usually maintain a collection of their members' publications on their website, but creating correctly formatted references and checking for updates periodically is a lot of work, so I wrote a Python program that automates this process.
What it does
After the user enters an author's first and last name in a graphical interface, the program runs a crawler that gathers all of that author's publication pages from the database, parses and re-orders the information to create AGU-style references, and checks whether each publication is already in the pool of included ones. Finally, the program outputs the list of AGU-style references that still need to be included to a .txt file.
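The final step described above, comparing freshly formatted references against the pool of already-included ones and writing anything new to a .txt file, could be sketched like this; the reference strings and file name here are made-up placeholders, not the project's actual data:

```python
def find_new_references(formatted_refs, pool):
    """Return references not yet present in the pool, preserving order."""
    pool_set = set(pool)
    return [ref for ref in formatted_refs if ref not in pool_set]

def write_missing(refs, path="missing_references.txt"):
    """Write one reference per line to a .txt file."""
    with open(path, "w", encoding="utf-8") as f:
        for ref in refs:
            f.write(ref + "\n")

# Illustrative usage with invented references:
pool = ["Smith, J. (2020), Old paper, J. Geophys. Res."]
scraped = [
    "Smith, J. (2020), Old paper, J. Geophys. Res.",
    "Lee, A. (2021), New paper, Geophys. Res. Lett.",
]
new_refs = find_new_references(scraped, pool)
# new_refs == ["Lee, A. (2021), New paper, Geophys. Res. Lett."]
```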
How I built it
I used Scrapy for the crawler and Tkinter for a (small) graphical interface.
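A small Tkinter interface of the kind described might look roughly like this; the widget layout and the `on_submit` callback are my own assumptions, not the project's actual code:

```python
import tkinter as tk

def build_gui(on_submit):
    """Build a window with first/last name entries and a Run button.

    `on_submit` is called with (first_name, last_name) when Run is pressed.
    Call build_gui(...).mainloop() to launch the window.
    """
    root = tk.Tk()
    root.title("Check Publication")

    tk.Label(root, text="First name").grid(row=0, column=0)
    first = tk.Entry(root)
    first.grid(row=0, column=1)

    tk.Label(root, text="Last name").grid(row=1, column=0)
    last = tk.Entry(root)
    last.grid(row=1, column=1)

    tk.Button(
        root,
        text="Run",
        command=lambda: on_submit(first.get(), last.get()),
    ).grid(row=2, column=0, columnspan=2)
    return root
```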
Challenges I ran into
The database doesn't actually follow any reference style, so I had to parse and re-order the information from multiple fields on the website to create AGU-style references.
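Re-ordering scraped fields into an AGU-style reference could be sketched as below; the field names and the example metadata are illustrative assumptions, not the ones the real site uses, and the template is only an approximation of AGU style:

```python
def format_agu(fields):
    """Assemble an approximately AGU-styled reference string from
    separate metadata fields scraped from a publication page."""
    template = "{authors} ({year}), {title}, {journal}, {volume}, {pages}, doi:{doi}."
    return template.format(**fields)

# Illustrative usage with invented metadata:
fields = {
    "authors": "Smith, J., and A. Lee",
    "year": 2021,
    "title": "An example study",
    "journal": "J. Geophys. Res.",
    "volume": "126",
    "pages": "1-10",
    "doi": "10.1029/example",
}
ref = format_agu(fields)
# ref == "Smith, J., and A. Lee (2021), An example study, J. Geophys. Res., 126, 1-10, doi:10.1029/example."
```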
Accomplishments that I'm proud of
I had never created a graphical interface before, so I'm pretty proud that I did it.
What I learned
Python, Scrapy, Tkinter...
What's next for Check Publication
I could add other reference styles as options in the graphical interface, and also include other databases for a more complete check.
Built With
- python
- scrapy