There are multiple experiences that inspired me to make this extension, and I have tried to describe them briefly here.

When I started using APIs, I tried following online tutorials, but most of the APIs those videos used were either no longer available or not free. I have also run into problems while looking for a particular API that suits my purpose. Even at hackathons, I have seen a lot of developers using APIs to build their products, so I believe it would be really helpful to have easy access to the different APIs that are available.

Yesterday, my friend told me about Robinhood's API, which allows developers to build their own trading applications. I also came across the Spotify API, which even allows one to integrate Spotify Analytics. I am sure that there are many such APIs available, and it is important for developers to know about them.

What it does

It is an extension that calls an API to fetch the different APIs associated with the category the user wants. It prompts the user to enter an API category in an input box; the APIs related to that category are scraped and returned by the backend API, and the extension presents them in a JSON file. The user can also choose the file name in the input box that shows up. Sometimes the data takes a long time to load, which leads to a "Please try again" message; running the same command again usually loads the data properly. For example, if the user wants to use Twilio APIs, a simple "Twilio Phone" search provides a number of different APIs by Twilio. The user can also search for other APIs, like AWS APIs, Azure APIs, and many more.
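For example (the field names here are my guess at a reasonable shape, not the extension's exact output), a "Twilio Phone" search might save a JSON file along these lines:

```json
[
  {
    "name": "Twilio Phone",
    "description": "Programmable voice and phone number APIs.",
    "documentation": "https://www.twilio.com/docs"
  }
]
```

Each entry carries the documentation link alongside the name and description, so the developer can jump straight from the file to the API's docs.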

How I built it

I built it by first scraping web data using Python. I then built a REST API using Flask that serves the scraped data, hosted it on Heroku, and called it from the extension. The extension creates a new file and stores the information there, including the link to each API's documentation, which can help a developer learn more about that API.
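The Flask side can be sketched roughly as follows. This is a minimal illustration under my own assumptions: the endpoint name, the data layout, and the hard-coded sample entry are all mine, not the project's actual code (the real project serves data produced by the scraper).

```python
# Minimal sketch of a Flask REST API that serves scraped API listings
# by category. Endpoint path and data shape are illustrative assumptions.
from flask import Flask, jsonify

app = Flask(__name__)

# In the real project this dictionary would be populated by the scraper;
# here it is a hard-coded stand-in with a single sample entry.
SCRAPED_APIS = {
    "twilio": [
        {
            "name": "Twilio Phone",
            "description": "Programmable voice and phone number APIs.",
            "documentation": "https://www.twilio.com/docs",
        }
    ]
}

@app.route("/apis/<category>")
def get_apis(category):
    # Return the scraped entries for the requested category,
    # or an empty list if nothing matches.
    return jsonify(SCRAPED_APIS.get(category.lower(), []))
```

The extension would then GET `/apis/<category>` on the Heroku host and write the JSON response into a file.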

Challenges I ran into

The major challenge was scraping the data: I had to make sure the results were returned quickly and that the most important chunks of data were included. Just presenting each API with its description wouldn't have helped, so I also scraped the URL of each API's documentation, which was nested on different pages; I had to follow those pages to reach the URL. I later ran into problems with the API taking a long time to return data, which I resolved by reducing the amount of manipulation my API code had to do.
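The "follow nested pages to find the documentation URL" step can be sketched like this. To keep it self-contained, the example parses inline HTML strings with the standard-library `html.parser` instead of fetching live pages, and the page structure is entirely my assumption about how such a listing might be laid out.

```python
# Illustrative sketch of following nested pages until an external
# documentation link is found. Uses only the standard library on
# hard-coded HTML; the real scraper fetched live pages.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Stand-in pages: the listing page links to a detail page, which in
# turn links to the actual documentation.
PAGES = {
    "/apis/twilio-phone": '<a href="/apis/twilio-phone/details">Details</a>',
    "/apis/twilio-phone/details": '<a href="https://www.twilio.com/docs">Docs</a>',
}

def find_doc_url(start_page):
    # Follow the first link on each internal page until we reach a URL
    # we don't host ourselves, and treat that as the documentation link.
    url = start_page
    while url in PAGES:
        parser = LinkExtractor()
        parser.feed(PAGES[url])
        url = parser.links[0]
    return url
```

Hopping through intermediate pages like this is what made the initial scrape slow, which is why precomputing the results and serving them from the API paid off.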

What I learned

I learnt how scraping can come in handy in increasing the access people have to the resources available online. On the technical side, I learnt how to make API calls inside a Visual Studio Code extension.

What's Next

I hope to make it easier for developers to integrate APIs into their applications using this extension. Different APIs have different endpoints, but I hope to use this extension to help the user integrate an API of their choice. I also hope to get rid of the data-loading timeout problems soon.
