How to Pull API Data Into Excel

Pulling API data into Excel is quite a bit easier than I expected. The most difficult part is understanding how to build the URL you will use to request the data.

In my case, I have been working with the Sportradar API to analyze Husker football data. The first step was to get an API key that allows me to request data from Sportradar. Once I had that, it was simply a matter of following these steps:

  1. Open a new Excel workbook
  2. Click on the Data tab in Excel
  3. Click From Web
  4. Enter the API URL
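Step 4 is where the URL-building comes in. As a sketch of what goes into that URL, you could assemble it in Python before pasting it into Excel. Note that the endpoint path, season, and parameter names below are illustrative placeholders, not Sportradar's documented values:

```python
from urllib.parse import urlencode

# Illustrative values only -- substitute the endpoint and key from
# your own Sportradar account; real paths differ by sport and plan.
API_KEY = "YOUR_API_KEY"
base = "https://api.example.com/football/games/2023/schedule.json"

# Many APIs accept the key as a query parameter appended to the URL.
url = f"{base}?{urlencode({'api_key': API_KEY})}"
print(url)
```

The resulting string is what you would paste into the From Web dialog in step 4.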

It’s as easy as that. Unless you have run into an error, you should see your data tables listed in Excel.
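Behind the scenes, the data tables Excel lists are just the API's JSON response flattened into rows. A minimal sketch of that flattening, using a made-up schedule payload (the real Sportradar response has a different, much richer shape):

```python
import json

# Hypothetical sample of what a schedule response might look like --
# field names here are invented for illustration.
sample = '''
{"games": [
  {"home": "Nebraska", "away": "Colorado", "scheduled": "2023-09-09"},
  {"home": "Nebraska", "away": "Michigan", "scheduled": "2023-09-16"}
]}
'''

# Flatten the nested JSON into (date, away, home) rows, which is
# roughly the tabular view Excel presents after the import.
data = json.loads(sample)
rows = [(g["scheduled"], g["away"], g["home"]) for g in data["games"]]
for scheduled, away, home in rows:
    print(f"{scheduled}: {away} at {home}")
```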

If you are interested in using the Sportradar API, check out

4 thoughts on “How to Pull API Data Into Excel”

  1. To bring refreshable Access data into Excel, you can create a connection to the Access database and retrieve all of the data from a table or query. For example, you may want to update an Excel summary budget report that you distribute every month so that it contains the current month’s data.

    1. Interesting idea! Will definitely need to try that out. Do you mind expanding on scenarios where this would make most sense? For example, I would assume one would not want to use Excel as a database so Access could really come in handy to back it up. Thanks!

      1. I agree, Excel is probably not the best database solution. In some scenarios where there is a small amount of data it could work perfectly fine as a database, but when you start working with very large data sets, you are going to want to look at Access or a server database like MySQL or some other equivalent.

        So it’s more likely that someone would pull data from an API into Excel in order to do data analysis versus using it as a database. And it does a good job of connecting to a number of sources you might need to use in order to accomplish this.

        I have been working a lot lately on reporting using Excel and APIs, so hope to create a post on that soon.

        Thanks for writing!

  2. Hi Mark, Really appreciate this post. Always great to see R projects that are applicable to SEO. Storing the GSC data in CSVs completely makes sense. Piggy-backing off of what Seldata said above, just checking to see if at any point you might have published instructions for building the Shiny app above and how it connects to what I assume is the BigQuery database that you mentioned above. I know that’s a lot to ask, but thought I’d check in order to get a head start! Thanks! Jeff
