How to Query the Google Search Console API

Posted by briangormanh

If you’ve been an SEO for even a short time, you’re likely familiar with Google Search Console (GSC). It’s a valuable tool for getting information about your website and its performance in organic search. That said, it does have its limitations.

In this article, you’ll learn how to get better-connected data out of Google Search Console as well as increase the size of your exports by 400%.

Google Search Console limitations

While GSC has a number of sections, we’ll be focusing on the “Performance” report, which you can access in two ways from the GSC dashboard.

Once inside the “Performance” report, you can access data for both queries and pages.

This reveals one of the issues with GSC: Query and page data is separated.

In other words, if I want to see the queries a specific page is ranking for, I have to first click “Pages,” select the page, and then click “back” to “Queries.” It’s a very cumbersome experience.

The other (two-part) issue is with exporting:

  • Performance data for queries and pages must be exported separately.
  • Exports are limited to 1,000 rows.

We’ll solve these issues by using the GSC API.

What is the Google Search Console API?

Now we know the GSC user interface does have limitations: Connecting query data with page data is tricky, and exports are limited.

If the GSC UI represents the factory default, the GSC API represents our custom settings. It takes a bit more effort, but gives us more control and opens up more possibilities (at least in the realm of query and page data).

The GSC API is a way for us to connect to the data within our account, make more customized requests, and get more customized output. We can even bypass factory-default settings like the 1,000-row export limit.

Why use it?

Remember how I said earlier that query and page data is separated in the “vanilla” GSC UI? Well, with the API, we can connect query data with the page that query ranks for, so no more clicking back and forth and waiting for things to load.

Additionally, we saw that exports are limited to 1,000 rows. With the API, we can request up to 5,000 rows, an increase of 400%!

So let’s hook in, make our request, and get back a more robust and meaningful data set.

Setup

Log in to the appropriate GSC account on this page, Google’s API Explorer for the Search Console API’s searchanalytics.query method (the sign-in option is in the upper right corner). For instance, if my website is example.com and I can view that Search Console account under admin@email.com, that’s the account I’ll sign in to.

Enter the URL of the appropriate GSC property in the siteUrl field.

Set up your request:

  1. Set startDate. This should be formatted as: YYYY-MM-DD.
  2. Set endDate.
  3. Set dimensions. A dimension can be:
      • query
      • page
      • device
      • and/or country
  4. Set filters (optional). A filter must include:
      • dimension (a dimension can be: query, page, device, or country)
      • operator (an operator can be: contains, notContains, equals, notEquals)
      • expression (an expression can be any value associated with the dimensions)
  5. Set the rowLimit. With the GSC API, you can request up to 5,000 rows!

The API Explorer page from the first setup step makes all of this fairly easy, but it can still be tedious and even confusing. I’ve done the fussing for you and created JSON templates you can edit quickly and easily to get the API response you’d like.

Unfiltered request

The following request will be unfiltered. We’ll set our preferred dates, dimensions, and a row limit, and then make our request.

The order in which you place your dimensions is the order in which they’ll be returned.

The API will return data for desktop, mobile, and tablet, separated out. The numbers you see in the GSC user interface — clicks, for instance — are an aggregate of all three (unless you apply device filtering).

Remember, your dimensions can also include “country” if you’d like.

{
  "startDate": "2019-11-01",
  "endDate": "2020-01-31",
  "dimensions": [
    "query",
    "page",
    "device"
  ],
  "rowLimit": 3000
}
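
This article’s workflow pastes that body into the API Explorer, but the same request can be sent from a script if you prefer. Below is a minimal sketch using the google-api-python-client and google-auth libraries; the credentials file name, scope setup, and site URL are my own placeholder assumptions, not something covered in this post.

# A sketch of sending the unfiltered request outside the API Explorer.
# Assumes google-api-python-client is installed and you've already completed
# an OAuth flow for the Search Console API (webmasters.readonly scope);
# "credentials.json" and the siteUrl below are placeholder values.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

request_body = {
    "startDate": "2019-11-01",
    "endDate": "2020-01-31",
    "dimensions": ["query", "page", "device"],
    "rowLimit": 3000,
}

# siteUrl is your verified property, just like the URL entered in the Explorer.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body=request_body,
).execute()

# Each row carries the dimension values (in request order) plus the metrics.
for row in response.get("rows", []):
    print(row["keys"], row["clicks"], row["impressions"], row["ctr"], row["position"])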

Filtered request

This version of our request will include filters in order to be more specific about what is returned.

Filters are stated as dimension/operator/expression. Here are some examples to show what’s possible:

  • query contains go fish digital
  • page equals https://ift.tt/1WUOXHA
  • device notContains tablet

It looks like you can only apply one filter per dimension, just like in the normal GSC user interface, but if you know differently, let us know in the comments!

{
  "startDate": "2019-11-01",
  "endDate": "2020-01-31",
  "dimensions": [
    "query",
    "page",
    "device"
  ],
  "dimensionFilterGroups": [
    {
      "filters": [
        {
          "dimension": "device",
          "operator": "notContains",
          "expression": "tablet"
        }
      ]
    }
  ],
  "rowLimit": 3000
}

Choose a template, unfiltered or filtered, and fill in your custom values (anything after a colon should be replaced with your own value, unless you like my presets).

Execute the request

So there you have it! Two request templates for you to choose from and edit to your liking. Now it’s time to make the request. Click into the “Request body” field, select all, and paste in your custom JSON.

This is where you could manually set up your request keys and values, but as I stated earlier, this can be tedious and a little confusing, so I’ve done that work for you.

Scroll down and click “Execute.” You may be prompted to sign in here as well.

If everything was entered correctly and the request could be satisfied, the API will return your data. If you get an error, audit your request first, then any other steps and inputs if necessary.

Click into the box in the lower right (this is the response from the API), select all, and copy the information.

Convert from JSON to CSV

Excel or Sheets will be a much better way to work with the data, so let’s convert our JSON output to CSV.

Use an online JSON-to-CSV converter and paste in your JSON output. You can then export a CSV. Update your column headers as desired.
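
If you’d rather not paste your search data into a third-party site, the same flattening can be scripted. Here’s a minimal sketch, assuming the API response was saved to a file named response.json and the request used the query, page, and device dimensions; both file names are placeholders of mine.

# A sketch of converting the API response to CSV locally.
# Assumes the JSON response was saved as response.json and the request
# used "dimensions": ["query", "page", "device"]; adjust the header row
# if you requested different dimensions.
import csv
import json

with open("response.json") as f:
    data = json.load(f)

with open("gsc_data.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["query", "page", "device", "clicks", "impressions", "ctr", "position"])
    for row in data.get("rows", []):
        # "keys" holds the dimension values in the order they were requested.
        writer.writerow(row["keys"] + [row["clicks"], row["impressions"], row["ctr"], row["position"]])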

Query your own data

Most SEOs are pretty comfortable in Excel, so you can now query your request output any way you’d like.

One of the most common tasks performed is looking for data associated with a specific set of pages. This is done by adding a sheet with your page set and using VLOOKUP to indicate a match.

Having the API output in a spreadsheet also allows for the most common Excel actions, like sorting, filtering, and chart creation.
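
If you’d rather do this kind of matching outside of Excel, here’s a rough pandas equivalent of the VLOOKUP step, assuming the gsc_data.csv file from the sketch above and a hypothetical page_set.csv with a single “page” column listing the pages you care about.

# A rough pandas stand-in for the VLOOKUP approach described above.
# gsc_data.csv is the file written by the previous sketch; page_set.csv is a
# hypothetical one-column file ("page") listing the pages you want to match.
import pandas as pd

gsc = pd.read_csv("gsc_data.csv")
pages = pd.read_csv("page_set.csv")

# An inner join keeps only the rows whose page appears in your page set.
matched = gsc.merge(pages, on="page", how="inner")
matched.to_csv("matched_pages.csv", index=False)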

Get more out of Google Search Console

GSC offers important data for SEOs, and the GSC API output offers not only more of that data, but in a format that is far less cumbersome and more cohesive.

Today, we overcame two obstacles we often face in the standard GSC user interface: the query/page connection and limited exports. My hope is that utilizing the Google Search Console API will take your analyses and insights to the next level.

While my JSON templates will cover the most common scenarios in terms of what you’ll be interested in requesting, Google does offer documentation that covers a bit more ground if you’re interested.

Do you have another way of using the GSC API? Is there another API you commonly use as an SEO? Let me know in the comments!


