Why am I excited about the new Search Console URL Inspection API?
I have found myself praising Google Search Console repeatedly over the past year.
It has always been an extremely useful resource, and I encourage every webmaster to make it one of their regular stops, but I want to take my hat off once again, having discovered the new Search Console URL Inspection API. While this doesn't change the experience of using Search Console itself, it's a welcome boost for any tool that draws on Search Console data.
For non-geeks, 'API' stands for Application Programming Interface. You can think of an API as a window onto third-party data, or as a bridge that lets applications communicate with each other. The new URL Inspection API allows software applications to pull Search Console data for specific URLs, which in many cases will greatly improve the value of the data those applications display.
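To make the "bridge" idea concrete, here is a minimal sketch of what a single request to the URL Inspection API looks like. The endpoint and field names follow Google's published Search Console API reference at the time of writing; a real call would also need an OAuth 2.0 access token with Search Console scope, which is omitted here.

```python
import json

# Endpoint for the URL Inspection API (part of the Search Console API v1).
INSPECT_ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def build_inspect_request(page_url: str, property_url: str) -> dict:
    """Build the JSON body for inspecting one URL.

    page_url is the page to inspect; property_url is the Search Console
    property it belongs to (e.g. "https://example.com/").
    """
    return {"inspectionUrl": page_url, "siteUrl": property_url}

# A real call would POST this body to INSPECT_ENDPOINT with an
# "Authorization: Bearer <token>" header, then read coverage, mobile
# usability, and rich results details from the JSON response.
body = build_inspect_request("https://example.com/blog/post", "https://example.com/")
print(json.dumps(body))
```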
One such application is the Screaming Frog crawler, one of my favourites. The latest version (16.6, a big enough release to earn its own codename, "Romeo") adds an extremely useful drop-down list to the Search Console tab:
What does this drop-down menu allow you to do?
Each of these options allows you to dig deeper into specific issues that may be affecting your performance in organic search.
- Clicks above 0: A simple one to start with – this filter will show all URLs that have had at least one click in Google’s SERPs.
- No GSC data: This filter will show all URLs found by the Screaming Frog crawler that do not return any Search Console data. The most likely reason is that these URLs haven't had a single impression in Google's SERPs, so it's worth checking whether any important pages need a bit more love.
- Non-indexable with GSC data: This filter will show URLs that Screaming Frog classes as non-indexable (for example, noindexed, canonicalised, or returning a non-200 status code) but that still return Search Console data, suggesting Google is surfacing pages you didn't expect it to.
- URL is not on Google: This will display any URL the crawler has found that is not indexed by Google. This matters because a page that isn't indexed cannot appear in search results. It can, of course, be intentional, and this list *should* include all URLs carrying a "noindex" tag, but it's a handy report for identifying URLs that aren't appearing in Google's search results.
- Indexable URL not indexed: This is similar to the above, but excludes any URLs that shouldn't be indexed, so it's arguably the more useful report for identifying problematic pages. Essentially, it displays every URL the Screaming Frog crawler finds that should be indexed by Google but currently isn't.
- URL is on Google, but has issues: This will highlight URLs that are indexed and can appear in Google's search results, but that carry warnings. These include mobile usability, AMP, or rich results issues that may impact performance.
- User-declared canonical not selected: This will show cases where Google has ignored the canonical tag you declared and chosen to index a different URL. Remember that canonical tags are hints rather than directives, and Google can get it wrong, so it's worth seeing where your hints are being ignored.
- Page is not mobile friendly: Pretty self-explanatory – this will show all URLs flagged as having usability issues on mobile devices.
- AMP URL is invalid: If you use Accelerated Mobile Pages (AMP), you'll know that Google can be quite strict with validation. This report will quickly highlight where issues are occurring and which URLs won't appear in search results.
- Rich result invalid: This will show all URLs with errors in their rich result enhancements. These errors will most likely prevent rich results from displaying in Google's search results, so they should be investigated.
The first three filters in this list aren’t new, but the others were all added through the URL Inspection API.
I generally prefer to stay in the Screaming Frog app and export specific issues, but there's also a handy report in the 'Bulk Export' menu that summarises all of the data for individual URLs:
This will export a spreadsheet covering every URL, showing any errors or warnings that need investigating.
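If you'd rather slice that exported spreadsheet with a script than scroll through it, a few lines of Python will do. The sketch below uses a simplified in-memory CSV with hypothetical column names ("Address", "Indexability", "Summary") – check the headers of your actual export, as they can vary between Screaming Frog versions.

```python
import csv
import io

# Hypothetical excerpt of a Screaming Frog bulk export; real column
# names and values may differ, so adjust to match your file.
SAMPLE_EXPORT = """\
Address,Indexability,Summary
https://example.com/,Indexable,URL is on Google
https://example.com/old-page,Indexable,URL is not on Google
https://example.com/private,Non-Indexable,URL is not on Google
"""

def indexable_but_not_indexed(csv_text: str) -> list[str]:
    """Return URLs that Screaming Frog marks as indexable but Google
    reports as not being on Google – the pages most worth investigating."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        row["Address"]
        for row in reader
        if row["Indexability"] == "Indexable"
        and row["Summary"] == "URL is not on Google"
    ]

print(indexable_but_not_indexed(SAMPLE_EXPORT))
```

The same filter can, of course, be applied directly to the exported file by swapping the sample string for `open("export.csv")`.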
What’s the catch?
There's really no catch – it's genuinely useful to have this data available to pull into other apps – but there is a data quota.
You are limited to 2,000 requests per day, and there is a throttle limiting API requests to 600 per minute. The per-minute limit shouldn't be a problem, but the 2,000 daily cap is an obvious issue for large sites: if you have more than two thousand URLs in Search Console, you'll need to spread your requests over several days. An irritation, but not the end of the world, and I'm really happy to see this data available in Screaming Frog.
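Under those limits, working out how many days a full crawl will take is just ceiling division over the daily quota; a quick sketch:

```python
# Quota figures as described above; per property, per day/minute.
DAILY_LIMIT = 2000
PER_MINUTE_LIMIT = 600

def days_needed(total_urls: int, daily_limit: int = DAILY_LIMIT) -> int:
    """How many days of requests a site of total_urls needs under the quota.

    Uses the negative-floor-division trick for ceiling division.
    """
    return -(-total_urls // daily_limit)

# A 5,500-URL site would need three days of requests.
print(days_needed(5500))
```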
Thank you, Google, for another great improvement to the powerful Search Console. If you want to keep an eye out for new features, it's worth following https://developers.google.com/search/blog.