As such, I thought it would be beneficial to offer an easy-to-digest overview of what we know about Google’s plans to migrate features over from the old version, and to look at the evolution of the new version of Search Console.
I’ll aim to keep this post updated with the latest releases and developments, but don’t hesitate to drop me a message if I’ve missed anything.
What’s Been/Being Migrated to the New Search Console?
The leading source of confusion about the new version of Search Console has been around how Google is handling the transition. Not all of the features are being moved directly into the new version.
Instead, Google is taking the opportunity to rethink its tools and reports so that they provide greater benefit to users and align more closely with the evolving challenges that SEO professionals now face.
After a few dribs and drabs of information on what is and isn’t being migrated to the new version, Google has helpfully published a post clarifying its plans.
Here’s a rundown of Google’s plans as they stand.
Shiny New Reports & Tools
The Performance report has been around for almost a year now and was the first feature to be launched in the beta version of Search Console.
Along with the attractive new UI, the most significant advantage of the Performance report, compared to the old Search Analytics report, is the extension of the date range to cover 16 months of data instead of three months.
The Performance report provides click, impression, CTR, and average position metrics at the page, query, country, and device level.
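For anyone reconciling these numbers, it helps to see how the aggregate metrics relate: CTR is clicks divided by impressions, and (as I understand Google’s documentation) average position is weighted by impressions when rows are combined. Here’s a minimal sketch with invented sample figures:

```python
# Sketch: combining per-day rows into the aggregate metrics the
# Performance report shows. All numbers below are invented.

rows = [
    # (clicks, impressions, average_position_that_day)
    (10, 200, 4.0),
    (30, 300, 2.0),
]

clicks = sum(r[0] for r in rows)
impressions = sum(r[1] for r in rows)

ctr = clicks / impressions
# Weight each day's position by its impressions before averaging.
avg_position = sum(r[1] * r[2] for r in rows) / impressions

print(f"CTR: {ctr:.1%}")              # 8.0%
print(f"Avg position: {avg_position}")  # 2.8
```

This is why a high-volume, poorly-ranking day can drag the combined average position down more than you might expect.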
How could it be improved?
While the Performance report has been a welcome addition, it isn’t without its faults. I expect we’ll see further updates once the migration to the new version has been completed.
Here are some ways it could be improved:
Trending: 16 months of organic search data is great and all, but it’s hard to establish trends when you can’t aggregate data beyond the default daily data points. For instance, we’ve doubled organic clicks to the DeepCrawl site over the past 12 months, but the graph in the Performance report for that period doesn’t really tell that story. Being able to toggle between daily, weekly, and monthly aggregation would be a handy addition.
Date comparison: Setting up custom comparison date ranges is fiddly, as you have to use four individual calendar selectors. This could be improved by using a single calendar selector, similar to Google Analytics.
Filtering: The Performance report is a goldmine for organic insights. However, it is repetitive and time-consuming to have to set the necessary filters every time you re-enter Search Console. I’d really like to see some extra customization brought to Search Console; it would be helpful if you were able to save filtered reports and pin them to the Overview page. There is a lot of value in pulling Search Console data into dashboarding platforms like Data Studio, but not everyone is going to have time for this, so Google needs to make it as easy as possible to extract insights.
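In the meantime, if you export the daily data yourself, the weekly/monthly rollup described above is easy to do outside the UI. A minimal sketch, using invented dates and click counts:

```python
# Sketch: rolling daily click counts up into weekly or monthly totals,
# the aggregation toggle the Performance report currently lacks.
# The dates and click counts below are invented for illustration.
from collections import defaultdict
from datetime import date

daily = [
    (date(2018, 11, 26), 120),  # Monday, ISO week 48
    (date(2018, 11, 27), 130),
    (date(2018, 12, 3), 150),   # next ISO week, new month
]

def aggregate(daily, period="weekly"):
    """Sum daily clicks into ISO-week or calendar-month buckets."""
    totals = defaultdict(int)
    for day, clicks in daily:
        if period == "weekly":
            iso = day.isocalendar()  # (year, week number, weekday)
            key = (iso[0], iso[1])
        else:  # monthly
            key = (day.year, day.month)
        totals[key] += clicks
    return dict(totals)

print(aggregate(daily, "weekly"))   # {(2018, 48): 250, (2018, 49): 150}
print(aggregate(daily, "monthly"))  # {(2018, 11): 250, (2018, 12): 150}
```

With weekly or monthly buckets, a year-on-year doubling of clicks shows up as a clear trend line rather than being lost in daily noise.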
Index Coverage Report
Launched alongside the Performance report at the start of the new version of Search Console, the Index Coverage report is a welcome evolution of the Crawl Errors and Index Status reports.
The new Index Coverage report provides site-level insights into crawling and indexing issues.
This report will flag issues with pages submitted in a sitemap (e.g. submitted pages that are 404s, blocked by robots.txt, marked as noindex, etc.) as well as providing trends on indexed (valid) pages.
What does it replace?
Index Status and Crawl Errors.
How could it be improved?
The Index Coverage report is a useful addition to the Search Console suite, as it allows you to get a top-level view of crawling and indexing issues directly from Google. However, it isn’t without its limitations:
Limited rows: When you dig into the issues flagged by Index Coverage, you’re limited to 1,000 rows of URLs. While 1,000 pages will provide more than enough examples to diagnose and fix issues on most sites, it can make things tricky for larger websites with millions of pages.
Vague explanations: Some of Google’s explanations of the issues it flags can be frustratingly vague. For instance, the “Crawled – currently not indexed” issue is described as follows: “The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling.” It’s hard to understand what could be causing these pages not to be indexed; possibly it’s a quality issue. In these cases, it is often worth using the URL Inspection tool on the flagged pages to conduct a more in-depth investigation.
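The 1,000-row cap in the Index Coverage report has no official workaround, but where Google does offer programmatic access with paged results (the Search Analytics API, for example, accepts a start-row offset), the pattern for getting past a per-request limit is a simple pagination loop. Here’s a sketch in which fetch_page is a hypothetical stand-in for the real API call:

```python
# Sketch: paging past a fixed per-request row limit. fetch_page is a
# hypothetical stand-in for a real API call that caps each response.

PAGE_SIZE = 1000

def fetch_page(start_row, row_limit):
    # Stand-in: pretend the full dataset is 2,500 URLs.
    all_urls = [f"https://example.com/page-{i}" for i in range(2500)]
    return all_urls[start_row:start_row + row_limit]

def fetch_all():
    """Keep requesting pages until a short (or empty) page comes back."""
    results, start = [], 0
    while True:
        page = fetch_page(start, PAGE_SIZE)
        results.extend(page)
        if len(page) < PAGE_SIZE:
            break
        start += PAGE_SIZE
    return results

urls = fetch_all()
print(len(urls))  # 2500
```

Until Google lifts the limit or exposes the coverage data via an API, larger sites will need to lean on their own crawl data to fill the gap.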
URL Inspection Tool
The URL Inspection tool is a more recent addition to Search Console and complements the Index Coverage report nicely, offering granular URL-level analysis.
The tool is massively useful because it makes Google’s crawling and indexing behavior less of a black box, and gives SEO professionals a better basis for debugging.