
Google now shows the severity impact of resources you’re blocking it from accessing

Google’s ability to crawl and index JavaScript used to be quite limited. This meant Google couldn’t fully crawl websites that utilised JavaScript for things such as navigation drop-down menus. In addition, Google’s limitations with JavaScript were exploited by some spammers to hide content in JavaScript.

These issues, plus the increasing adoption of technologies like AJAX, meant that Google worked hard to progressively improve its ability to crawl JavaScript – something it advised it was fully able to do several years back.

However, despite this, an issue that has continued to cause Google problems is that many websites exclude Google from accessing various web resources, such as directories containing JavaScript. This limits Google’s ability to crawl and render websites as they are seen by end users. Not only does this constrain Google’s ability to interpret websites correctly, it also leaves the door open for spammers to show Google one thing and users another.
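To illustrate, this kind of blocking is usually done through robots.txt rules. The directory paths below are purely hypothetical, but rules like these are what stop Googlebot from fetching the scripts and stylesheets it needs to render a page the way users see it:

    # Hypothetical example – rules like these block crawlers from rendering resources
    User-agent: *
    Disallow: /js/
    Disallow: /css/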

Over the last year or so, Google has made a point of notifying website owners via Google Webmaster Tools / Google Search Console when they are blocking Google’s access to resources, and has warned that doing so can detrimentally impact rankings.

Seems this hasn’t been enough, so in the last week Google has started to provide more detailed warnings. Google has added a new feature to the Fetch & Render tool in Google Search Console that shows the severity of blocked content – that is, the importance of each resource being blocked, such as an image, CSS file, or JavaScript file.

The severity column in the Fetch & Render Tool shows High, Medium and Low warnings for each resource that is blocked, as the example below shows.

[Screenshot: Google Search Console Fetch & Render severity check]

Why should you care?

The reason you should care is that Google has made it clear that blocking its access to resources on your website can detrimentally impact how well your site ranks. And lower rankings mean less traffic and fewer conversions – so there’s a pretty compelling reason to comply.

Ideally you shouldn’t be blocking any resources from Googlebot at all, but if you must, limit blocked resources to those with a Low severity status.
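If your robots.txt contains rules like the ones shown earlier, the simplest fix is usually to remove them, or to explicitly allow Googlebot to fetch your script and stylesheet directories. The paths here are again placeholders for illustration:

    # Hypothetical example – restoring Googlebot's access to rendering resources
    User-agent: Googlebot
    Allow: /js/
    Allow: /css/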


Click here for more search marketing news.


About the Author Mark Sceats

Mark is a Partner and Senior Consultant at SureFire which he founded back in 2002. Prior to establishing SureFire he worked for KPMG Consulting. Today Mark heads up SEO, embracing the challenges that can come with complex website implementations. Outside of work, his interests beyond his family are running, snowsports, diving and fishing (badly).
