
Tom Stanford, SEO Strategist

On Monday morning, Google's UK Webmaster Trends Analyst, Pierre Far, revealed that the technical guidelines found in Webmaster Tools have been updated to state explicitly that blocking CSS, JavaScript or HTML files can have a negative impact on your organic visibility.

Google have already made clear the implications of disallowing the search engine access to certain site elements:

‘To help Google fully understand your site’s contents, allow all of your site’s assets, such as CSS and JavaScript files, to be crawled. Disallowing crawling of JavaScript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.’

Pierre also commented: ‘By blocking crawling of CSS and JS, you're actively harming the indexing of your pages. It's the easiest SEO you can do today.’

In short, Google are saying that you should allow them access to all JavaScript, CSS and HTML page elements, as these may contain content that helps your site’s organic visibility. Without full access, you could be harming how well your pages are rendered, indexed and ranked.

We strongly recommend checking that the HTML, CSS and JavaScript files on your site are crawlable by search engines and that nothing is blocked in your site’s robots.txt file, unless it’s absolutely necessary.
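To make this concrete, here is a minimal sketch of the kind of robots.txt rules worth reviewing. The directory names are hypothetical examples, not rules taken from any real site:

    # Rules like these stop Googlebot fetching the assets it needs
    # to render your pages (directory names are hypothetical):
    User-agent: *
    Disallow: /js/
    Disallow: /css/

    # Removing those Disallow lines, or explicitly allowing the
    # directories, lets Google render pages as users see them:
    User-agent: *
    Allow: /js/
    Allow: /css/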

To check how Google ‘views’ your site, you can use the Fetch as Google tool in Webmaster Tools. This lets you check a page for crawl errors and render it as Google sees it once the page has been fetched.
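If you would rather check your rules programmatically, Python's standard urllib.robotparser module can test whether a given user agent is allowed to fetch a URL under your robots.txt. A minimal sketch, assuming placeholder example.com URLs you would swap for your own:

    # Minimal sketch: test whether Googlebot may crawl CSS/JS assets.
    # The example.com URLs are placeholders for your own site.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")
    parser.read()  # fetch and parse the live robots.txt

    for asset in ("https://www.example.com/css/main.css",
                  "https://www.example.com/js/app.js"):
        allowed = parser.can_fetch("Googlebot", asset)
        print(asset, "crawlable" if allowed else "BLOCKED")

Any asset reported as BLOCKED is one Google cannot use when rendering the page.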

This update appears to be another sign that Google’s ability to crawl and render websites is becoming increasingly intelligent and sophisticated. If you want to find out more, just get in touch.