On Monday 27th October Google published revised guidelines asking site owners to modify their robots.txt configuration so that Google can access the additional files used to display a website to visitors (CSS and JavaScript files), which previously weren’t needed for Google to correctly index a site. Google’s new advice is:
For optimal rendering and indexing, our new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use. This provides you optimal rendering and indexing for your site. Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.
Previously, Google’s site indexing systems more closely resembled older text-only web browsers, focusing on the text content of your website rather than the website as a whole, and that was reflected in their guidelines to webmasters and designers. Now that Google’s indexing is based on rendering the full page, changes may be needed to your website to allow Google access to files it previously didn’t need.
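In practice, this usually means checking your robots.txt file for rules that block stylesheets or scripts. As a minimal sketch (the directory names here are illustrative only, not from any particular site), a compliant robots.txt might look like this:

```
# Example robots.txt — directory names are illustrative only

User-agent: *
# Remove rules like these, which block the files Google needs to render pages:
# Disallow: /css/
# Disallow: /js/

# Or explicitly allow stylesheets and scripts wherever they live
# (Googlebot supports * and $ wildcards in robots.txt rules):
Allow: /*.css$
Allow: /*.js$
```

Your own robots.txt may look quite different; the key point is simply that no Disallow rule should prevent Googlebot from fetching the CSS and JavaScript files your pages use.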
Google added that websites should also eliminate unnecessary downloads, optimize the serving of CSS and JavaScript files, and make sure their servers can handle the additional load of serving these files to Googlebot.
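One common way to optimise the serving of these files, sketched here on the assumption that your site runs on an Apache server with the mod_deflate and mod_expires modules enabled, is to compress them and set caching headers in your .htaccess file:

```
# .htaccess sketch — assumes Apache with mod_deflate and mod_expires enabled

<IfModule mod_deflate.c>
  # Compress CSS and JavaScript before sending them to visitors and crawlers
  AddOutputFilterByType DEFLATE text/css application/javascript
</IfModule>

<IfModule mod_expires.c>
  # Let browsers and crawlers cache these files for a week,
  # reducing repeat downloads and server load
  ExpiresActive On
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Other server setups (for example nginx) achieve the same thing with their own configuration directives, so treat this as one possible approach rather than a universal recipe.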
If you’re an O’Brien Media customer we’ll be sending out an email shortly advising you of the changes. In the meantime (or if you’re not an O’Brien Media customer) please feel free to get in touch if you would like us to assist with making your website compliant with Google’s revised guidelines.