In the run-up to I/O 2019, Google announced that Googlebot is now running the latest version of Chrome (74) and will continue to be kept up to date, and there was much rejoicing.

For years, Googlebot and, more importantly, its Web Rendering Service (WRS) component have been running a legacy rendering engine on par with Chrome 41, with JavaScript support limited to ES5 (the version of JavaScript released in 2009).

This meant that many of the features in newer versions of JavaScript (ES6, released in 2015, and beyond) needed to be transpiled down to ES5, and any browser API added to Chrome after version 41 had to be handled via polyfilling.

Transpiling transforms modern JavaScript into equivalent older JavaScript at build time (one line of code might become fifty), while polyfilling emulates functionality that is missing from a target browser at runtime. Stack Overflow covers the distinction well.
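To make the distinction concrete, here is a minimal sketch (the variable names are invented for illustration): the first mapping is ES2015 source, the second is the equivalent ES5 a transpiler such as Babel would emit, and the final block is a simplified polyfill that patches in `Array.prototype.includes`, which a Chrome 41-era engine lacks.

```javascript
var pages = [{ title: 'Home' }, { title: 'About' }];

// ES2015+ source: an arrow function, which a Chrome 41-era engine
// cannot parse at all.
const modern = pages.map(page => page.title);

// Roughly what a transpiler emits: equivalent ES5 that any old
// engine understands.
var transpiled = pages.map(function (page) {
  return page.title;
});

// A polyfill, by contrast, patches a missing API in at runtime.
// Simplified sketch of an Array.prototype.includes polyfill:
if (!Array.prototype.includes) {
  Array.prototype.includes = function (search) {
    return this.indexOf(search) !== -1;
  };
}
```

Note that transpiling alone cannot help with `includes`: the syntax is valid ES5, but the method simply does not exist in the old engine, which is why both techniques were needed.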

(See the announcement on the Googlebot blog.)

Now it should all just work for Googlebot.


Anyone who has been involved in SEO for a while will tell you to take announcements like this with a pinch of salt. That isn't having a go at Google: updating their rendering to Chrome 74 would have been a major undertaking worthy of praise, and they themselves would recommend you still test, test, and test again.

So, make sure you test your rendering using… erm, we have a problem! At the time of writing, none of Google's testing tools have been updated to Chrome 74, but Zoe Clifford from the Google rendering team mentioned during the recent I/O talk (at timestamp 6:22) that updates are on their way shortly, so keep your eyes peeled for news.

Also, it is worth noting that the Googlebot user-agent string hasn't changed. This is deliberate, to give people time to update any systems that rely on a hardcoded user-agent for things like dynamic rendering. This was also covered in the I/O talk, at timestamp 5:52.
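As a concrete example of why the unchanged user-agent matters, dynamic rendering setups commonly branch on a hardcoded check along these lines (a hypothetical sketch; the function name is invented). Because the string still contains "Googlebot", checks like this keep working after the rendering update:

```javascript
// Hypothetical sketch of the kind of user-agent test a dynamic
// rendering setup might hardcode to decide whether to serve a
// prerendered HTML snapshot instead of the JavaScript app.
function isGooglebot(userAgent) {
  return /Googlebot/i.test(userAgent || '');
}

// The current Googlebot user-agent string still matches:
isGooglebot('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'); // true
```

If Google had bumped the user-agent alongside the Chrome update, setups like this would have silently started serving the wrong version to Googlebot, which is exactly why the change was deferred.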

Can I stop transpiling / polyfilling for Googlebot?

Here is the real kicker: why were you transpiling / polyfilling in the first place? If you were doing it solely for Googlebot, then in theory you should be able to stop (after you test, test, and test again).

If, however, you are doing it for users with older browsers, nothing changes: you still need to transpile and polyfill for them.

Can I ditch my prerender / dynamic rendering?

Here, it is important to understand why you are using it in the first place. If it is solely because Googlebot couldn't render your site correctly, then by all means change things after testing.

If, however, it was because you don't trust Google to render your JavaScript website fully, correctly, or in a timely fashion, or because you care about Bing, Facebook, etc., then you might be better off keeping your existing solution.

This update to Googlebot's rendering capabilities does not change the fact that rendering is expensive and is therefore carried out as a second (often significantly delayed) wave of indexing.

Zoe mentions this at timestamp 18:24.

Isomorphic / Universal JavaScript

Prerendering and dynamic rendering are workarounds for the various SEO issues JavaScript websites suffer from. There is another, possibly better, way: a combination of server-side rendering (SSR) and hydration, often called isomorphic or universal JavaScript.
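As a rough, framework-free sketch of the idea (all names here are invented for illustration): the server renders the full HTML from application state, so crawlers and users get content immediately, and the client-side bundle then hydrates that existing markup, attaching event listeners instead of re-rendering from scratch (`ReactDOM.hydrate` in React, for example).

```javascript
// Minimal SSR sketch: a "component" is just a function from state to HTML.
function renderApp(state) {
  var items = state.products
    .map(function (p) { return '<li>' + p.name + '</li>'; })
    .join('');
  return '<ul id="products">' + items + '</ul>';
}

// Server side: send fully rendered HTML, embedding the state the
// client bundle will need in order to hydrate.
function renderPage(state) {
  return '<!doctype html><html><body>' +
    renderApp(state) +
    '<script>window.__STATE__ = ' + JSON.stringify(state) + '</script>' +
    '</body></html>';
}

// Client side (conceptually): the bundle reads window.__STATE__ and
// attaches behaviour to the markup that is already in the DOM,
// rather than building it again from nothing.
```

Because the content is in the initial HTML, it does not depend on the second wave of rendering at all, while users with JavaScript still get a fully interactive app.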

Zoe and Martin cover this very well, in simple form, at timestamp 7:25.

The video.

Here is the full video, which is well worth a watch.