How the evergreen Googlebot has solved all JavaScript issues (oh no, wait…)

Google’s announcement of an ‘evergreen’ Googlebot this week sent ripples through the SEO industry, as Google will now use an up-to-date version of Chrome for its Web Rendering Service (currently Chrome 74, and it will update alongside the browser). Many surmised this could bring to an end the crawling and indexation issues caused by the SEO’s greatest fear: a JavaScript-based website.
While support for ES6 JavaScript and IntersectionObserver for lazy-loading (plus a thousand other features, according to Google) is great news for developers and SEOs, as it removes the need to use polyfills specifically for Googlebot, we can’t help feeling that this announcement should have been accompanied by a number of caveats.
Firstly, the Mobile-Friendly Test and the URL Inspection tool in Google Search Console haven’t yet been updated to Chrome 74, so it is not possible to see how your site is being rendered by the new Googlebot (Googlebot’s user-agent string also hasn’t yet been changed to reflect the switch to Chrome 74). So if you’re trying to detect whether the upgrade has improved Google’s ability to render your site, you’ll need to wait a while.
Secondly, the fact that the Web Rendering Service can now render JavaScript like a modern browser does not mean that the SEO considerations for JavaScript-based websites can be ignored.
SEO considerations for JavaScript sites
There are a number of issues that can make JavaScript-based sites a crawling or indexation challenge for Google, including:
- Not using anchor tags that specify a correct href attribute. Google cannot follow JavaScript-based click events (see the slide from Martin Splitt); a sketch of a crawlable link follows this list
- Content required for the page to rank not being present on load. Content that is only loaded via user interaction is not indexable by Google
- Pagination functionality powered by JavaScript that isn’t crawlable, particularly infinite scroll
- Core meta data (title tag and meta description) not included in the source HTML. Including it in the source ensures these primary signals of the page’s content are indexable when the page is first crawled, rather than relying on the second pass by the Web Rendering Service (more on this later)
- Social meta data not incorporated in the source HTML. Twitter’s and Facebook’s crawlers do not render JavaScript, so relying on client-side rendered social meta data will limit your presence on social channels (a sketch covering both core and social meta data follows below)
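To make the first point concrete, here is a minimal TypeScript sketch of a crawlable link in a client-side rendered site versus a click-event-only one. The URLs and function names are purely illustrative.

```ts
// Crawlable: a real <a> element with a real URL in its href attribute.
// Googlebot can follow the href, while users still get client-side
// navigation because the click handler intercepts the event.
function crawlableLink(href: string, label: string): HTMLAnchorElement {
  const link = document.createElement("a");
  link.href = href; // crawlable URL in the href attribute
  link.textContent = label;
  link.addEventListener("click", (event) => {
    event.preventDefault();
    history.pushState(null, "", href); // client-side navigation for users
    // ...trigger your router / view update here
  });
  return link;
}

// Not crawlable: no anchor, no href - only a JavaScript click event,
// which Googlebot will not follow.
function nonCrawlableLink(label: string, onClick: () => void): HTMLSpanElement {
  const el = document.createElement("span");
  el.textContent = label;
  el.addEventListener("click", onClick);
  return el;
}

document.body.appendChild(crawlableLink("/products/blue-widget", "Blue widget"));
```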
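And for the last two points, a sketch of a server-built HTML shell that puts the title, meta description and social meta data directly in the source HTML rather than injecting them with client-side JavaScript. The helper and field names are illustrative.

```ts
// Illustrative sketch: build the HTML shell on the server so that the title,
// meta description and social meta data are present in the source HTML.
interface PageMeta {
  title: string;
  description: string;
  url: string;
  image: string;
}

function htmlShell(meta: PageMeta, body: string): string {
  return `<!doctype html>
<html>
  <head>
    <title>${meta.title}</title>
    <meta name="description" content="${meta.description}">
    <!-- Social meta data: Facebook and Twitter crawlers do not execute JavaScript -->
    <meta property="og:title" content="${meta.title}">
    <meta property="og:description" content="${meta.description}">
    <meta property="og:url" content="${meta.url}">
    <meta property="og:image" content="${meta.image}">
    <meta name="twitter:card" content="summary_large_image">
  </head>
  <body>${body}</body>
</html>`;
}
```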
Dynamic Rendering is still required
In addition to these key development requirements, the upgrade to Googlebot does not mean the end of Dynamic Rendering. As a quick overview, Dynamic Rendering is the delivery of server-side rendered HTML snapshots to search engine crawlers via user-agent detection.
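As a rough sketch of what that looks like in practice, assuming a Node/Express server: the bot list is illustrative and renderToHtml is a hypothetical stand-in for Puppeteer, Rendertron or a prerendering service.

```ts
import express from "express";

// Hypothetical prerender helper - in practice this would call Puppeteer,
// Rendertron or a prerendering service and return a full HTML snapshot.
async function renderToHtml(url: string): Promise<string> {
  return `<!doctype html><html><body><!-- rendered snapshot of ${url} --></body></html>`;
}

// User agents that should receive the pre-rendered snapshot (illustrative list).
const BOT_AGENTS = /googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit/i;

const app = express();

app.get("*", async (req, res, next) => {
  const userAgent = req.headers["user-agent"] ?? "";
  if (BOT_AGENTS.test(userAgent)) {
    // Crawlers get the server-side rendered snapshot, so the content is
    // indexable on the first crawl without waiting for the rendering queue.
    res.status(200).send(await renderToHtml(req.originalUrl));
  } else {
    // Regular users get the normal client-side rendered application.
    next();
  }
});

app.listen(3000);
```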
The primary reason Dynamic Rendering is still needed is the delay between Google’s initial crawl of your website and the secondary crawl conducted by the Web Rendering Service. On the first pass Google parses your HTML, indexes any relevant content and identifies any JavaScript elements that will need recrawling with the Web Rendering Service. The second crawl, where your JavaScript is executed by the Web Rendering Service, can happen anywhere from a day to a couple of weeks after the initial crawl (so says John Mueller).
The potential issue this raises for JavaScript-based sites is that if you have a frequently changing product line-up (think car dealership) or often run time-sensitive offers on site, they may not be fully indexed until after the product has been sold or the offer has finished.
For this use case Dynamic Rendering is still required, as the HTML snapshots ensure your content is fully indexed on Google’s first crawl (Google states this in its Dynamic Rendering recommendations). To help with this, Google announced this week that it has introduced a new Codelab for Dynamic Rendering, a step one assumes it wouldn’t have taken if the functionality were no longer required.
It’s also important to reiterate that this change is only relevant when Google is crawling your site. There are other search engines! Bing still recommends Dynamic Rendering, as its rendering service is not as advanced as a modern browser. And if you have a presence in markets where Google is not the predominant search engine (Yandex and Baidu don’t render JavaScript), you’ll still need Dynamic Rendering.
To wrap up…
The new evergreen Googlebot will certainly help with the development of JavaScript-based sites, as there will no longer be a need to use polyfills specifically for search engine crawlers, although you may still need them for your users (check which browsers they are using!).
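For example, a small sketch of loading a polyfill only for users whose browsers need it; the script path is a placeholder for whichever polyfill bundle you host.

```ts
// Load the IntersectionObserver polyfill only in browsers that lack it,
// instead of shipping it to everyone (or serving it specifically to Googlebot).
if (!("IntersectionObserver" in window)) {
  const script = document.createElement("script");
  script.src = "/polyfills/intersection-observer.js"; // placeholder path
  document.head.appendChild(script);
}
```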
However, as outlined above, there are still a number of SEO requirements for JavaScript-based sites that are not affected by the update to Googlebot. It therefore remains important to check that your site is being rendered correctly by Google (once the tools are updated).
Dynamic Rendering is here to stay: it is still a requirement for search engines other than Google, and it is necessary for sites where content is updated frequently.
Finally, rendering JavaScript carries a significant performance overhead for Google, and even they do not have unlimited processing power. So if there is a delay in rendering the critical content of your page, it could still lead to indexation issues. If you want to make sure your content is fully indexed, it’s probably best not to rely on a third party, even if that third party is Google!