How can JavaScript affect SEO?

When Googlebot indexes a website, the JavaScript used on the site can noticeably slow the process down.

Indexing can be delayed by anywhere from a few days to a few weeks. A few experiments were carried out to compare how HTML-based and JS-based sites are indexed, with the following results:

  • HTML-based site – Googlebot indexed all pages at all levels of the site.
  • JS-based site – in most cases the robot did not even reach the second level.

You need fast indexing, but your site's pages are built with heavy JavaScript files. What is the solution?

There are three options that can speed up the indexing process:

  1. Provide Googlebot with a pre-rendered HTML document
  2. Isomorphic (universal) JavaScript
  3. Server-side rendering (SSR)

1. Provide a pre-rendered HTML document for Googlebot

We give the robot a pre-rendered HTML document by setting up detection of its visits (checking the User-Agent header). When the robot requests a page of your site, it simply receives an HTML copy of that page containing no JS code. These copies are served only to bots; ordinary users still get the regular versions of the pages with all the JS features. This method allows all pages of the site to be indexed quickly.
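As an illustration only, here is a minimal sketch of such a setup, assuming a Node.js/Express server; the bot pattern and file paths are placeholders, and a real setup would generate the HTML snapshots separately (for example, with a headless browser).

```javascript
const express = require('express');
const path = require('path');

const app = express();

// Illustrative list of crawlers that should receive the pre-rendered copy.
const BOT_PATTERN = /googlebot|bingbot|yandexbot/i;

app.get('*', (req, res) => {
  const userAgent = req.headers['user-agent'] || '';

  if (BOT_PATTERN.test(userAgent)) {
    // Bots get a static HTML snapshot of the page, with no client-side JS.
    res.sendFile(path.join(__dirname, 'prerendered', 'index.html'));
  } else {
    // Ordinary users get the normal JS-powered application shell.
    res.sendFile(path.join(__dirname, 'public', 'app.html'));
  }
});

app.listen(3000);
```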

At the same time, you can inspect the HTML generated for Googlebot, along with any JS exceptions, in Google Search Console.

2. Isomorphic or Universal JavaScript

When this approach is applied, both Googlebot and the user receive all the necessary data on the first page load; the JS scripts that load afterwards then work with this pre-loaded data. This option is good for both users and search engines. What do you need to do? You can learn the JS essentials and implement it yourself, or hire dedicated developers and save your time.
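A simplified sketch of the idea, assuming a Node.js back end; the global name `__INITIAL_DATA__` is a common convention rather than a requirement, and the product list is made up for the example.

```javascript
// --- server side: render the markup and embed the data used to build it ---
function renderPage(products) {
  const items = products.map((p) => `<li>${p.name}</li>`).join('');
  return `<!DOCTYPE html>
<html>
  <body>
    <ul id="products">${items}</ul>
    <script>window.__INITIAL_DATA__ = ${JSON.stringify(products)};</script>
    <script src="/client.js"></script>
  </body>
</html>`;
}

// --- client side (client.js): reuse the embedded data instead of refetching ---
const products = window.__INITIAL_DATA__ || [];
document.getElementById('products').addEventListener('click', () => {
  console.log(`The server rendered ${products.length} products`);
});
```

The crawler sees complete markup in the very first response, while the browser picks up the same data and only attaches behaviour on top of it.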

3. Server-Side Rendering (SSR)

With server-side rendering (SSR), both the robot and the user get fast page-by-page navigation through the site. Code that runs on the server should avoid functions that directly manipulate the DOM (Document Object Model); when interaction with the browser's DOM is really needed, it is better to go through an abstraction such as Angular's Renderer2.
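The same precaution can be sketched in a framework-agnostic way: shared code checks that a real DOM exists before touching it, so the server render does not crash on `window` or `document`. This is a generic pattern, not Angular-specific.

```javascript
// True only in a real browser; on the server there is no window/document.
const isBrowser =
  typeof window !== 'undefined' && typeof document !== 'undefined';

function trackScrollDepth() {
  if (!isBrowser) {
    // Server render: skip browser-only behaviour instead of crashing.
    return;
  }
  window.addEventListener('scroll', () => {
    console.log('scroll depth:', window.scrollY + window.innerHeight);
  });
}

trackScrollDepth();
```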

For rendering dynamic content, you can follow Google's dynamic rendering approach with tools such as Puppeteer and Rendertron. As a result, the search robot receives the final result in the form of a fully rendered page, with the JS already executed.
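A minimal dynamic-rendering sketch with Puppeteer is shown below; in practice, Rendertron wraps this kind of logic behind an HTTP service, and the URL here is only an example.

```javascript
const puppeteer = require('puppeteer');

async function renderForBot(url) {
  const browser = await puppeteer.launch({ headless: true });
  try {
    const page = await browser.newPage();
    // Wait until network activity settles so the client-side JS has run.
    await page.goto(url, { waitUntil: 'networkidle0' });
    // The fully rendered HTML, ready to be returned to the crawler.
    return await page.content();
  } finally {
    await browser.close();
  }
}

renderForBot('https://example.com/').then((html) => console.log(html.length));
```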

Server-side rendering is recommended for websites:

  • with frequently updated content
  • with heavy JS code
  • with blocks of external resources (YouTube videos, social share counters, online chats)

But SSR has a number of drawbacks:

  • when the user's Internet connection is slow, pages load more slowly
  • loading speed also depends on the location of the server and on the number of users working with the application at the same time

Moving rendering from the back end to the front end (client-side rendering) is even less productive from an SEO point of view, because the robot loads a page with incomplete content, part of which is still locked inside JavaScript.

The robot crawls and renders pages without saving state; the following are not supported:

  • service workers (scripts that the browser runs in the background, separately from the page)
  • local storage (data stored between user sessions)
  • cookies and the Cache API

What does this mean? Googlebot renders the site's pages without preserving personal preferences or user settings.

It is also worth noting that Googlebot no longer crawls URLs with a hash fragment (the part of a link after the # sign). An example of this kind of link is site.by/#backlinks.

What about images?

  • Google does not index images that are referenced only from CSS
  • If the site lazy-loads images, wrap each image in a noscript tag so that Googlebot can still discover it (see the example after this list)
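A hedged example of that noscript fallback (class names and paths are illustrative): the visible image is loaded lazily by a script, while the copy inside noscript gives crawlers a plain, immediately discoverable img tag.

```html
<img class="lazy" data-src="/images/product.jpg" alt="Product photo">
<noscript>
  <img src="/images/product.jpg" alt="Product photo">
</noscript>
```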
Summary

The choice of the most appropriate option is up to you. Consider the site's specifics and the tasks you want the UX to handle: each option has its pros and cons. If SEO comes first, rendering the app on the server side helps you avoid the so-called empty-pages problem.

It will also accelerate indexing of the site by the robot. If you choose isomorphic (universal) JavaScript, you will make the site's pages more user-friendly, which likewise leads to faster indexing, improved SEO metrics, and better page load times.
