
JavaScript & SEO: A Detailed Analysis

by Vishwajeet Kumar

JavaScript and SEO are closely intertwined. Thanks to the flexibility JavaScript offers, websites built exclusively in HTML and CSS are becoming a thing of the past. But while JavaScript gives users a more dynamic experience, it presents developers with a minefield that is increasingly difficult to ignore when it comes to making sure their sites are properly read by Googlebot.


How does JavaScript work?

For those who are not very familiar with this language, we explain a little:

Along with HTML and CSS, JavaScript is one of the three main web development languages. HTML is mainly used to create static websites (the code is displayed as it is and does not change according to the user’s interaction), whereas JavaScript makes a website dynamic. A programmer can use JavaScript to change the values and properties of a site’s HTML so that the page responds to the user’s actions, for example clicking a button or opening a drop-down menu. The JavaScript code works in conjunction with the HTML.
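As a minimal sketch of this interaction (all names here are illustrative), a click handler can rewrite an element’s content. Since this snippet is not running in a browser, the DOM element is simulated with a plain object:

```javascript
// A handler that rewrites an element's text when the user clicks.
// In a browser you would wire it up with something like:
//   document.querySelector("#menu").addEventListener("click", handler);
function makeClickHandler(element) {
  return function handleClick() {
    element.textContent = "Menu expanded";
  };
}

// Simulate the DOM element with a plain object to show the effect.
const menu = { textContent: "Menu" };
const handler = makeClickHandler(menu);
handler(); // as if the user had clicked
```

After the simulated click, `menu.textContent` has changed from "Menu" to "Menu expanded" — exactly the kind of mutation JavaScript performs on real HTML elements.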

Execution of the client-side and server-side

There are two important concepts that should be mentioned when talking about JavaScript, and it is essential to understand the difference between them: client-side and server-side execution.

Traditionally, as with static HTML pages, the code is executed on the server (server-side execution). When Googlebot visits such a website, it receives all the content ready to display; the browser only needs to download the HTML and CSS files and render them.

On the other hand, JavaScript is usually downloaded to and runs on the user’s machine (client-side execution). This means that Googlebot initially receives a site with little or no content, and JavaScript then builds the DOM (Document Object Model) that is used to load the content. This happens every time the page is visited.
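The difference can be sketched like this (the markup and the render function are illustrative, not a real framework):

```javascript
// What the server sends for a client-rendered page: an empty shell
// with no visible content in the initial HTML.
const initialHtml = '<div id="app"></div>';

// A sketch of what client-side JavaScript then does: it fills the
// container, producing the DOM the visitor actually sees.
function renderClientSide(html, content) {
  return html.replace(
    '<div id="app"></div>',
    '<div id="app">' + content + '</div>'
  );
}

const renderedHtml = renderClientSide(initialHtml, '<h1>Welcome</h1>');
// A crawler that does not execute JavaScript only ever sees initialHtml.
```

If Googlebot fails to run the script, `initialHtml` is all it has to work with — an empty page.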

Obviously, if Googlebot is not able to download and execute your site’s JavaScript properly, it will not be able to see the content you want it to read. And this is exactly where most JavaScript SEO problems occur.

How to verify that the JavaScript is executed correctly

Getting Googlebot to interpret your website correctly requires focusing on two things: the content and the links. If Googlebot does not understand the links, it will be practically impossible for it to discover your pages. And if the content is not rendered properly, Googlebot will not be able to see it.

Here are the steps to follow:

1. The “site:” command

First of all, the “site:” command will show you how many pages of your website are indexed by Google. If many of them are missing, it can mean that there is a problem with how your internal links are rendered.

In addition, you may want to check if the content loaded in the JavaScript of your site is already indexed in Google.

To do this, find a line of text that is not present in your initial HTML code and is only loaded once JavaScript executes. Then search for that line of text in Google together with the “site:” operator (for example, site:yoursite.com “your line of text”).

It is important to clarify that this will not work with the “cache:” command, since the versions of your site stored in Google’s cache are the original static HTML, not the fully rendered code.

2. Chrome 41

In August 2017, Google updated its search documentation and announced that it was using Chrome 41 for rendering. This was a radical change for SEO, because from that moment on you could verify how Google downloads and renders your site instead of guessing and hoping for the best.

Now you only need to download Chrome 41 to verify how a website or page is executed and seen by Googlebot.

3. Chrome DevTools

It is possible that certain parts of your JavaScript code are programmed to run only after the user performs an action (click, scroll, etc.). Remember that Googlebot is not a user: it will not click or scroll to reveal content, which means it loses access to any information that only appears after an action.

The simplest and fastest way to verify that your JavaScript content loads without requiring user actions is Chrome DevTools:

  • Open your site from Chrome
  • Open the “Elements” tab in DevTools
  • Verify how your site loads by viewing the DOM the browser has built, and make sure all the crucial content and navigation is present there.

We recommend doing this from Chrome 41; that way you can be sure you are seeing the site as Googlebot does.

You can also repeat the check in your current version of Chrome and compare what is displayed.

4. Google Search Console

Another tool that gives us an idea of how Google downloads and reads our website is the “Fetch and Render” function in Google Search Console.

First, copy and paste the URL of your page. Then choose the “Fetch and Render” option and wait a bit. This will let you verify whether Googlebot can download and read your website and follow its links.

You can also use Google’s Mobile-Friendly Test tool, which will likewise show you any JavaScript and loading errors on your site.

5. Analysis of your server’s log

Another way to verify how Googlebot crawls your site is by analyzing your server’s logs. By taking a close look at them, you can check which URLs were visited by Googlebot and which sections were not.

There are many things you can analyze in the server logs. For example, you can check whether Googlebot visits your old articles; if it does not, it could mean there is a problem with the links, which in turn could mean a problem with your JavaScript.

You can also check whether Googlebot reaches all the pages of your site; if it does not, it could likewise mean there is a problem with how the content is rendered.

Your server’s logs will not show how Googlebot sees your website and its pages. You can only check whether your site was visited and which response codes were sent. You will need deeper analysis to determine whether the problem really lies in your JavaScript.

In addition, your server’s logs will let you verify whether Googlebot requested your crucial JavaScript files or ignored them completely.
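As a rough sketch of such a check (the log format, IP addresses, and URLs below are made-up examples; real logs vary, and serious analysis should also verify the Googlebot user agent via reverse DNS):

```javascript
// Sample access-log lines in a simplified combined-log style.
const logLines = [
  '66.249.66.1 - - [10/May/2018] "GET /old-article HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
  '66.249.66.1 - - [10/May/2018] "GET /app.js HTTP/1.1" 404 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
  '203.0.113.7 - - [10/May/2018] "GET /old-article HTTP/1.1" 200 "Mozilla/5.0"',
];

// Keep only Googlebot requests and extract the URL and status code.
function googlebotRequests(lines) {
  return lines
    .filter((line) => line.includes('Googlebot'))
    .map((line) => {
      const match = line.match(/"GET (\S+) HTTP[^"]*" (\d{3})/);
      return { url: match[1], status: Number(match[2]) };
    });
}

const requests = googlebotRequests(logLines);
// e.g. a 404 on /app.js would mean Googlebot asked for a crucial
// JavaScript file and did not get it.
```

A report like this quickly surfaces old articles Googlebot never visits and script files that return errors.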

Possible problems with the execution of your website

Even if your site renders correctly in Search Console’s “Fetch and Render”, that does not mean you can rest easy. There are still other problems you need to pay attention to.

Let’s start with one of the biggest problems you will have to solve: waiting times.

Waiting times

Although Google does not specify exact waiting times, it is commonly said that Googlebot will not wait more than about 5 seconds for a script. It is important to remember that “Fetch and Render” is much more lenient than the regular Googlebot, so you will have to go a step further to make sure your scripts can be executed in less than 5 seconds.

Browser limitations

As mentioned earlier, Google uses a fairly old version of its browser to render websites: Chrome 41, released three years earlier. And since JavaScript technology continues to evolve by leaps and bounds, some new features that work in recent versions of Chrome may not be supported in Chrome 41.

Therefore, the best solution is to download Chrome 41 (the exact version Google uses to render sites) and familiarize yourself with it. Check the console log to see where errors occur and ask your developers to correct them.

Content that requires user interaction to run

I know this was mentioned before, but it is worth repeating: Googlebot does not act like a user. It does not click buttons, does not expand “read more” sections, does not fill out forms; it just reads and moves on.

This means that all the content you want Google to read must be loaded into the DOM immediately, not after an action is taken. This is particularly important for content behind “read more” links and in menus.
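A hedged sketch of the two patterns (the markup is illustrative only):

```javascript
// SEO-friendly "read more": the full text already exists in the DOM
// and is merely hidden with CSS, so a crawler that never clicks
// still receives it.
const friendlyMarkup =
  '<p>Intro paragraph.</p><div class="more" hidden>Full article text</div>';

// Risky pattern: the container ships empty and is only filled by a
// click handler, so a non-clicking crawler never sees the text.
const riskyMarkup = '<p>Intro paragraph.</p><div class="more"></div>';
function onReadMoreClick(markup) {
  return markup.replace(
    '<div class="more"></div>',
    '<div class="more">Full article text</div>'
  );
}
```

In the friendly version the text is present before any interaction; in the risky version it only exists after `onReadMoreClick` runs, which Googlebot will never trigger.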

What can I do to help Googlebot run websites better?

Rendering by Googlebot is only one side of the process. There are many things developers can do to make it easier, making the content you want Googlebot to read a little more obvious.

Avoid OnClick links

Search engines do not treat onclick=“window.location=…” navigation as ordinary links, which means that in most cases they will not follow it. And they almost certainly will not count it as an internal linking signal.

It is crucial that links are present in the DOM before any click. You can check this by opening DevTools in Chrome 41 and confirming that the important links already exist without any action by the user.
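As a rough illustration of why this matters (the markup and the extraction function are simplified assumptions, not how Googlebot actually parses pages):

```javascript
// Navigation driven by an onclick handler. There is no href here,
// so there is nothing for a crawler to follow.
const onclickNav =
  '<span onclick="window.location=\'/products\'">Products</span>';

// A plain anchor: a real link that exists in the DOM before any click.
const realLink = '<a href="/products">Products</a>';

// A crude stand-in for link discovery: only href attributes count;
// onclick handlers are effectively invisible.
function extractHrefs(html) {
  return [...html.matchAll(/href="([^"]+)"/g)].map((m) => m[1]);
}
```

Running `extractHrefs` on the onclick version yields nothing, while the plain anchor yields a followable URL.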

Unique URLs for unique pieces of content

Each piece of your content must live at its own URL for search engines to index it. This is why it is important to remember that if you change your content dynamically without changing the URL, you are preventing search engines from accessing it.

Avoid # in URLs

The fragment identifier (#) is not supported by Googlebot and is ignored. So, instead of using the URL structure example.com/#url, stick to the clean URL format example.com/url.
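Node’s built-in URL class (a WHATWG URL implementation) illustrates why: the fragment is a client-side artifact and is never part of the path a server, or a crawler requesting the page, sees.

```javascript
// The fragment URL and the clean URL resolve to different paths.
const fragmentUrl = new URL('https://example.com/#url');
const cleanUrl = new URL('https://example.com/url');

console.log(fragmentUrl.pathname); // "/"
console.log(fragmentUrl.hash);     // "#url"
console.log(cleanUrl.pathname);    // "/url"
```

From the server’s point of view, example.com/#url is just a request for “/”; only example.com/url identifies a distinct, indexable page.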

Avoid JavaScript errors

HTML is very forgiving, but JavaScript is not.

If your website has errors in its JavaScript code, the scripts simply will not run, which can prevent your content from showing up at all. A single error in the code can cause a domino effect, resulting in many additional errors.

To review the code and keep your JavaScript free of errors, you can once again use Chrome DevTools: check the Console tab to see which errors occur and which lines of JavaScript code cause them.
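A small sketch of the domino effect (the function names are hypothetical):

```javascript
// One bad call stops the whole script, so anything that would have
// run afterwards never executes.
const executed = [];
try {
  executed.push('header rendered');
  nonexistentFunction();             // ReferenceError thrown here
  executed.push('content rendered'); // never reached
} catch (err) {
  // A real page usually has no try/catch around the whole bundle;
  // the error simply halts the script mid-way.
}
```

Only "header rendered" makes it into `executed` — the content-rendering step after the error never runs, which is exactly how a single typo can blank out a page for Googlebot.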

Do not block JavaScript in robots.txt

Blocking JavaScript files is a very old practice, but it still happens often, sometimes even by default in some CMSs. Even when done with the aim of optimizing crawl budget, blocking JavaScript files (and CSS style sheets) is considered a terrible practice. But do not take my word for it; this is what Google says on the subject:

“We recommend making sure that Googlebot can access all the resources used that contribute significantly to the visible content of your site or its design …”

So, do not do things like this:
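As an illustrative example (the paths are hypothetical), a robots.txt like the following would hide your scripts and styles from Googlebot:

```
User-agent: Googlebot
Disallow: /js/
Disallow: /css/
```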

Prerendering

When you discover that Google has a problem executing the JavaScript code on your website, you can use prerendering.

Prerendering serves an HTML snapshot of your website. This means that Googlebot does not receive the JavaScript code but pure HTML, while users visiting the site get the full version, enriched with JavaScript.

The most popular solution is to use an external prerendering service such as prerender.io, which is compatible with the most important JavaScript frameworks.

Using this solution is quite simple: you just need to add a middleware or snippet to your server.

In conclusion:

The topic of SEO in JavaScript is one of the most dynamic topics in the world of SEO and definitely deserves your attention, especially because it develops quickly. The problems described here are only a small part of everything that can be corrected and researched in order to make sure that Googlebot is running your website properly.
