JavaScript and SEO: problems and how to tackle them


Many SEO specialists complain that JavaScript (JS) often complicates their work. Nevertheless, JS is the most popular programming language in the world, used on roughly 95% of all websites. Over the last few years, JS has grown a rich ecosystem of libraries and plugins, making it a very convenient language for developers to work with. It is also remarkably universal: almost any device and browser can execute JS code.

These days, 99% of all commercial websites are created with the use of JS. Despite all the advantages this language provides, SEO experts and marketing managers have to deal with a number of issues when promoting websites and web apps. How can you cope with these problems?

  1. Rendering issues

The most widespread causes are low loading speed and an unresponsive interface, both of which worsen user experience. How a visitor perceives a website matters for obvious reasons. However, not every developer realizes that Google also tries to see a web page through users' eyes. That's why user experience has such a strong impact on how the robot ranks a website. Besides, rendering issues often show up in the bounce rate. When it is high, visitors tend to leave your website quickly, and they often do so precisely because of the issues mentioned above.

There are two types of rendering: server-side and client-side. Problems with rendering are rare when it is executed on the server side. If the content is visualized on the client side, you should make sure that Google can crawl and process your web pages. Note that even if a server transfers all the content to the client side, search bots wait until the page's full version is rendered.
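The difference matters because of what the crawler receives on the first request. A minimal sketch (the page content and function names here are hypothetical): with client-side rendering the initial HTML is an empty shell, while server-side rendering ships the finished markup.

```javascript
// Hypothetical illustration of what a crawler receives on the first request.

// Client-side rendering: the server sends an empty shell; the content
// only appears after the browser downloads and executes the JS bundle.
function clientSideResponse() {
  return '<div id="app"></div><script src="bundle.js"></script>';
}

// Server-side rendering: the HTML already contains the content,
// so a crawler can index it without executing any JavaScript.
function serverSideResponse(article) {
  return `<div id="app"><h1>${article.title}</h1><p>${article.body}</p></div>`;
}

const article = { title: 'JavaScript and SEO', body: 'How to tackle rendering issues.' };
console.log(serverSideResponse(article).includes('JavaScript and SEO')); // true
console.log(clientSideResponse().includes('JavaScript and SEO'));        // false
```

In the first case the bot must wait for the script to run before any indexable text exists; in the second, the text is there immediately.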

  2. Executable code size

It influences loading speed and rendering. Users who access your website from older devices may face performance problems such as delayed rendering and various lags. This happens when the script is too big and, as a result, executes slowly.

If it takes more than 5 seconds for your script to load, the crawler regards it as bad timing. Google may decide that its robots slow down your website and limit the crawl frequency. To avoid this, make your website lighter: ensure a fast server response and make sure the server can handle heavy load. You can also use optimized UI components from a JS framework or library.

A widespread mistake is placing all the components in one file, which slows the page down considerably. When users access a website, they don't need the admin panel to load. Decide which elements should be loaded for a regular user and put them into a separate file.
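One way to sketch this split (all names here are hypothetical): decide per visitor which bundles the page actually needs, and keep the admin code in its own file, loaded on demand rather than shipped to everyone.

```javascript
// Hypothetical sketch of splitting bundles by user role,
// instead of shipping one huge file to every visitor.
function chunksFor(user) {
  const chunks = ['main.js']; // every visitor needs the core bundle
  if (user.isAdmin) {
    // Admin code lives in a separate file and is only listed (and, in a
    // real app, lazily loaded, e.g. via a dynamic import()) for admins.
    chunks.push('admin.js');
  }
  return chunks;
}

console.log(chunksFor({ isAdmin: false })); // [ 'main.js' ]
console.log(chunksFor({ isAdmin: true }));  // [ 'main.js', 'admin.js' ]
```

Bundlers such as webpack or Rollup perform this kind of code splitting automatically when they encounter a dynamic `import()`.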

  3. A small mistake can lead to big problems

A single typo in JavaScript code can prevent Google from indexing the page at all. So, to reap the benefits of software development with JS, programmers should do their best to avoid such mistakes.
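The reason one typo is so damaging is that an uncaught runtime error stops everything after it in the same script. A minimal sketch (the function names are invented for illustration):

```javascript
// A single typo (calling a function that doesn't exist) throws a
// ReferenceError, and everything after it in the script is skipped.
function renderPage() {
  const rendered = [];
  try {
    rendered.push('header');
    nonExistentFunction();         // typo: this identifier is undefined
    rendered.push('main content'); // never reached
    rendered.push('footer');       // never reached
  } catch (e) {
    // Without this try/catch the whole script would simply stop here,
    // and Googlebot would see only a partially rendered page.
  }
  return rendered;
}

console.log(renderPage()); // [ 'header' ]
```

An HTML page with a malformed tag usually still displays; a JS page with one bad identifier can render nothing below the error.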

  4. Slow scanning of websites with JS

It takes much more time for Googlebot to scan a website built with JS than a plain HTML website. The reason is simple: HTML loads straight away, whereas JS must first be parsed and compiled, and only after that executed. All these steps extend the loading time of the page. And considering that up to 50% of users access websites from mobile devices, every second counts.

You can improve the situation by choosing an appropriate rendering strategy. Consider server-side rendering or pre-rendering, where JavaScript is executed on the server. This improves performance for both users and crawlers: the browser starts rendering HTML straight away, and the page loads almost immediately.
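Pre-rendering can be sketched as a build step that runs the page templates once and saves static HTML per route. The routes and templates below are hypothetical:

```javascript
// Hypothetical build-time pre-rendering: execute the JS templates once,
// store the resulting HTML, and serve it as static pages to crawlers.
const routes = {
  '/':      () => '<h1>Home</h1>',
  '/about': () => '<h1>About us</h1>',
};

function prerender(routes) {
  const pages = {};
  for (const [path, render] of Object.entries(routes)) {
    pages[path] = `<!doctype html><body>${render()}</body>`;
  }
  return pages; // in a real build these would be written to .html files
}

const pages = prerender(routes);
console.log(pages['/about'].includes('About us')); // true
```

Tools such as Next.js or Gatsby apply this idea at scale; the point is that the crawler receives finished HTML instead of a script to execute.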

  5. Scanning and indexing technologies lag behind

Over the last few years, JS and the Chrome browser have seen a lot of changes. However, Google uses Chrome 41 for rendering and indexing web pages. The problem is that this version was released back in 2015 (!) and has many technical restrictions compared to newer releases. Therefore, it makes sense to check how your website renders in this particular browser version. If you detect problems, have a look at the Chrome 41 console to find the likely cause.

To make a website with state-of-the-art capabilities which are correctly rendered in the browser, you can:

  • Use polyfills. These are pieces of code that replicate functionality some web browsers don't support natively.
  • Create a simplified version of your website. This is so-called graceful degradation: the page keeps working and rendering even though the browser doesn't support some of its functionality.
  • Transpile the code to the ES5 standard. This converts pieces of code that Googlebot can't process into code it can crawl.
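As an example of the first option: `Array.prototype.includes` shipped after Chrome 41, so that browser lacks it. A guarded polyfill fills the gap. This is a simplified sketch; a production polyfill (e.g. from core-js) also handles `NaN` and the `fromIndex` argument:

```javascript
// Simplified polyfill sketch: only define Array.prototype.includes
// if the browser doesn't already provide it (e.g. Chrome 41).
if (!Array.prototype.includes) {
  Object.defineProperty(Array.prototype, 'includes', {
    value: function (searchElement) {
      // indexOf-based fallback; real polyfills also cover NaN and fromIndex
      return this.indexOf(searchElement) !== -1;
    },
  });
}

console.log(['seo', 'js'].includes('js'));   // true
console.log(['seo', 'js'].includes('html')); // false
```

The feature check at the top is what makes the polyfill safe: modern browsers keep their native implementation, and only old ones get the fallback.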

Bottom line

SEO specialists often call JS their main bugbear. But as you have seen, many of the issues that impede the promotion of websites built with this language can be solved. At the same time, JS makes web pages more dynamic, improving the user experience.