Read our blog on WHAT IS HTTP2: IMPACT ON SEO AND HOW TO IMPLEMENT IT, which explains how to use this protocol to reduce page load times if you have a JS-based website.
Crawling and rendering JS is difficult for search engines because of its complexity. With traditional HTML, Googlebot downloads the HTML file, extracts the links from the source code, downloads the CSS files, and sends the downloaded resources to the indexer, which then indexes the page.
However, when it comes to a JS-based website, the whole process becomes more complex. Google first downloads the HTML, CSS, and JS files, then uses its Web Rendering Service to parse and execute the JS code. The Web Rendering Service fetches data from APIs and databases, after which the indexer can index the webpage.
- Initially, the browser or Googlebot executes a GET request for the HTML code and other assets of the website.
- The JS builds a Document Object Model (DOM), which represents the relationships between the elements of the website. The browser then renders this information and makes it visible to the user.
- Once this is complete, the website is fully loaded.
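The gap between the two crawl paths can be sketched in plain JavaScript. The snippet below is a minimal, hypothetical illustration — the markup, the `render()` helper, and the API payload are all invented for the example — of why the first crawl wave sees an empty container on a JS-based site, while the rendered second wave sees the content:

```javascript
// Hypothetical sketch: what the crawler sees before vs. after JS rendering.
// The markup, render() helper, and API payload are invented for illustration.
const rawHtml = '<div id="app"></div><script src="app.js"></script>';

// First wave: only the raw HTML is fetched. The #app container is empty,
// so the product list is invisible to the indexer at this stage.
console.log(rawHtml.includes('Product list')); // false

// Second wave: the Web Rendering Service executes the JS, which pulls data
// from an API and injects it into the DOM. We mimic that with a string replace.
function render(html, apiData) {
  return html.replace('<div id="app"></div>', `<div id="app">${apiData}</div>`);
}

const renderedHtml = render(rawHtml, '<ul><li>Product list</li></ul>');
console.log(renderedHtml.includes('Product list')); // true
```

This is why server-side rendering or pre-rendering is often recommended for JS-heavy sites: it moves the content into the first wave, so indexing does not have to wait for the rendering queue.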
JS events that affect SEO
- Load event: The browser fires this event when a page has finished loading. Search engines generally do not crawl elements that are loaded after the load event, so they do not index them. It is therefore important not to load critical content after the load event.
- Blocked resources: Rendering a webpage depends on a number of internal and external resources; make sure they are not blocked. Check your robots.txt file to confirm that no critical resources are blocked for Googlebot. Checking robots.txt is one of the 8 STEPS TO A BETTER TECHNICAL SEO AUDIT that improve SEO.