When search engine bots find a link to your website, they start visiting the pages it leads to. The crawlers, or bots, crawl each page and index its contents, which then appear when a user performs a search query. Crawl errors happen when search engine crawlers cannot reach a page on your website. This post covers common SEO crawl errors and how you can fix them.

Checking for crawl errors is one of the first steps of a technical SEO audit: you need to make sure that search engine crawlers can reach every page on your site. Crawl errors come up when search engine bots cannot successfully crawl a web page. These errors fall into two groups, site errors and URL errors. There is also a group of special crawl errors, which we call specific URL errors, because they do not fit into either of those categories.

Site errors

Site errors mean that the entire website cannot be crawled. In particular, using JavaScript for website design always carries a risk to crawlability and indexability; don’t forget to read our blog on JAVASCRIPT AND SEO: BEST PRACTICES FOR JS BASED SITES for eliminating that risk. A site error affects the website in its entirety, so it needs immediate attention. The common types of site errors are DNS errors, server errors, and robots failure.

You can find and fix crawl errors in Google Search Console, whose dashboard shows site errors that have occurred in the last 90 days. Make sure to check for errors every 2 to 3 months. Here is how you can resolve each type of Googlebot crawl issue:

  • DNS error: This means that the search engine crawlers cannot connect to the domain because they cannot communicate with the server. To fix this, first view how Googlebot crawls your page. If Google cannot fetch and render the page properly, check with your DNS provider to see what the issue is. While it is being fixed, ensure that your server returns proper 404 or 500 error codes.
  • Server errors: Server errors generally occur when search engine crawlers try to visit a website but it takes too long to load. They mean that the bots can connect to your website but cannot load the page. This may be caused by flaws in the code, or there may be too many visitors for the server to handle. To fix this, first use ‘Fetch as Google’ to see whether the bots can access your website, then diagnose which kind of server error it is. The types of server errors are timeout, truncated headers, connection refused, connection reset, truncated response, connect failed, no response, etc.
  • Robots failure: Robots failure happens when Googlebot cannot reach the robots.txt file, so it will not be able to index new pages of your website. To fix this, inspect the configuration of the robots.txt file: check your “Disallow: /” rules and confirm which pages you actually want to keep search engines from crawling. A small diagnostic sketch follows this list.
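To illustrate the three site-level checks above, here is a minimal diagnostic sketch in Python. It assumes Python 3 with the third-party requests package installed, and www.example.com is a placeholder for your own domain; it is only a starting point, not a replacement for Search Console.

```python
import socket
import requests  # third-party package: pip install requests

SITE = "https://www.example.com"  # placeholder; replace with your own domain
HOST = "www.example.com"          # placeholder hostname

# 1. DNS check: can the hostname be resolved at all?
try:
    print("DNS resolves to:", socket.gethostbyname(HOST))
except socket.gaierror as exc:
    print("DNS error - crawlers would fail at the same point:", exc)

# 2. Server check: does the homepage respond, and how quickly?
try:
    resp = requests.get(SITE, timeout=10)
    print("Homepage status:", resp.status_code,
          "in", round(resp.elapsed.total_seconds(), 2), "s")
except requests.exceptions.RequestException as exc:
    print("Server error (timeout, connection refused, etc.):", exc)

# 3. Robots check: is robots.txt reachable, and does it block the whole site?
robots = requests.get(SITE + "/robots.txt", timeout=10)
print("robots.txt status:", robots.status_code)
blanket_block = any(line.strip().lower() == "disallow: /"
                    for line in robots.text.splitlines())
if blanket_block:
    print("Warning: robots.txt contains a blanket 'Disallow: /' rule")
```

If the DNS check fails, the remaining checks are moot; fix the DNS record or provider issue first, exactly as described above.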

URL errors

These affect specific web pages on your site and are easier to fix than site errors. URL errors happen when a search engine bot cannot crawl a particular page of a website. Soft 404, 404, access denied, and not followed are common SEO crawl errors which fall into the URL errors category.

You might also get specific URL errors, which occur only on particular kinds of websites; they include mobile-specific URL errors and malware errors. For example, static URL errors are quite common during an ecommerce platform migration. So how do you fix URL-related search engine crawl errors? Read the points below:

  • Soft 404: This happens when a missing page returns a 200 (OK) status instead of a 404 (Not Found). For pages which no longer exist, return a genuine 404 or redirect to a relevant page on your website.
  • 404: This is one of the most common SEO crawl errors and means that the bot tried to crawl a page which no longer exists. To fix it, set up a 301 redirect to the most relevant page or make the page live again.
  • Access denied: Preventing crawlers from crawling a web page results in this error. To fix it, check your robots.txt file and remove any rule that blocks Googlebot from reaching the page. A URL-level checking sketch follows this list.
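For the URL-level errors above, a small checking script can complement Search Console. The following is a minimal sketch, assuming Python 3 with the third-party requests package; www.example.com and the listed paths are placeholders, and the deliberately non-existent probe URL exists only to expose soft 404 behaviour.

```python
from urllib import robotparser
import requests  # third-party package: pip install requests

SITE = "https://www.example.com"                       # placeholder domain
URLS = [SITE + "/old-product", SITE + "/blog/post-1"]  # hypothetical pages to audit

# Load robots.txt so we can tell which URLs Googlebot is even allowed to fetch.
rp = robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()

# Soft 404 probe: a non-existent URL should return 404, not 200.
probe = requests.get(SITE + "/this-page-should-not-exist-123", timeout=10)
if probe.status_code == 200:
    print("Soft 404 suspected: a non-existent URL returned 200 instead of 404")

for url in URLS:
    if not rp.can_fetch("Googlebot", url):
        print(f"{url}: blocked by robots.txt (shows up as 'access denied')")
        continue
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code == 404:
        print(f"{url}: 404 - restore the page or 301-redirect it to a relevant page")
    elif resp.status_code in (301, 302):
        print(f"{url}: already redirects to {resp.headers.get('Location')}")
    else:
        print(f"{url}: status {resp.status_code}")
```

Run it against a list of URLs from your sitemap to spot 404s, soft 404s, and robots.txt blocks before Googlebot reports them.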

So, whenever you encounter common SEO crawl errors, fix them as soon as possible, and check your website for crawl errors frequently. Fixing these errors improves your rankings and also provides a good user experience to your visitors. Another important step is to improve crawlability by using advanced SEO techniques.
