
What Are Website Crawl Errors and How to Fix Them?


Website crawl errors are annoying at best and can hurt your SEO.

For the sake of uniformity and simplicity, this article explains website crawl errors in terms of how Google defines them. Google Search Console is the standard destination for webmasters seeking information about the health of their website. It is in your interest to keep crawl errors in check and to carry out preventative maintenance and repairs where required.

Website crawl errors are usually divided into site errors and URL errors.

On the Google Search Console dashboard, the site errors section gives you a quick, holistic view of the errors that affect your site as a whole. The data covers activity over the past 90 days.

It is important that your website remains free of site errors, so attend to any error messages as soon as possible.

DNS errors are a type of site error. If Googlebot has trouble with a site's DNS, it may be unable to connect to the site at all. DNS timeouts and DNS lookup failures are the two issues to watch for. Since resolving DNS is the first step in reaching your site, related issues must be tackled as soon as possible.

To diagnose DNS errors, use Google's Fetch as Google tool. You may choose to fetch, or to fetch and render, a page. With the former, you can learn about the status of the site's DNS connectivity. With the latter, you can get an idea of how Google views your site compared to a human user. If Google cannot fetch the page, check with your web host for any issues at their end. Your server should return the appropriate status code, such as a 404 or 500.
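If you want to confirm DNS resolution yourself, a short script will do. Below is a minimal Python sketch using only the standard library; www.example.com is a placeholder for your own hostname.

    import socket

    def check_dns(hostname):
        # Resolve the hostname the same way a crawler must before it can connect.
        try:
            infos = socket.getaddrinfo(hostname, 443)
            addresses = sorted({info[4][0] for info in infos})
            print(hostname, "resolves to:", ", ".join(addresses))
        except socket.gaierror as err:
            print("DNS lookup failed for", hostname, "-", err)

    check_dns("www.example.com")  # placeholder hostname

If the lookup fails here, the problem sits with your DNS provider or domain configuration rather than with your web server.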

ISUP.me is one of the many useful tools that you can use to check if the site is down.
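You can also run a quick check of your own. The Python sketch below (standard library only, placeholder URL) fetches a page and prints the HTTP status code the server returns, so you can confirm whether it answers with a 200, a 404, or a 500.

    import urllib.error
    import urllib.request

    def check_status(url):
        # Request the page and report the HTTP status code the server returns.
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                print(url, "returned", response.status)
        except urllib.error.HTTPError as err:
            print(url, "returned", err.code)  # e.g. 404 or 500
        except urllib.error.URLError as err:
            print("Could not reach", url, "-", err.reason)

    check_status("https://www.example.com/")  # placeholder URL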

Server errors are another type of site error. These happen when Googlebot wants to crawl the site but is unable to do so because of a server timeout, i.e. the server takes too long to respond. Unlike DNS errors, with server errors it is possible for search engine bots to connect to the site. However, the page does not load because of server connectivity issues. These issues happen when the server is unable to handle the incoming traffic. The solution is to ensure that your web servers can scale smoothly and quickly to handle spikes in traffic.

Server errors require urgent remedial action. If such an error shows up on the search console, then you need to fix it. As a rule of thumb, if the search bot can crawl your homepage, then your site’s working fine with respect to server errors. There are different types of server errors that you must keep your eyes peeled for. These are timeout, truncated headers, connection reset, truncated response, connection refused, connection failed, connection timeout, and no response.
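If you suspect server timeouts, it helps to measure how long your pages take to respond. The following is a rough Python sketch using only the standard library; the URL and the five-second limit are placeholders for illustration, not thresholds Google publishes.

    import socket
    import time
    import urllib.error
    import urllib.request

    def time_response(url, limit=5):
        # Time the request; no answer within the limit points to a server timeout.
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=limit) as response:
                elapsed = time.monotonic() - start
                print(url, "answered", response.status, "in", round(elapsed, 2), "seconds")
        except socket.timeout:
            print(url, "gave no response within", limit, "seconds")
        except urllib.error.URLError as err:
            print(url, "connection problem:", err.reason)

    time_response("https://www.example.com/")  # placeholder URL

Running a check like this from time to time, or from a monitoring service, gives you an early warning before Googlebot starts reporting server errors.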

URL errors are a type of crawl error that affects individual pages rather than the entire site. They are broken down by device type: desktop and mobile.

Since there are different types of URL errors, and they occur on individual pages, you may occasionally be overwhelmed by the number of URL errors that appear in Google Search Console. One easy way to clear the backlog is to mark all errors as fixed. Errors that have genuinely resolved will not show up the next time Googlebot visits your website, so you can focus your attention on those that reappear on the dashboard.

404 is a common URL error. The error occurs when the Googlebot tries to crawl a page that does not exist. Google has specified that 404 errors do not affect site rankings. But this does not mean that these errors can be ignored, especially if the page attracts traffic and has valuable backlinks pointing to it.

Repairing a dead page showing a 404 error is simple. Bring the page back to life or use a 301 redirect to take the visitor to an appropriate page.
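To confirm a redirect is in place, you can request the old URL without following redirects and inspect the status code and Location header. A minimal Python sketch (standard library only, placeholder URL) is shown below.

    import http.client
    from urllib.parse import urlparse

    def check_redirect(old_url):
        # Request the old URL without following redirects and show where it points.
        parts = urlparse(old_url)
        connection = http.client.HTTPSConnection(parts.netloc, timeout=10)
        connection.request("GET", parts.path or "/")
        response = connection.getresponse()
        print(old_url, "returned", response.status, "->", response.getheader("Location"))
        connection.close()

    check_redirect("https://www.example.com/old-page")  # placeholder URL

A 301 with a sensible Location header means visitors and link equity are being passed to the new page; a 404 means the redirect still needs to be set up.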

Sometimes Googlebot is denied access to a page altogether, which prevents it from crawling that page. "Access denied" errors typically occur when the site requires a login before the URL can be viewed; since Googlebot cannot log in, it is effectively blocked.

The same thing happens if you block Googlebot via the robots.txt file, or if your web host requires user authentication before access is granted.
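If you are unsure whether robots.txt is the culprit, you can test a URL against it. The sketch below uses Python's built-in robots.txt parser; the URLs are placeholders for your own robots.txt location and page.

    from urllib.robotparser import RobotFileParser

    def googlebot_allowed(robots_url, page_url):
        # Read the site's robots.txt and check whether Googlebot may crawl the page.
        parser = RobotFileParser()
        parser.set_url(robots_url)
        parser.read()
        return parser.can_fetch("Googlebot", page_url)

    # Placeholder URLs: substitute your own robots.txt location and page.
    print(googlebot_allowed("https://www.example.com/robots.txt",
                            "https://www.example.com/services/"))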

This type of crawl error may require immediate attention if the pages made inaccessible to the search bots are your money pages. If the blocked pages drive traffic, or are capable of generating organic traffic, you want them to be accessible to the search bots.

To fix this error, you can enable access by removing the login requirement for pages you want crawled. The Fetch as Google tool lets you see how your page appears to Googlebot.

With this knowledge and some practice, you will be able to keep track of crawl errors and repair them, and judging their severity and the urgency of a fix will become easier.

To learn more about websites and how we can help you, send us a message.
