How to Use Google Search Console to Find and Fix Errors on Your Website

Posted August 28th, 2019 in Analytics.

Google Search Console (GSC) is like a health report for your website’s efficiency, accessibility, and search engine rankings. It includes reports on backlinks, internal links, quality and quantity of web traffic, errors, click-through rates, mobile responsiveness, security issues, and crawling stats.


In this article, we are going to cover how to find and fix errors identified in GSC. Googlebot crawls your website, reads and interprets the content, then indexes the pages and ranks them according to Google’s 200+ ranking factors.

If Googlebot runs into any issues crawling or indexing webpages, it reports them in GSC’s ‘Coverage’ report. Register your domain name with GSC and you can access the Coverage report for free.

How to Use Google Search Console to Find Errors

It’s easy to find errors with the latest GSC version:

  1. Go to Google Search Console and click on ‘Coverage’ in the left sidebar.
  2. You will see the list of errors under the ‘Details’ section.
  3. Click on the specific error, and Google Search Console will show you the error details.

Google Search Console - screenshot 1

Google Search Console - screenshot 2

The Most Common Errors in Google Search Console and How to Fix Them

There are a variety of errors Googlebot can encounter that prevent it from crawling and indexing webpages. Here, we’ll cover the 5 most common errors GSC displays, what they mean, and how to fix them.

1. Server Error (5xx)

If the detected error contains any 500-level error message (the error code can range from 500 to 511), the issue is with your server. The server is unable to fulfill the request and serve the requested URL.

How to fix 5xx errors?

  • First of all, make sure your site’s hosting server is not down. Call your IT department or confirm the server’s functionality with your hosting company’s customer care. This sort of error is also possible if the server hosting the site is overloaded or misconfigured.
  • Make sure the hosting plan you have chosen is really capable of handling the amount of traffic your site attracts. Hosting providers describe a plan’s visitor-handling capacity in different terms; most commonly, plans are limited by CPU and memory (RAM). Upgrade your hosting plan if necessary.
  • If you have a lot of duplicate URLs on your site, Google may crawl far more pages than necessary, and your server can become overloaded. You can prevent Googlebot from crawling parameterized duplicate content; see the robots.txt sketch after this list.
  • Some security features offered by hosting companies block visitors whose activity looks unusual. Googlebot makes requests faster and in greater numbers than a human visitor, so a host’s security tools may mistake it for a threat and block it. If this is the case, ask your web host to relax the relevant security rules or change a couple of configurations for your site.
  • If you are intentionally making some webpages inaccessible to Googlebot to control how your site is crawled, use a robots.txt file, configure URL parameters, or request a change in Googlebot’s crawl rate.
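
As an illustration of that robots.txt approach, here is a minimal sketch that keeps Googlebot away from parameterized duplicate URLs. The /search/ path and the sessionid parameter are hypothetical stand-ins for whatever generates duplicates on your site:

    User-agent: Googlebot
    # Keep Googlebot out of internal search results (hypothetical path)
    Disallow: /search/
    # Skip any URL containing the hypothetical sessionid parameter
    Disallow: /*?*sessionid=

Place the file at the root of your domain (e.g. https://www.example.com/robots.txt); rules located anywhere else are ignored.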

2. Client Errors (4xx)

You will see an error message in the 400-series when the server receives an invalid request from the client’s side. (Here, the client means the browser/website visitor, and the server means the hosting server where the website is hosted.) When Googlebot tries to crawl a webpage and the server finds an error in Googlebot’s request, it refuses to serve the page.

There are dozens of distinct 4xx errors. The most common are 401 Unauthorized Request and 404 Resource Not Found/URL Not Found, along with the ‘soft 404’ variant. Most often what you’ll see is a 404 error, which means the webpage Google was trying to crawl doesn’t exist.

How to fix 4xx errors?
There is no ‘one-size-fits-all’ solution for 4xx errors because each error has a different cause and, hence, a different solution. We have covered the most common errors and their solutions below.

401 Unauthorized Request

This error pops up when a page can’t be accessed without valid login credentials. So, either the visitor hasn’t provided an ID and password yet, or the credentials are incorrect. When this error shows up in GSC, it means your webpage refused to let Googlebot crawl it because Googlebot doesn’t have a user ID and password.
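
For reference, a 401 response from the server begins roughly like this (a minimal sketch; the realm name is a hypothetical placeholder):

    HTTP/1.1 401 Unauthorized
    WWW-Authenticate: Basic realm="Members area"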

How to fix it?
If, for security reasons, you can’t let every visitor access certain pages without valid credentials, at least allow Googlebot to crawl those pages without requiring authorization.

404 Resource Not Found/ URL not found

A 404 Resource Not Found/URL Not Found error appears when the server cannot find the requested webpage or resource. If the webpage has been deleted, or the URL is broken or misspelled, a 404 error shows up.

How to fix it?
404 errors don’t necessarily harm your site. Check whether the URL or link is broken or misspelled and update it accordingly. If you deleted the webpage/resource intentionally, you don’t need to worry about fixing anything; Google Search Console will stop showing this error in about a month. If the content has been moved to another place, add a 301 redirect, as in the sketch below. If you have added URLs to your sitemap but haven’t published those pages yet, or have deleted them, Googlebot will still try to crawl them. Remove those URLs from the sitemap until the pages exist on your site.
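
For example, on an Apache server a 301 redirect can be added in the .htaccess file. This is a minimal sketch; /old-page/ and /new-page/ are hypothetical paths:

    # Permanently redirect the old URL to the content's new location
    Redirect 301 /old-page/ https://www.example.com/new-page/

If your site runs on a CMS such as WordPress, a redirect plugin achieves the same result without editing .htaccess by hand.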

404: Submitted URL seems to be a Soft 404

This is a type of 404 error which indicates that Googlebot crawled the page but found no content, or only ‘thin content,’ on it. Typically, such a page returns a 200 (OK) status code even though there is effectively nothing to show, which is why Google treats it as a ‘soft’ 404.

How to fix it?
Delete the page if it’s not required. If the webpage is unavailable, return a ‘404 Not Found’ response code. You can also customize the 404 page by adding links to other relevant pages and resources on your website, as in the sketch below.
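
On Apache, for instance, you can serve a custom error page while still returning the real 404 status code. A minimal .htaccess sketch, where /404.html is a hypothetical path to your custom page:

    # Serve a custom error page while keeping the 404 status code
    ErrorDocument 404 /404.html

The status code is the important part: if the custom page were served with a 200 response instead, Google would flag it as a soft 404 again.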

Check whether your content is taking too long to load with the help of the URL Inspection tool.

3. 3xx Redirection Errors

When you have redirected a page to another location but Googlebot could not reach the redirect target due to some issue with the redirection, a 300-series error shows up. The main causes are as follows:

  • A redirect chain that is too long, or a redirect URL that exceeds the maximum URL length;
  • An empty or misspelled URL in the redirect chain.

This is a very common redirection error when you are migrating your website from HTTP to HTTPS after getting an SSL certificate. Before the certificate was installed, all the pages, images, resources, and files lived on HTTP; now every URL and every internal hyperlink must be changed to HTTPS.

The fix is simple: set up a single site-wide 301 redirect from HTTP to HTTPS, update internal links and sitemap entries to point directly at the HTTPS URLs, and avoid chaining several redirects in a row.
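
On an Apache server, a minimal .htaccess sketch for that site-wide redirect might look like this (assuming the mod_rewrite module is enabled):

    # Redirect every HTTP request to the HTTPS version of the same URL
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]

Because the rule preserves the host and path, each HTTP URL redirects in a single hop rather than through a chain.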

4. Submitted URL marked noindex

If you have a noindex directive in a meta tag in a page’s HTML code or in an HTTP response header, Googlebot will never show that page in Google Search results, even if other sites link to it. If you have submitted such a page for indexing, GSC will show a “Submitted URL marked noindex” error. If you want Google to index such pages, simply remove the noindex directive from the meta tag or HTTP response header.
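
The directive can appear in either of two standard forms. In the page’s HTML head:

    <meta name="robots" content="noindex">

Or as an HTTP response header:

    X-Robots-Tag: noindex

Search your page templates and server configuration for both before resubmitting the URL.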

5. Submitted URL has crawl issue

This is a rare type of error message GSC shows when Googlebot faces an issue that doesn’t fall into any other category. If you get this error, all you can do is keep investigating with the URL Inspection Tool. For example, you may see this error if images, CSS, or JavaScript failed to load while Googlebot was crawling the page. Make the changes in your HTML and CSS code once you figure out the issue, and then request indexing.
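
One common culprit worth checking is a robots.txt rule that blocks the directories your assets load from, which prevents Googlebot from rendering the page. A hypothetical example of the kind of rule to look for and remove (the /css/ and /js/ paths are placeholders):

    User-agent: *
    # Rules like these stop Googlebot from fetching stylesheets and scripts
    Disallow: /css/
    Disallow: /js/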

Google Search Console - screenshot 3

How does GSC know if you have fixed the error?

Once you apply the necessary fix, click ‘Validate Fix.’ This requests that Googlebot crawl your webpage once again, and if the issue is solved, your webpage will be indexed quickly. You will see this page when you click on each error under the ‘Details’ section.

Google Search Console - screenshot 4

Wrapping up

Few things are as important as getting your webpages displayed in Google’s results pages. For that, you have to help Googlebot in every possible way so that it can easily crawl and index your webpages. So keep checking GSC regularly, pay attention to every issue Googlebot reports, and fix it as early as possible!


About the Author

Sam Patel

Sam Patel is a Content Manager at CheapSSLsecurity.com. He specializes in explaining WordPress, website security, and digital marketing topics in easy-to-understand language for business owners and marketers.
