How to Fix Sitemap Couldn’t Fetch and other Common Google Search Console Sitemap Errors

Sitemaps tell search engines which important pages on a website are available to crawl. Not every website needs a sitemap: if your site is small, with fewer than 100 URLs, creating and maintaining one is not strictly necessary. As long as the important pages are interlinked from the home page, search engines can reach every page easily.

Before fixing the errors and warnings Google Search Console reports for a submitted sitemap, understand the basic sitemap guidelines; the issues then become much easier to fix.

XML Sitemap Creation Important Rules

  • Create the sitemap at the root of the website.
  • Submit the sitemap for the preferred URL version of the website.
  • Avoid non-canonical URLs, redirected URLs, and URLs returning a 404 status.
  • Use absolute URLs rather than relative URLs.
  • Keep the sitemap within the 50 MB uncompressed limit and a maximum of 50,000 URLs.
  • Ensure neither the sitemap nor any URL within it is blocked by robots.txt.
  • Ensure the sitemap file is UTF-8 encoded.
  • Submitting a sitemap to Google does not guarantee that Googlebot will crawl every URL in it.

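For reference, a minimal valid XML sitemap (the URL and date below are placeholders) follows this structure:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

The `<loc>` element is required for each URL; `<lastmod>` is optional but helps Google decide when to recrawl.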

1. Create Sitemap at the Root of the Website:

To allow all the URLs of the site to be included in the sitemap and crawled by Google, place the sitemap file in the root folder (for example, https://example.com/sitemap.xml).
A sitemap placed in a subfolder, such as https://example.com/folder/sitemap.xml, can only list URLs under /folder/; listing the folder URL itself or any URL at a higher level raises the “URL not allowed” error.
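A quick way to check the placement rule is to verify that every listed URL falls under the sitemap's own directory. A minimal sketch (all paths are illustrative):

```python
def allowed_in(sitemap_url: str, page_url: str) -> bool:
    """A sitemap may only list URLs at or below its own directory."""
    base = sitemap_url.rsplit("/", 1)[0] + "/"  # directory holding the sitemap
    return page_url.startswith(base)

# A sitemap in a subfolder can list URLs under that folder...
print(allowed_in("https://example.com/folder/sitemap.xml",
                 "https://example.com/folder/page"))   # True
# ...but not URLs at a higher level
print(allowed_in("https://example.com/folder/sitemap.xml",
                 "https://example.com/other"))         # False
```

A sitemap placed at the root passes this check for every URL on the site, which is why root placement is the safe default.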

2. Submit the sitemap to the preferred version in Google Search Console:

Ensure the submitted sitemap lives at the same URL as the property selected in Google Search Console. Choosing a different property, or submitting a different version of the URL (http vs. https, or www vs. non-www), raises the “Sitemap couldn’t fetch” error in Google Search Console. Troubleshoot with the URL Inspection tool to confirm that the URL where the sitemap is placed is accessible to Google.
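One quick sanity check is to compare the scheme and host of the Search Console property with those of the sitemap URL; a mismatch in either is a common cause of “Couldn’t fetch”. A minimal sketch (URLs are illustrative):

```python
from urllib.parse import urlsplit

def same_property(property_url: str, sitemap_url: str) -> bool:
    """True if the sitemap URL matches the property's scheme (http vs https)
    and host (www vs non-www)."""
    prop, smap = urlsplit(property_url), urlsplit(sitemap_url)
    return (prop.scheme, prop.netloc) == (smap.scheme, smap.netloc)

# http + non-www sitemap submitted under an https + www property: mismatch
print(same_property("https://www.example.com/",
                    "http://example.com/sitemap.xml"))          # False
print(same_property("https://www.example.com/",
                    "https://www.example.com/sitemap.xml"))     # True
```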

3. Avoid non-canonical URLs, redirected URLs, and 404 URLs within the sitemap:

Every URL listed in the sitemap should return a 200 OK HTTP status code and be the canonical version of the page. This usually does not raise an error in Google Search Console, but including non-canonical URLs (otherwise called duplicate URLs), URLs returning 301, 404, or any status other than 200 OK, or unimportant thin pages wastes crawl budget and delays the crawling of the site’s more important pages.
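The filtering step can be sketched as follows, assuming you already have crawl results recording each URL's status code and canonical flag (the data below is made up):

```python
# Hypothetical crawl results: (URL, HTTP status, is_canonical)
crawl = [
    ("https://example.com/", 200, True),
    ("https://example.com/old-page", 301, True),    # redirected: drop
    ("https://example.com/missing", 404, True),     # not found: drop
    ("https://example.com/page?ref=ad", 200, False) # duplicate: drop
]

# Keep only canonical URLs that return 200 OK
sitemap_urls = [url for url, status, canonical in crawl
                if status == 200 and canonical]
print(sitemap_urls)  # ['https://example.com/']
```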

4. Use absolute URLs rather than relative URLs:

Google recommends listing every URL in the sitemap as an absolute URL rather than a relative one. Listing a relative URL raises the “URLs not followed” error in Google Search Console.
Absolute URL – https://example.com/folder/abc
Relative URL – /folder/abc
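Python's standard library can resolve a relative path against the site root before it is written into the sitemap; a small sketch (the base URL is illustrative):

```python
from urllib.parse import urljoin

base = "https://example.com/"   # the site's preferred root URL (assumed)
relative = "/folder/abc"        # relative path, not valid in a sitemap

absolute = urljoin(base, relative)
print(absolute)  # https://example.com/folder/abc
```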

5. Maximum sitemap size:

A sitemap can be up to 50 MB uncompressed and contain up to 50,000 URLs. Larger websites should break their URL list into multiple sitemaps and combine them with a sitemap index. As much as possible, fill each sitemap close to the limit rather than creating many small sitemaps stitched together by an index. Exceeding the limit raises the “Sitemap file size error: Your sitemap exceeds the maximum file size limit” error in Google Search Console.
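Splitting a large URL list into index-sized chunks can be sketched like this (the 120,000-URL list is made up); each chunk then becomes one sitemap file referenced from the sitemap index:

```python
MAX_URLS = 50_000  # per-sitemap URL limit

def split_for_index(urls):
    """Split a URL list into chunks that each fit in one sitemap file."""
    return [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]

urls = [f"https://example.com/page-{n}" for n in range(120_000)]
chunks = split_for_index(urls)
print(len(chunks), [len(c) for c in chunks])  # 3 [50000, 50000, 20000]
```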

6. Check the sitemap is not blocked by robots.txt:

The sitemap and all the URLs listed in it must be accessible to Google. If either is blocked by the robots.txt file, Google Search Console raises the error “Sitemap contains URLs which are blocked by robots.txt”.
For example:
User-agent: *
Disallow: /sitemap.xml
Disallow: /folder/

Here both the sitemap itself and every URL under /folder/ are blocked by robots.txt. The robots.txt file is found at the root directory of a website.
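You can test rules like these locally with Python's built-in robots.txt parser; a sketch using the example rules above (the page URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /sitemap.xml",
    "Disallow: /folder/",
])

# The sitemap and anything under /folder/ are blocked
print(rp.can_fetch("Googlebot", "https://example.com/sitemap.xml"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/folder/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/about"))        # True
```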

7. Sitemap file must be UTF-8 encoded:

Automatically generated sitemaps are UTF-8 encoded by default. If you are creating one manually, make sure it is saved as UTF-8. URLs containing special characters such as * or {} are not supported as-is; use the appropriate escape codes to represent them.
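Two common fixes can be sketched with the standard library: XML entity escaping for characters like &, and percent-encoding for characters like { and } (the URLs are illustrative):

```python
from xml.sax.saxutils import escape
from urllib.parse import quote

# Entity-escape &, <, > before placing a URL inside a <loc> element
raw = "https://example.com/search?q=shoes&size=10"
print(escape(raw))  # https://example.com/search?q=shoes&amp;size=10

# Percent-encode characters outside the allowed URL character set
print(quote("/path/{id}", safe="/"))  # /path/%7Bid%7D
```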

The Bottom Line:

Submitting an XML sitemap to Google helps the search engine crawl the URLs on your website. However, it is not a guarantee, and Google does not pledge to crawl every URL listed in the sitemap or to crawl them more frequently. Regularly publishing useful content and keeping the sitemap updated encourages Google to crawl it more often.
