How to Fix Excluded by noindex tag or Couldn’t Fetch Error in Search Console

SEO depends on Google being able to crawl your website and index all of your primary pages. Google is sometimes picky about which pages on a site it will index, and a Search Console error that keeps pages out of the index can quickly cut into your traffic. Anyone who has watched their traffic drop knows how that feels.

Reasons for This Error

  • You changed your hosting provider.
  • You changed your hosting but did not point the nameservers correctly.
  • You use Cloudflare and did not remove the old site data.
  • Your robots.txt file disallows Googlebot (see the check below).
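
If you suspect the last point, a quick way to confirm it is to parse your robots.txt and ask whether Googlebot may fetch your key URLs. The small Python sketch below uses only the standard library; the domain and paths in it are placeholders, not values taken from any real site.

# Minimal sketch: check whether robots.txt allows Googlebot to crawl the site.
# "https://www.example.com" is a placeholder domain; swap in your own.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"  # placeholder

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()  # downloads and parses robots.txt

for path in ("/", "/sitemap_index.xml"):
    allowed = parser.can_fetch("Googlebot", SITE + path)
    print(f"Googlebot allowed to fetch {path}: {allowed}")

If this prints False for "/", remove the Disallow rule that blocks Googlebot before doing anything else.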

Couldn’t Fetch Sitemap Error in Google Search Console: Solved

  1. Log in to Google Search Console.
  2. Click “Sitemaps” in the left panel/menu.
  3. Under “Add a new sitemap”, enter the URL of the sitemap you are trying to index.
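Before you submit anything, it is worth confirming that the sitemap URL is publicly reachable at all, because Search Console reports “Couldn’t fetch” whenever it cannot download the file. The following is a minimal sketch; the sitemap URL is a placeholder, so replace it with your own.

# Minimal sketch: confirm the sitemap URL is reachable before submitting it.
import urllib.request

SITEMAP_URL = "https://www.example.com/sitemap_index.xml"  # placeholder

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as response:
    print("HTTP status :", response.status)                       # expect 200
    print("Content-Type:", response.headers.get("Content-Type"))  # expect an XML type

If the status is not 200 or the Content-Type is not an XML type, fix that on the server (or in your SEO plugin) first.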

How to Fix the Couldn’t Fetch Error in Google Search Console

  1. Log in to Google Search Console.
  2. Click “Sitemaps” in the left panel/menu.
  3. Remove the old sitemap.
  4. Under “Add a new sitemap”, enter the URL of the sitemap you are trying to index.
  5. Add a forward slash “/” just after the last forward slash in the URL (see screenshot).
  6. Click Submit.
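
If you want to double-check the sitemap itself before clicking Submit, the sketch below downloads it and confirms it is well-formed XML that actually lists URLs. The sitemap URL is again a placeholder.

# Minimal sketch: download the sitemap and confirm it parses and lists URLs.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap_index.xml"  # placeholder

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as response:
    data = response.read()

root = ET.fromstring(data)  # raises ParseError if the XML is broken
# Sitemap and sitemap-index files use the sitemaps.org namespace for their elements.
locations = [loc.text for loc in root.iter("{http://www.sitemaps.org/schemas/sitemap/0.9}loc")]
print(f"Sitemap parsed OK, {len(locations)} <loc> entries found")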

Sitemap Could Not Be Read: General HTTP Error


A typical report of this problem reads: “My sitemap is throwing the error ‘Sitemap could not be read – General HTTP error’.
My sitemap is: https://www.domainmame/sitemap_index.xml”
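
A “General HTTP error” usually means the server answered the sitemap request with an error status instead of 200. One way to narrow it down is to request the sitemap yourself, once with a normal browser user agent and once with Googlebot’s, which also reveals firewall rules that block only crawlers. The URL below is a placeholder.

# Minimal sketch: compare the HTTP status the server returns to a browser
# user agent versus a Googlebot user agent.
import urllib.request
import urllib.error

SITEMAP_URL = "https://www.example.com/sitemap_index.xml"  # placeholder

def status_for(user_agent: str) -> int:
    request = urllib.request.Request(SITEMAP_URL, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code  # the server responded, but with an error status

print("Browser UA  :", status_for("Mozilla/5.0"))
print("Googlebot UA:", status_for(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))

If the browser request succeeds but the Googlebot request returns 403 or 503, a firewall or CDN rule is the likely culprit (see the Cloudflare section below).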

URL Inspection

You can also run any affected page through the URL Inspection tool in Search Console; it shows when Google last crawled the URL and whether a noindex tag or header was detected.

How to Fix Excluded by noindex tag in Search Console

Block search indexing with ‘noindex’: you can prevent a page from appearing in Google Search by including a noindex meta tag in the page’s HTML code, or by returning a ‘noindex’ header in the HTTP response.


When Googlebot next crawls that page and sees the tag or header, Googlebot will drop that page entirely from Google Search results, regardless of whether other sites link to it.


Implementing noindex

<meta> tag

To prevent most search engine web crawlers from indexing a page on your site, place the following meta tag into the <head> section of your page:

<meta name="robots" content="noindex">

To prevent only Google web crawlers from indexing a page:

<meta name="googlebot" content="noindex">


Method 1 Fix: “No: ‘noindex’ detected in ‘X-Robots-Tag’ http header”

First, check for plugin updates. If you use Yoast SEO, go to Tools and click File editor.

The file editor will open; review your robots.txt there.

Check which rules in robots.txt are set to Allow or Disallow, and make sure Googlebot is not blocked. You can then confirm that the live page no longer sends a noindex header or meta tag with the sketch below.
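
Once you have adjusted the plugin or robots.txt, verify that the live page no longer carries a noindex directive in either place. This sketch assumes a placeholder page URL; point it at the URL Search Console flagged.

# Minimal sketch: fetch a page and report whether noindex appears in the
# X-Robots-Tag response header or in a robots/googlebot meta tag.
import urllib.request
from html.parser import HTMLParser

PAGE_URL = "https://www.example.com/"  # placeholder

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex_meta = False
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if tag == "meta" and name in ("robots", "googlebot") and "noindex" in content:
            self.noindex_meta = True

with urllib.request.urlopen(PAGE_URL, timeout=10) as response:
    header = response.headers.get("X-Robots-Tag", "")
    body = response.read().decode("utf-8", errors="replace")

parser = RobotsMetaParser()
parser.feed(body)

print("X-Robots-Tag header:", header or "(not set)")
print("noindex in header  :", "noindex" in header.lower())
print("noindex in meta tag:", parser.noindex_meta)

When both checks come back False, request indexing again from the URL Inspection tool.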

Method 2 Fix: “No: ‘noindex’ detected in ‘X-Robots-Tag’ http header”

If you use Cloudflare, open your Cloudflare account and follow these steps:

  1. Click the Firewall tab.
  2. Click the Managed Rules sub-tab.
  3. Scroll to the Cloudflare Managed Ruleset section.
  4. Click the Advanced link above the Help.
  5. Change from Description to ID in the modal.
  6. Search for 100035 and check carefully which rules to disable.
  7. Change the Mode of the chosen rules to Disable.

As a temporary workaround, you can also go to Firewall settings > Managed Rules and turn off Cloudflare Specials until Cloudflare provides a permanent fix.


Now go to the DNS settings in Cloudflare.


Click Edit and delete the old records.

If the problem persists, delete the website from Cloudflare, point the domain back to your real hosting DNS, then open Cloudflare again, add the site as new, and set up the DNS records once more. You can verify that the domain now resolves to your hosting server with the sketch below.

Save the changes.
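
DNS changes can take a while to propagate. The sketch below is a quick check that your domain now resolves to your hosting server; both the domain and the expected IP address are placeholders you must replace with your own values.

# Minimal sketch: confirm the domain resolves to your hosting server's IP.
import socket

DOMAIN = "www.example.com"    # placeholder domain
EXPECTED_IP = "203.0.113.10"  # placeholder: your hosting server's IP

resolved_ip = socket.gethostbyname(DOMAIN)
print("Resolves to       :", resolved_ip)
print("Matches hosting IP:", resolved_ip == EXPECTED_IP)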

Now go back to Google Search Console and submit the sitemap again; this time the submission should succeed.


If you face the same issue on your website, contact us and we will help you remove the error. Please share this with anyone who runs into the same problem. Thank you!

Conclusion

In the world of search engine optimization, running into roadblocks like the “Excluded by noindex tag” or “Couldn’t Fetch” errors can be frustrating. Armed with an understanding of their causes and solutions, however, you are better equipped to work through these challenges and keep your website performing at its best in search engine results.
