Is there a way to keep sitemap.xml files from getting indexed?
-
Wow, I should know the answer to this question.
Sitemap.xml files have to be accessible to the bots, so they can't be disallowed in robots.txt, and the folder can't be blocked at the server level.
So how can you allow the bots to crawl these XML pages but keep them from showing up in Google's index when doing a site: command search, or is that even possible? Hmmm
-
Ahhh, noindex in the .htaccess file. Brilliant - thanks!
-
Usually you would need to add noindex to the meta robots tag in the <head> of your web page. Because your sitemap is an XML file and not HTML, you will need to do things differently.
You can add the code below to your .htaccess file, which you can find in the root folder of your server. Open the file in a plain text editor and insert the following:

<Files "sitemap.xml">
Header set X-Robots-Tag "noindex"
</Files>

This will stop search engines from indexing your sitemap without restricting them from crawling it.
Note: Replace 'sitemap.xml' with your file name, if different. The <Files> wrapper limits the header to that one file; without it, the noindex header would be sent with every page on the site.
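Once the rule is deployed, you can confirm the header is actually being sent. Here is a minimal sketch of a check, assuming a plain dict of response headers; the `has_noindex` helper and the `example.com` URL are illustrative, not from this thread:

```python
def has_noindex(headers):
    """Return True if an X-Robots-Tag header includes 'noindex'.

    `headers` maps response header names to values; names are
    compared case-insensitively, as HTTP requires.
    """
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    return False

# After deploying the .htaccess rule, fetch the sitemap and inspect
# its headers, for example:
#   import urllib.request
#   resp = urllib.request.urlopen("https://example.com/sitemap.xml")
#   print(has_noindex(dict(resp.headers)))
```

You can get the same information from the command line with curl -I against your sitemap URL and looking for the X-Robots-Tag line.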