Sitemap Rules
-
Hello there,
I have some questions pertaining to sitemaps that I would appreciate some guidance on.
1. Can an XML sitemap contain URLs that are blocked by robots.txt? Logically, it makes sense to me not to include pages blocked by robots.txt, but I would like some clarity on the matter, i.e. will having pages blocked by robots.txt in a sitemap negatively impact the benefit of the sitemap?
2. Can an XML sitemap include URLs from multiple subdomains? For example:
http://www.example.com/www-sitemap.xml would include the home page URLs of two other subdomains, i.e. http://blog.example.com/ & http://blog2.example.com/
Thanks
-
Theoretically, if a URL is blocked by robots.txt it should not appear in the index, whether or not it is in the sitemap. In practice, though, I have seen URLs get indexed that are blocked by robots.txt but are in the sitemap and have good links pointing to them. If you want to block pages that have good links pointing to them, my advice is to remove them from the sitemap. #justathought.
About URLs from multiple subdomains: I personally create a separate sitemap for each subdomain, link them all from the main sitemap index, and I see better indexing that way.
Again, these are my personal experiences and not rules, so please keep in mind that things can be different for you.
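The "separate sitemaps linked from a main sitemap" setup is just a standard sitemap index file. As a rough illustration, a minimal sketch of generating one with Python's standard library follows; the subdomain sitemap URLs are placeholders taken from the example in the question, not real files:

```python
# Minimal sketch: build a sitemap index that points to one sitemap
# per subdomain. The sitemap URLs below are placeholder examples.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Return a sitemap-index XML string for the given sitemap URLs."""
    root = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for url in sitemap_urls:
        entry = ET.SubElement(root, "sitemap")
        ET.SubElement(entry, "loc").text = url
    # xml_declaration=True prepends the <?xml ...?> header (Python 3.8+)
    return ET.tostring(root, encoding="unicode", xml_declaration=True)

index_xml = build_sitemap_index([
    "http://www.example.com/sitemap.xml",
    "http://blog.example.com/sitemap.xml",
    "http://blog2.example.com/sitemap.xml",
])
print(index_xml)
```

The resulting file would be served from the host whose sitemaps it references (or cross-submitted per the sitemap protocol's rules).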
-
Hey,
1.) Yes, you can do this and it won't negatively impact the sitemap, but it might cause a couple of Search Console errors when you come to submit the URLs. Blocking crawlers in the robots.txt file is a directive that instructs them not to crawl that particular page. With this being said, supplying them with a sitemap of all page locations does not mean that they will crawl those pages, but it does tell crawlers that the pages exist. Personally, I would meta noindex these pages to make sure that they don't reach search engines, as blocking in the robots.txt file alone can often not be enough to prevent this, especially if you're also submitting a sitemap.
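Before submitting, you can audit a sitemap for exactly this conflict by checking each URL against the robots.txt rules. A minimal sketch using Python's standard `urllib.robotparser`; the robots.txt rules and URL list here are made-up examples, not anyone's real site:

```python
# Minimal sketch: flag sitemap URLs that a robots.txt would block.
# The rules and URLs below are illustrative placeholders.
from urllib import robotparser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

sitemap_urls = [
    "http://www.example.com/",
    "http://www.example.com/private/report.html",
]

# Any URL a generic crawler ("*") may not fetch is a candidate
# for removal from the sitemap (or for a meta noindex instead).
blocked = [u for u in sitemap_urls if not parser.can_fetch("*", u)]
print(blocked)
```

Running this against a real site would mean fetching the live robots.txt (e.g. with `parser.set_url(...)` and `parser.read()`) and the actual sitemap URL list.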
2.) In short, a single XML sitemap shouldn't contain URLs from multiple subdomains — the sitemap protocol expects a sitemap to list URLs from the host it is served from — BUT you can cross-submit sitemaps for multiple subdomains if you verify ownership of all of them. Google has broken this down really well in their Webmaster Tools post:
https://support.google.com/webmasters/answer/75712?hl=en&topic=8476&ctx=topic
Hope this helps!
Sean
Related Questions
-
Sitemap use for very large forum-based community site
I work on a very large site with two main types of content: static landing pages for products, and forums & blogs (user-created) under each product. The site has maybe 500k - 1 million pages. We do not have a sitemap at this time.
Currently our SEO discoverability in general is good; Google is indexing new forum threads within roughly 1-5 days. However, some of the "static" landing pages for our smaller, less-visited products do not have great SEO.
The question is: could our SEO be improved by creating a sitemap, and if so, how could it be implemented? I see a few ways to go about it:
1. The sitemap includes "static" product category landing pages only, i.e. the product home pages, the forum landing pages, and blog list pages. This would probably end up being 100-200 URLs.
2. The sitemap contains the above but is also dynamically updated with new threads & blog posts.
Option 2 seems like it would mean the sitemap is unmanageably long (hundreds of thousands of forum URLs). Would a crawler even parse something that size? Or with Option 1, could it cause our organically ranked pages to change ranking due to Google re-prioritizing the pages within the sitemap?
Not a lot of information out there on this topic; I appreciate any input. Thanks in advance.
Technical SEO | CommManager
-
My video sitemap is not being indexed by Google
Dear friends, I have a video portal. I created a video sitemap.xml and submitted it to GWT, but after 20 days it has not been indexed. I have verified in Bing Webmaster Tools as well. All videos are fetched dynamically from the server; there are no separate pages for individual videos. All my static pages have been indexed, but not the videos. Please help me find where I am making a mistake. Your answers will be much appreciated. Thanks
Technical SEO | docbeans
-
What are the negative implications of listing URLs in a sitemap that are then blocked in the robots.txt?
In running a crawl of a client's site I can see several URLs listed in the sitemap that are then blocked in the robots.txt file. Other than perhaps using up crawl budget, are there any other negative implications?
Technical SEO | richdan
-
Not All Submitted URLs in Sitemap Get Indexed
Hey guys, I just noticed that about 20% of the URLs submitted in my sitemap don't get indexed, at least when I check in Webmaster Tools: there is about a 20% difference between the submitted and indexed URL counts. However, as far as I can see, Webmaster Tools doesn't tell me which specific URLs from the sitemap are not indexed, right? So I checked every single page in the sitemap manually by putting site:"URL" into Google, and every single page of the sitemap shows up. So in reality every page should be indexed; why does Webmaster Tools show something different? Thanks for your help on this 😉 Cheers
Technical SEO | _Heiko_
-
XML Sitemap and unwanted URL parameters
We currently don't have an XML sitemap for our site. I generated one using Screaming Frog and it looks OK, but it also contains my tracking URL parameters (ref=), which I don't want Google to use, as specified in GWT. Cleaning it will require time and effort which I currently don't have, and I also think that having a sitemap could help us on Bing. So my question is: is it better to submit a "so-so" sitemap than to have none at all, or are the risks just too high? Could you explain what could go wrong? Thanks!
Technical SEO | jfmonfette
-
Do sitemap entries get cleared when they 404?
Hi, do URLs get cleared from the sitemap when they return a 404? We have a Drupal site and a sitemap that has 60K links. Over these 4 years we have deleted hundreds of links; are they cleared from the sitemap automatically, or do we need to rebuild the sitemap? Thanks
Technical SEO | mtthompsons
-
Sitemap all of a sudden only indexing 2 out of 5000+ pages
Any ideas why this happened? Our sitemap looks the same. Also, our total number of pages indexed has not decreased, just the count for the sitemap. Could this eventually affect my pages being in the index?
Technical SEO | rock22
-
Sitemap for 170K webpages
I have 170K pages on my website which I want to be indexed. I have created multiple HTML sitemaps (e.g. sitemap1.html, sitemap2.html, etc.), with each sitemap page containing 3000 links. Is this the right approach, or should I switch to XML-based sitemaps, also split into multiple files? Please suggest.
Technical SEO | ArtiKalra