Omitting URLs from XML Sitemap - Bad??
-
Hi all,
We are working on an extremely large retail site with some major duplicate content issues that we are in the process of remedying. The site also does not currently have an XML sitemap.
Would it be advisable to create a small XML sitemap with only the main category pages for the time being, and then upload the complete sitemap after our duplicate content issues are resolved? Or should we wait to upload anything until all work is complete down to the product page level and canonicals are in place? Would uploading an incomplete sitemap look fraudulent or misleading in the eyes of the search engines and prompt a penalty, or would having at least the main pages mapped while we continue work be okay?
Please let me know if more info is needed to answer! Thanks in advance!
-
Some good answers here, so I'll just throw in my own 2 cents.
The purpose of a sitemap is to help search engines find pages they might not otherwise find during a regular crawl. Sometimes sitemaps can help pages get indexed faster. Other sitemaps serve special purposes, such as News or Video sitemaps, which can add extra information and help rank particular types of content.
In reality, many, many sitemaps are incomplete, missing, or flat out wrong. To my knowledge, no search engine will penalize you for this, as they would be penalizing half the web.
The danger of an inaccurate sitemap is that the search engines may choose to ignore it completely. Duane Forrester of Bing has stated that if they find a 1% error rate in your sitemap file, they will disregard the file. However, no such action is known to exist for incomplete sitemaps.
So I'd say there is little risk in submitting a sitemap of your truly important pages. Unfortunately, this won't stop Google from discovering and crawling your duplicate content.
The faster you get these fixed, the better.
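For anyone who hasn't built one before, a minimal partial sitemap covering just the main category pages is a very small file. This is a sketch per the sitemaps.org protocol; the URLs and lastmod date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/category/shoes</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/category/shirts</loc>
  </url>
</urlset>
```

You can submit just this and add the product-level URLs later once the duplicate content is cleaned up.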
-
Hi Thomas,
Definitely comforting to hear that you ran your site with an incomplete sitemap without seeing any negative results. Like I said in my response above, I think we will proceed with the partial sitemap, just to have one on there, and then upload a complete one once we can clean up the site a little more. Thanks for your insights - they were very helpful!
-
Hi Saijo,
Thanks for your recommendations! We do want to place a little more emphasis on our top level and main navigation pages, so I think we will probably proceed with a preliminary sitemap with just those pages for now. Once we get to that point, we'll definitely be needing to use multiple sitemaps and an index - thanks for pointing this out!
-
I ran my site with an incomplete sitemap for years and it didn't seem to have a negative effect. I feel that any sitemap is better than no sitemap. Sitemaps are such a small part of the SEO equation. What they are most useful for is telling Google what to crawl. Beyond that, I don't believe they have much relevance in passing authority.
-
My theory on this (I have no tests to prove it):
If you upload a verified sitemap, that is essentially telling Google these are the important pages on my site. Would you want to risk the importance of the other pages by telling Google you consider only a few main categories important? They won't drop the other pages completely, but they MIGHT see them as less important.
I really don't see Google penalizing a site for incomplete sitemaps.
If you have a really large site you might also want to look into multiple sitemaps: http://googlewebmastercentral.blogspot.com.au/2006/10/multiple-sitemaps-in-same-directory.html
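As a sketch of how multiple sitemaps tie together, a sitemap index file lists the individual sitemap files (filenames here are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-products-1.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit the index file once and can add or swap the child sitemaps as the cleanup progresses.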
You might also want to look into the best situations to use rel canonical vs a 301 redirect.
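To illustrate the difference (hostnames and paths here are hypothetical): rel canonical leaves the duplicate page accessible but tells engines which URL is preferred, while a 301 physically redirects both visitors and crawlers. A sketch, assuming Apache:

```apache
# Option 1 - rel canonical: keep the duplicate page live, but add this
# to its <head> so engines consolidate signals on the preferred URL:
#   <link rel="canonical" href="https://www.example.com/category/shoes" />

# Option 2 - 301 redirect (.htaccess): the duplicate URL no longer
# resolves on its own; everything is sent to the preferred page.
Redirect 301 /category/shoes-sale https://www.example.com/category/shoes
```

Roughly: canonical when the duplicate page still needs to exist for users (e.g. filtered views), 301 when it doesn't.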
Related Questions
-
Getting 'Indexed, not submitted in sitemap' for around a third of my site. But these pages ARE in the sitemap we submitted.
As in the title, we have a site with around 40k pages, but around a third of them are showing as "Indexed, not submitted in sitemap" in Google Search Console. We've double-checked the sitemaps we have submitted and the URLs are definitely in the sitemap. Any idea why this might be happening? Example URL with the error: https://www.teacherstoyourhome.co.uk/german-tutor/Egham Sitemap it is located on: https://www.teacherstoyourhome.co.uk/sitemap-subject-locations-surrey.xml
Technical SEO | TTYH
-
How does changing sitemaps affect SEO
Hi all, I have a question regarding changing the size of my sitemaps. Currently I generate sitemaps in batches of 50k. A situation has come up where I need to change that size to 15k in order to be crawled by one of our licensed services. I haven't been able to find any documentation on whether or not changing the size of my sitemaps (but not the pages included in them) will affect my rankings negatively or my SEO efforts in general. If anyone has any insights or has experienced this with their site please let me know!
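One way to reason about this: batch size only changes how the same URL list is partitioned across files, not which URLs get submitted, so the set of pages search engines see is identical. A rough Python sketch of re-batching (function names and the 15k limit are illustrative, not from any particular tool):

```python
from typing import List

def chunk_urls(urls: List[str], max_urls: int = 15000) -> List[List[str]]:
    """Split urls into consecutive batches of at most max_urls each."""
    return [urls[i:i + max_urls] for i in range(0, len(urls), max_urls)]

def sitemap_xml(urls: List[str]) -> str:
    """Render one batch as a minimal urlset document."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

# Example: 40,000 URLs become three files instead of one 50k-limit file.
urls = [f"https://example.com/page-{i}" for i in range(40000)]
batches = chunk_urls(urls)
print([len(b) for b in batches])  # → [15000, 15000, 10000]
```

Every URL still appears in exactly one file, which is why the change shouldn't alter what gets crawled.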
Technical SEO | Jason-Reid
-
Changing URLs
As of right now we are using Yahoo Small Business. When creating a product you have to declare an ID, and when we built the site we were not aware that the ID cannot be changed afterwards, or that the ID is used as the URL. We have a couple thousand products for which we will need to update the URLs. What would be the best way to fix this without losing much juice from our current pages? Also, I was thinking that if we did them all in a couple weeks it would hurt us a lot, and the best course of action would be a slow rollout of the URL changes. Any help is appreciated. Thank you!
Technical SEO | TITOJAX
-
301 Redirects Relating to Your XML Sitemap
Let's say you've got a website and it had quite a few pages that, for lack of a better term, were like an infomercial: 6-8 pages on slightly different topics all essentially saying the same thing. You could all but call it spam.
www.site.com/page-1
www.site.com/page-2
www.site.com/page-3
www.site.com/page-4
www.site.com/page-5
www.site.com/page-6
Now you decide to consolidate all of that information into one well-written page, and while the previous pages may have been a bit spammy, they did indeed have SOME juice to pass through. Your new page is:
www.site.com/not-spammy-page
You then 301 redirect the previous 'spammy' pages to the new page. Now the question: do I immediately re-submit an updated XML sitemap to Google, which would NOT contain the old URLs, making me assume Google would miss the 301 redirects and the SEO juice? Or do I wait a week or two, allow Google to re-crawl the site and see the existing 301s, and submit an updated sitemap once they've taken notice of the changes? Probably a stupid question, I understand, but I want to ensure I'm following the best practices given the situation. Thanks, guys and girls!
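The consolidation described here can be sketched in an .htaccess file (assuming Apache; paths are taken from the example URLs in the question):

```apache
# Each old 'infomercial' page 301s to the consolidated page, so link
# equity and visitors both flow to the new URL.
Redirect 301 /page-1 http://www.site.com/not-spammy-page
Redirect 301 /page-2 http://www.site.com/not-spammy-page
Redirect 301 /page-3 http://www.site.com/not-spammy-page
Redirect 301 /page-4 http://www.site.com/not-spammy-page
Redirect 301 /page-5 http://www.site.com/not-spammy-page
Redirect 301 /page-6 http://www.site.com/not-spammy-page
```

Note the redirects live at the server level, not in the sitemap, so removing the old URLs from the sitemap does not remove the 301s themselves.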
Technical SEO | Emory_Peterson
-
Duplicate blog URLs in Magento
On one of my sites Moz is picking up 4,483 duplicate content pages. The majority of these are from the blog and video sections of our site. We're using a URL shortener, and it appears that some of the pages are indexed under both the full version of the URL and the shortened version. However, if you go to the full version you get redirected to the shorter one, so I would assume the Moz crawler should get the same redirect? We're also getting pagination shown as duplicate pages, which I would half expect, but the URLs Magento is creating are truly bizarre, e.g. http://www.xxx.com/uk/blog/cat/view/identifier/news/page/news/index.php/alarms-doorbells/?p=2 Alarms and doorbells is one of our product categories, which is displayed in the LHN on the blog page but has nothing to do with the blog itself. On another site on the same Magento instance, with the same content (they're for two different regions), we're shown as having 248 duplicate pages, again in the video and news sections, but that is a completely different scale of issue. Has anyone else encountered issues like these? I'm probably going to put a noindex in place on these two sections until we can get a solution in place, as we're completely unranked in Google on this site. Thanks
Technical SEO | ahyde
-
I have altered a URL as it was too long. Do I need to do a 301 redirect for the old URL?
Crawl diagnostics has shown a URL that is too long on one of our sites. I have altered it to make it shorter. Do I now need to do a 301 redirect from the old URL? I have altered a URL previously and the old URL now goes to the home page - I can't understand why. Anyone know what is best practice here? Thanks
Technical SEO | kingwheelie
-
GWT, URL Parameters, and Magento
I'm getting into the URL parameters in Google Webmaster Tools and I was just wondering if anyone that uses Magento has used this functionality to make sure filter pages aren't being indexed. Basically, I know what the different parameters (manufacturer, price, etc.) are doing to the content: narrowing it. I was just wondering what you choose after you tell Google what the parameter's function is. For narrowing, it gives the following options for "Which URLs with this parameter should Googlebot crawl?": Let Googlebot decide (default); Every URL (the page content changes for each value); Only URLs with value (may hide content from Googlebot); No URLs. I'm not sure which one I want. Something tells me probably "No URLs", as this content isn't something a user will see unless they filter the results (and, therefore, should not come through on a search to this page). However, the page content does change for each value. I want to make sure I don't exclude the wrong thing and end up with a bunch of pages disappearing from Google. Any help with this is greatly appreciated!
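Worth noting that the GWT parameter setting is a Google-only hint; some sites also keep filter parameters out of the crawl with robots.txt wildcard rules. A sketch, using the manufacturer and price parameters mentioned in the question (rule patterns are illustrative, so double-check them against your actual URL structure before deploying):

```txt
User-agent: *
# Block filtered listing views whether the parameter comes first or later:
Disallow: /*?manufacturer=
Disallow: /*&manufacturer=
Disallow: /*?price=
Disallow: /*&price=
```

Unlike the "No URLs" crawl setting, a robots.txt rule applies to all well-behaved crawlers, not just Googlebot.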
Technical SEO | Marketing.SCG
-
Should XML sitemaps include *all* pages or just the deeper ones?
Hi guys, OK, this is a bit of a sitemap 101 question but I can't find a definitive answer: when we're rolling out XML sitemaps for Google to chew on (we're talking ecommerce and directory sites with many pages inside sub-categories here), is there any point in mentioning the homepage or even the second-level pages? We know Google is crawling and indexing those, and we're thinking we should trim the fat and just send a map of the bottom-level pages. What do you think?
Technical SEO | timwills