Editing A Sitemap
-
Would there be any positive effect from editing a sitemap down to a more curated list of pages that perform, or that we hope will begin to perform, in organic search?
A site I work with has a sitemap of about 20,000 pages that is automatically generated by a Drupal plugin.
Of those pages, only about 10% really produce out of search. There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap.
For instance, would it focus Google's crawl budget more efficiently or have some other effect?
Your thoughts? Thanks! Best... Darcy
-
Hi Darcy
Looking at what has been said previously, I would agree with the train of thought that a more focused sitemap would generally be advantageous.
Andrew
-
Hi Dmitrii,
Always fun to watch Matt's Greatest Hits, in this case on the value of making things better.
I guess the "make it better or delete it" choice seems super black and white to me.
Economically, who is able to make thousands of pages dramatically better with compelling original content? So, instead, the only other option is apparently radical elective surgery and massive amputation? I guess I'd choose the chemo first, and I don't really see the downside of noindex/follow plus excluding those pages from the sitemap.
Anyway, thanks again! Best... Darcy
-
"I really read the above linked post differently than Google saying 'just delete it.'"
Well, here is a video from Matt Cutts about thin content. In this particular video he's talking about websites that have already taken a hit for thin content, but in your case it's the same, since you're trying to prevent it:
https://www.youtube.com/watch?v=w3-obcXkyA4&t=322
So, there are two options he talks about: delete it or make it better. From your previous responses I understand that making it better is not an option, so there is only one option left.
As for link juice through those pages: if those pages have a good amount of links and traffic and are quite popular on your website, then surely DON'T delete them, but rather make them better. However, I understood that those pages are not popular and don't have much traffic, so: option two.
-
Hi Thomas,
Thanks for the message.
To answer your question, part of the reason is preserving link juice via noindex/follow, and then there are some pages that serve a very, very narrow content purpose but have absolutely no life in search.
All things being equal, do you think a smaller, more focused sitemap is generally an advantage? At the extreme, on other sites I've seen sitemaps with noindexed pages in them.
Thanks... Darcy
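The strategy being weighed here (keep a page live, mark it noindex/follow, and drop it from the sitemap) can be sketched roughly as follows. The traffic threshold, page data, and function names are all made-up illustrations, not anything from the thread:

```python
# Sketch of the "curate rather than delete" approach: pages that perform
# stay in the sitemap; underperformers stay live but get a
# "noindex, follow" robots meta tag and are excluded from the sitemap.
# The threshold and the page list below are hypothetical examples.

def robots_meta(performs_in_search: bool) -> str:
    """Return the robots meta tag a page template would emit."""
    if performs_in_search:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'

def curate_sitemap(pages: dict[str, int], min_monthly_visits: int = 10) -> list[str]:
    """Keep only URLs that clear a (hypothetical) organic-traffic bar."""
    return sorted(url for url, visits in pages.items() if visits >= min_monthly_visits)

pages = {
    "https://example.com/product-a": 420,       # performs in search
    "https://example.com/old-catalog-1998": 0,  # thin / obsolete
}
print(curate_sitemap(pages))          # only the performing URL survives
print(robots_meta(performs_in_search=False))
```

The point of the sketch is that the two signals are independent: the sitemap says "crawl this", while the robots meta tag says "don't index this but still follow its links".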
-
Thanks for the suggestion, Andrew.
Whether or not priority is set in the sitemap, do you think a smaller, more focused sitemap is generally an advantage?
Thanks... Darcy
-
Thomas & Dmitrii,
Thanks for the message. With all due respect, I really read the above linked post differently than Google saying "just delete it."
Also, I don't see how deleting it preserves whatever link juice those pages had, as opposed to a "noindex, follow" and taking them out of the sitemap.
Finally, I don't necessarily equate all of Google's suggestions with "for best effect in search." I assume their suggestions mean, "it's best for Google if you..."
Thanks, again!
Best... Darcy
-
You misunderstand the meaning of that article.
"...that when you do block thin or bad content, Google prefers when you use the noindex over 404ing the page..."
They are talking about working around the problem by blocking pages INSTEAD of removing them.
So, if for whatever reason you don't want to delete a page, serving a 404 for it is worse than putting noindex on it. Basically, what they're saying is:
- if you have thin content, DELETE it;
- if for whatever reason you don't want to delete it, put NOINDEX on it.
P.S. My suggestion still stays the same. Delete all the bad content and, if you really want, return a 410 Gone status for the deleted content so that Google understands immediately that those pages are deleted forever, not inaccessible by mistake or something.
Hope this makes sense.
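The 404-versus-410 distinction above can be sketched as a tiny routing rule. The paths and the deleted-URL set are hypothetical, and a real site would normally configure this in the web server or CMS rather than in application code:

```python
from http import HTTPStatus

# Hypothetical set of URLs that were deliberately removed forever.
DELETED_FOREVER = {"/old-widgets/1998-catalog", "/discontinued/foo"}

def status_for(path: str, known_paths: set[str]) -> int:
    """Pick a status code: 200 for live pages, 410 for pages we
    deleted on purpose, 404 for anything we simply don't know about."""
    if path in known_paths:
        return HTTPStatus.OK        # 200: page still exists
    if path in DELETED_FOREVER:
        return HTTPStatus.GONE      # 410: gone permanently, on purpose
    return HTTPStatus.NOT_FOUND     # 404: might be a typo or a mistake

print(status_for("/discontinued/foo", {"/home"}))  # prints 410
```

The design point is intent: 404 says "not found, maybe temporarily or by accident", while 410 explicitly says "this page was removed and is not coming back".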
-
Darcy,
Whilst noindex would be a good solution, if the page has no benefit, why would you noindex it instead of deleting it?
-
Dmitrii & Thomas,
Thanks for your thoughts.
Removal would be one way to go. I note with some interest this post:
https://www.seroundtable.com/google-block-thin-content-use-noindex-over-404s-21011.html
According to that, removal would be the third choice, after making it better and noindexing.
With thousands of pages, making it better is not really an option.
Best... Darcy
-
Hi Darcy
I don't know about scaling the sitemap down, but you could make use of one area of the sitemap to optimise crawling and make it more efficient.
The area in question is the priority field, which basically tells the search engines which pages on your site are the most important. Priority is expressed as a value from 0.0 to 1.0 rather than a percentage; the theory is that pages with a higher priority (say 1.0) are more likely to get indexed by the search engines than pages with a lower priority (say 0.1), although not everyone in the industry agrees.
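A minimal sketch of emitting the priority field with Python's standard library follows; the URLs and priority values are invented for illustration, and the XML shape follows the sitemaps.org protocol:

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries: list[tuple[str, float]]) -> str:
    """Build a minimal sitemap where each entry is (url, priority 0.0-1.0)."""
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "priority").text = f"{priority:.1f}"
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    ("https://example.com/", 1.0),         # most important page
    ("https://example.com/archive", 0.1),  # low-priority page
])
print(xml_out)
```

Note that priority is only a relative hint to crawlers about your own pages; it does not influence ranking against other sites.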
-
"There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap."
Why not remove these from the site?
I personally believe that it'll have a positive impact. Since you're submitting this sitemap to Google, you're giving it a way of going through your whole site, so why would you feed it low-quality pages? You want to provide Google (and your users) the best possible experience, so if you've got out-of-date pages, update them, or if they're no longer relevant, delete them; a user who lands on such a page would just bounce because it's not relevant anymore.
If these out of date pages can't be found by crawling, then 100% it's best to craft your sitemap to show the best pages.
-
Hi there.
"Of those pages, only about 10% really produce out of search. There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap."
Have you considered removing those pages/sections, rather than altering the sitemap? It would make more sense I think.