Include or exclude noindex URLs in sitemap?
-
We just added noindex tags to our pages with thin content.
Should we include or exclude those URLs from our sitemap.xml file? I've read conflicting recommendations.
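For reference, the tag we added is along these lines (simplified; the exact markup depends on how your CMS outputs it):

```html
<!-- Placed in the <head> of each thin-content page; "follow" lets link equity keep flowing -->
<meta name="robots" content="noindex, follow">
```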
-
Hi vcj and the rest of you guys
I would be very interested in learning what strategy you actually went ahead with, and the results. I have a similar issue as a result of pruning, and removing noindex pages from the sitemap makes perfect sense to me. We set noindex, follow on several thousand pages with thin content (no product descriptions), and we've set things up so that when we add new descriptions and updated on-page elements, the noindex is automatically reversed. That sounds perfect; however, hardly any of the pages (3,000-4,000) have been indexed to date, so I'm looking for a feasible solution for exactly the same reasons as you.
Our metrics and optimization are comparable to, or better than, a lot of the competition, yet our rankings are mediocre, so I'm looking to improve on this.
It would be good to hear your views
Cheers
-
I'm aware that Google will get to them sooner or later.
The recommendation from Gary Illyes (from Google), as mentioned in this post, was the reason for my asking the question. Not trying to outsmart Google, just trying to work within their guidelines in the most efficient way possible.
-
Just to put things into perspective,
if these URLs are all already indexed and you have used "noindex" on those pages, sooner or later Google will re-crawl them and they will be removed. You may want them out of the index ASAP for some reason, but having them in the sitemap won't really change anything, because Google will not deindex your noindex pages just because they are in your sitemap.xml.
Google deindexes a page only when it is time to re-crawl it. Google never recommends using the sitemap for this, and you won't find that suggestion in their guidelines on blocking search indexing of results. Google also states the following:
"Google will completely drop the page from search results, even if other pages link to it. If the content is currently in our index, we will remove it after the next time we crawl it. (To expedite removal, use the Remove URLs tool in Google Webmaster Tools.)" But hey, every SEO has their own take. Some try to outsmart Google, some don't.
Good luck
-
That opens up other potential roadblocks to getting this done quickly and easily. I wouldn't consider it best practice to create what is essentially a spam page full of internal links, and Googlebot likely won't crawl all 4,000 links if you put them all there. So now you'd be talking about making 20 or so thin, spammy-looking pages of 200+ internal links each to hopefully fix the issue.
The quick, easy-sounding options often aren't the best ones. Considering you're doing all of this to fix issues that arose from an algorithmic penalty, I'd suggest following best practices when making these changes. It might not be easy, but it'll lessen the chance that a quick fix becomes the cause of, or a contributor to, a future penalty.
So if Fetch As won't work for you (considering lack of manpower to manually fetch 4000 pages), the sitemap.xml option might be the better choice for you.
-
Thanks, Mike.
What are your thoughts on creating a page with links to all of the pages we've Noindexed, doing a Fetch As and submitting that URL and its linked pages? Do you think Google would dislike that?
-
You could technically add them to the sitemap.xml in the hope that this gets them noticed faster, but the sitemap is commonly used for the things you want Google to crawl and index. Plus, placing them in the sitemap does not guarantee Google is going to get around to crawling your change or those specific pages. Technically speaking, doing nothing and just waiting is equally valid: Google will recrawl your site at some point, and the sitemap.xml only helps if Google is crawling your site and sees it. Fetch As makes Google see your page as it is now, which is like forcing part of a crawl. So technically Fetch As will be the more reliable, quicker choice, though it is more labor-intensive. If you don't have the man-hours for a project like that at the moment, then waiting or using the sitemap could work for you. Google even suggests using Fetch As for URLs you want them to see that you have blocked with meta tags: https://support.google.com/webmasters/answer/93710?hl=en&ref_topic=4598466
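If you do go the sitemap route, a minimal fragment like this (placeholder URL and date) with an updated lastmod is a common way to hint that those pages have changed and are worth recrawling:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Placeholder entry: a recently noindexed page you want recrawled sooner -->
  <url>
    <loc>http://www.example.com/some-thin-product-page</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
</urlset>
```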
-
There are too many pages to do that (unless we created a page with links to all of the Noindexed pages, then asked Google to crawl that and all linked pages, though that seems like it might be a bad approach). It's an ecommerce website and we Noindexed nearly 4,000 pages that had thin or duplicate content (manufacturer descriptions, no description on the brand page, etc.) and no organic traffic in the past 90 days.
This site was hit by Panda in September 2014 and isn't ranking for things it should be, even though our pages have better backlink profiles, higher DA/PA, better content, etc. than our competitors'. Our thought is we're not ranking because of a penalty against thin/duplicate content. So we decided to Noindex these pages, improve the content on products that are selling and getting traffic, then work on improving the pages we've Noindexed before switching them back to Index.
Basically following recommendations from this article: https://moz.com/blog/pruning-your-ecommerce-site
-
If the pages are in the index and you've recently added a NoIndex tag with the express purpose of getting them removed from the index, you may be better served submitting crawl requests in Search Console for the pages in question.
-
Thanks for your response!
I did some more digging. This seems to contradict your suggestion:
https://twitter.com/methode/status/653980524264878080
If the goal is to have these pages removed from the index, and having them in the sitemap means they'll be picked up sooner by Google's crawler, then it seems to make sense that they should be included until they're removed from the index.
Am I misinterpreting this?
-
Hi
The reason you submit a sitemap to a search engine is to aid the crawling process for the pages you want indexed. It speeds up crawling and lets the search engine discover pages that have no internal links pointing to them, etc.
A "noindex" tag does the opposite.
So no, you should not include noindex pages inside your sitemap files.
In general, you should also avoid including pages that don't return a 200 status code.
Good luck
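As a rough illustration of that filtering idea, here's a minimal Python sketch (assuming the third-party requests library and a hand-maintained URL list; a real build would pull URLs from your CMS and parse the HTML properly) that only writes sitemap entries for pages returning 200 with no noindex signal:

```python
import re
import requests

# Hypothetical candidate URLs; in practice these would come from your CMS or a crawl.
CANDIDATE_URLS = [
    "http://www.example.com/product-a",
    "http://www.example.com/product-b",
]

def is_sitemap_worthy(url):
    """Keep a URL only if it returns 200 and carries no noindex directive."""
    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        return False
    # Respect a noindex sent via the X-Robots-Tag response header...
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return False
    # ...and a robots meta tag in the HTML (crude regex check, fine for a sketch).
    if re.search(r"<meta[^>]+name=['\"]robots['\"][^>]*noindex", resp.text, re.I):
        return False
    return True

keep = [u for u in CANDIDATE_URLS if is_sitemap_worthy(u)]

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in keep:
        f.write("  <url><loc>{}</loc></url>\n".format(url))
    f.write("</urlset>\n")
```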
Related Questions
-
Does sitemap size affect SEO?
So I've noticed that the sitemap I use has a capacity of 4,500 URLs, but my website is much larger. Is it worth paying for a commercial sitemap that encompasses my entire site? I've also noticed that of the 4,500 URLs that have been submitted, only 104 are indexed. Is this normal? If not, why is the index rate so low?
Technical SEO | moon-boots
-
Trailing Slashes on URLs
Hi everyone, I have a question on trailing slashes in URLs. The crux of it is this: is having both example.com/subdirectory/ and example.com/subdirectory on all of your subdirectories considered duplicate content by Google, or in some other way really bad? We have done a heck of a lot of research into this, and it would seem no one knows for sure (it is easy to get lost in a sea of Webmaster Tools forums from 2012). Google itself has both URLs for its subdirectories (try https://www.google.co.uk/maps and https://www.google.co.uk/maps/), as does Moz; and yet there are some rumblings on the internet from people who think you must put a 'redirect' (although not really a redirect, as it isn't a 301) in your htaccess file to one or the other (so, for example, example.com/subdirectory/ would 'forward' to example.com/subdirectory); this is what bbc.co.uk does. We tried putting this htaccess 'forward' in as an experiment, but I noticed our site then stopped being fully crawled by Googlebot, so we reversed it. Can anyone shed any light?
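If it helps, the usual way to commit to one convention is a proper 301 in .htaccess rather than a rewrite that serves the page without a redirect; here is a minimal sketch that forces the trailing-slash version (illustrative only, assumes Apache with mod_rewrite, so test on a staging copy before deploying):

```apache
# Illustrative only: 301-redirect any URL not ending in a slash to the slash version,
# skipping real files such as /style.css or /logo.png.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```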
Technical SEO | NickOrbital
-
How can I create a sitemap with 1,000+ pages, and should I update the sitemap frequently?
My website has over 1,000 pages, but the sitemap creator tools I know of only handle a maximum of 500 pages. How can I create a sitemap that covers my whole website?
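For what it's worth, the sitemap protocol itself allows up to 50,000 URLs per file, and larger sites typically split their URLs across several files tied together by a sitemap index; a minimal sketch (with placeholder file names) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-pages-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-pages-2.xml</loc>
  </sitemap>
</sitemapindex>
```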
Technical SEO | magician
-
How can I see the SEO of a URL? I need to know the progress of a specific landing page of my website. Not a keyword, a URL please. Thanks.
I need to know the evolution in SEO of a specific landing page (a URL) of my website. Not a keyword, a URL. Thanks. (I need to know whether it's possible to track the progress of a specific URL in Google's rankings. In other words, what SEOmoz does with keywords, but in reverse: I have a specific URL that I want to rank in the top positions on Google, and I want to see how it progresses as I apply changes. Many thanks.)
Technical SEO | online_admiral
-
Noindexed pages still indexed
I'm having a problem where Google is indexing my search results pages even though I have added the "noindex" meta tag. Is the best approach to block the robot from crawling those pages using robots.txt?
Technical SEO | Tedred
-
Block URLs with dynamic text in them
I've just run a report and I have a lot of duplicate page titles, most of which seem to be the review pages. I use Magento, and my normal URL would be something like blah-blahtext.html, but the review URL is something like blah-blahtext/reviews/category/categoryname. So I want to block the /reviews URL bit, as no one ever leaves reviews and it's not something I will be using in the future. Also, I have a dynamic navigation which creates URLs that look like product-name.html?size=2&colour=14; these are also creating duplicate URLs. Any way to fix this? While I'm asking, anyone have any tips for Magento?
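If blocking crawling of those review paths is the route you settle on, a wildcard rule along these lines is one option (illustrative only; Googlebot supports the * wildcard, but verify the pattern against your real URLs before deploying it):

```
# Illustrative robots.txt rule -- check it against real URLs first
User-agent: *
# Review pages that appear under any product path
Disallow: /*/reviews/
```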
Technical SEO | Beermonster
-
How does a sitemap affect the definition of canonical URLs?
We are having some difficulty generating a sitemap that includes our SEO-friendly URLs (the ones we want to set as canonical), and I was wondering if we might be able to simply use the non-SEO-friendly, non-canonical URLs that the sitemap generator has been producing and then use 301 redirects to send them to the canonical. Is there a reason why we should not be doing this? We don't want search engines to think that the sitemap URLs are more important than the pages to which they redirect. How important is it that the sitemap URLs match the canonical URLs? We would like to find a solution outside of the generation of the sitemap itself as we are locked into using a vendor’s product in order to generate the sitemap. Thanks!
Technical SEO | emilyburns
-
What should be noindexed on a WordPress blog?
I know this can be an "it depends" answer, so I'll try to explain. Qualifications on your answers would be great. I use the WordPress architecture for myself and clients on sites and blogs. Almost every business site we create has a blog, and I'm always working to improve results on them. My strategy has been the following:
Categories: General, main content types, general keywords. Index, follow.
Tags: Very specific, post specific, may only be used once for one post.
My categories have descriptions that are displayed on the category pages with excerpts. Tags rarely have a description but are displayed with excerpts on the page. My idea has been to index the categories to crawl the content, and they have unique content by showing the category description. Tags shouldn't be archived because they may be all over the place and may have only one post with no tag description. I'm trying to reduce duplicate content, but I don't want to limit results for my clients and myself. Should I set tags to noindex, follow, or should I have them indexed? The only thing I'm thinking with having the tags indexed is that I may be able to get additional traffic through the more specific tags (i.e. tag = meta tags, category = SEO).
Technical SEO | JaredDetroit