Wordpress Tag Pages - NoIndex?
-
Hi there.
I am using the Yoast WordPress plugin. I just wonder if any tests have been done on the effects of index vs. noindex for tag pages (like when tagging a word relevant to an article)?
Thanks

Martin
-
I agree with everyone above. I use tags for internal search, which they are fantastic for, but I don't use them for SEO purposes.
-
Yes. Absolutely correct. You should never index tag pages. They may be targeted, but they are low-quality pages without much unique content. Moreover, they tend to have a higher keyword density than you would want for a page you expect to rank.
-
noindex,follow is common for these pages. You want Google to be able to navigate its way around the links and pages, but letting tag pages in creates lots of additional, unnecessary pages in the index. You rarely find these pages ranking in Google even when they are left indexable.
Choose a good, logical category structure, promote your blog posts, and the pages will all be found.
-Andy
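For reference, a minimal sketch of what a noindex,follow directive looks like in the rendered page head. Yoast and similar plugins output something along these lines automatically when an archive type is set to noindex; it is not a tag you normally hand-edit:

```html
<!-- Rendered in the <head> of a tag archive set to noindex -->
<!-- "noindex" keeps the page out of search results; "follow" lets crawlers still pass through its links -->
<meta name="robots" content="noindex,follow" />
```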
-
Thanks for the reply
On their own page: https://yoast.com/articles/wordpress-seo/
[excerpt] On smaller sites it might make sense to noindex either the category or the tag structure, but in my experience noindexing those on yoast.com does little to no change at all.
There is one type of archive that is noindex,follow by default as well in the WordPress SEO plugin: the search result pages. This is a best practice from Google for which a setting is left out as you should just have that anyway.
My thinking around this is that if people are looking for "articles on topic X" then this would be more relevant to their search query, because it's a collection of exactly what they are looking for - and more geared towards the 'long tail'.
-
I have to agree with you Andy. Avoiding the hassle of having duplicate content issues makes it worth no-indexing.
-
Always noindex these where you can, Martin, as it avoids lots of duplicate pages being created. Yoast makes this nice and easy.

-Andy
Related Questions
-
Why is our noindex tag not working?
Hi, I have the following page where we've implemented a noindex tag. But when we run this page through Screaming Frog or this tool here to verify the noindex is present and functioning, it shows that it's not. But if you view the source of the page, the code is present in the head tag. And unfortunately we've seen instances where Google is indexing pages we've noindexed. Any thoughts on the example above, or why this is happening in Google? Eddy
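As background for debugging cases like this (the snippet below is a generic sketch, not the poster's page): a noindex directive can be delivered either as a meta tag in the page head or as an HTTP response header, and crawlers or audit tools can only act on it if they are allowed to fetch the page - a robots.txt block hides the directive from them entirely.

```html
<!-- Form 1: meta tag in the <head> of the page -->
<meta name="robots" content="noindex" />

<!-- Form 2: sent as an HTTP response header instead (shown here as a comment):
     X-Robots-Tag: noindex -->
```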
Intermediate & Advanced SEO | eddys_kap
-
Substantial difference between Number of Indexed Pages and Sitemap Pages
Hey there, I am doing a website audit at the moment. I've noticed substantial differences between the number of pages indexed (Search Console), the number of pages in the sitemap, and the number I am getting when I crawl the site with Screaming Frog (see below). Would those discrepancies concern you? The website and its rankings seem fine otherwise.
Total indexed: 2,360 (Search Console)
About 2,920 results (Google search "site:example.com")
Sitemap: 1,229 URLs
Screaming Frog spider: 1,352 URLs
Cheers, Jochen
Intermediate & Advanced SEO | Online-Marketing-Guy
-
Adding hreflang tags - better on each page, or the site map?
Hello, I am wondering if there is a preference for where to add hreflang tags (from this article). My client just changed their site from gTLDs to ccTLDs, and a few sites have taken a pretty big traffic hit. One issue is definitely the number of redirects to the page, but I am also going to work with the developer to add hreflang tags. My question is: is it better to add them to the head of each page, or the sitemap, or both, or something else? Any other thoughts are appreciated. Our Australia site, which was at least findable in Google Australia before this relaunch, is not showing up, even when you search the company name directly. Thanks! Lauryn
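For reference, a minimal sketch of the page-level placement being weighed up, using placeholder ccTLD URLs rather than the client's actual domains. Page-level annotations go in the <head> of every language/region variant, and each variant must list all the others (including itself) reciprocally:

```html
<!-- In the <head> of each page variant -->
<link rel="alternate" hreflang="en-au" href="https://example.com.au/widgets/" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/widgets/" />
```

The same relationships can instead be declared once in the XML sitemap, as xhtml:link alternate entries under each url element; either location works, so it is usually best to pick one method and apply it consistently rather than maintaining both.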
Intermediate & Advanced SEO | john_marketade
-
PDF or HTML Page?
One of our sales team members has created a 25-page Word document as a topical page. The plan was to make this into an HTML page with a table of contents. My thought was: why not make it a PDF? Is there any con to using a PDF vs. an HTML page? If the PDF was properly optimized, would it perform just as well? The goal is to have folks click back to our products and hopefully buy after reading about how they work.
Intermediate & Advanced SEO | Sika22
-
Using href lang tag for multi-regional targeting on the same page
Hi, I have the site au.example.com and I rank in Google Australia. I would like to also rank in Google New Zealand for the same page (au.example.com), because the two countries are geographically and culturally close. Can I place hreflang tags for both countries and present the same page? Or should I create a different page for New Zealand (for example: http://au.example.com/EN-NZ)? What will work better, or is there another solution? Hope I'm clear. Thanks!
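As a hedged illustration (not the poster's original markup): serving the same URL to both markets is normally annotated with two hreflang entries that both point at that one page, for example:

```html
<!-- In the <head> of http://au.example.com/ - both annotations reference the same URL -->
<link rel="alternate" hreflang="en-au" href="http://au.example.com/" />
<link rel="alternate" hreflang="en-nz" href="http://au.example.com/" />
```

If a separate page such as http://au.example.com/EN-NZ were created instead, the en-nz entry would point at that URL, and both pages would need the full, reciprocal set of annotations.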
Intermediate & Advanced SEO | Kung_fu_Panda
-
Meta NoIndex tag and Robots Disallow
Hi all, I hope you can spend some time to answer the first of a few questions 🙂 We are running a Magento site - the layered/faceted navigation nightmare has created thousands of duplicate URLs! Anyway, during my process to tackle the issue, I disallowed in robots.txt anything in the query string that was not a p (allowed this for pagination). After checking some pages in Google, I did a site:www.mydomain.com/specificpage.html and a few duplicates came up along with the original, with "There is no information about this page because it is blocked by robots.txt". So I had also added meta noindex,follow on all these duplicates, but I guess it wasn't being read because of robots.txt. So, coming to my questions: did robots.txt block access to these pages? If so, were these already in the index, and after disallowing them with robots.txt, could Googlebot no longer read the meta noindex? Does meta noindex,follow on pages actually help Googlebot decide to remove those pages from the index? I thought robots.txt would stop and prevent indexation? But I've read this: "Noindex is a funny thing, it actually doesn't mean "You can't index this", it means "You can't show this in search results". Robots.txt disallow means "You can't index this" but it doesn't mean "You can't show it in the search results"." I'm a bit confused about how to use these, both in preventing duplicate content in the first place and then in helping to address dupe content once it's already in the index. Thanks! B
Intermediate & Advanced SEO | bjs2010
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo, but no real information about the image - definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like such: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this: User-agent: googlebot Disallow: /community/photos/ Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow Googlebot-Image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... is this possible? Thanks! Leona
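A minimal robots.txt sketch of the approach being asked about, using the folder from the question. Google's image crawler falls back to the Googlebot group when it has no group of its own, so an explicit Googlebot-Image group is what keeps the images fetchable while the regular crawler is blocked:

```
# Block the regular web crawler from the low-quality photo pages
User-agent: Googlebot
Disallow: /community/photos/

# Give the image crawler its own group so it does not inherit the block above
User-agent: Googlebot-Image
Allow: /community/photos/
```

Whether the images themselves still surface in Image Search also depends on where the image files are hosted and how they are linked, so this is a sketch of the directive logic rather than a guarantee of the outcome.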
Intermediate & Advanced SEO | HD_Leona
-
Should I prevent Google from indexing blog tag and category pages?
I am working on a website that has a regularly updated WordPress blog and am unsure whether or not the category and tag pages should be indexable. The blog posts are often outranked by the tag and category pages, and they are ultimately leaving me with a duplicate content issue. With this in mind, I assumed that the best thing to do would be to remove the tag and category pages from the index, but after speaking to someone else about the issue, I am no longer sure. I have tried researching online, but there isn't anything that provides any further information. Please can anyone with experience of dealing with issues like this, or with any knowledge of the topic, help me to resolve this annoying issue? Any input will be greatly appreciated. Thanks, Paul
Intermediate & Advanced SEO | PaulRogers