Tags, Categories, & Duplicate Content
-
Looking for some advice on a duplicate content issue that we're having that definitely isn't unique to us.
See, we are allowing all our tag and category pages, as well as our blog pagination, to be indexed and followed, but Moz is flagging all of that as duplicate content, which makes sense since it's the same content that appears on our blog posts.
We've decided in the past to keep these pages the way they are as it hasn't seemed to hurt us specifically and we hoped it would help our overall ranking. We haven't seen positive or negative signals either way, just the warnings from Moz.
We are wondering if we should noindex these pages and if that could cause a positive change, but we're worried it might cause a big negative change as well.
Have you confronted this issue? What did you decide and what were the results?
Thanks in advance!
-
Erica, Thank you for sticking with this and continuing to share your thoughts. It's very helpful and much appreciated!
-
Thanks Erica.
We're deindexing the tag pages for now and will see what happens. If all goes well, we might deindex the category pages as well.
Thanks!
-
EGOL is definitely correct that those pages can hold a ton of value if a brand/company has the time/resources/bandwidth to optimize them. Most don't, so it's better to noindex than to have duplicate and/or thin category pages. But if you can and will optimize, do it!
-
That makes sense. But I really want to make sure I (and others) understand, because of EGOL's earlier comments (June 2011):
"If I kept my category pages out of the search indexes I would be walking away from hundreds of search engine visitors per minute.
Do analytics to see how much traffic is coming into these pages from search, who is linking to them, how much revenue they earn and also consider their future traffic potential.
Its not good to follow generalized advice blindly." and (February 2012) ...
"I have two wordpress blogs and category pages are where most of my search engine traffic enters. Some bring in thousands per month. Most of my post pages bring in very little traffic.
"If you are not having any problem with duplicate content at present maybe it would be a good idea to allow indexing of the main page, the post pages and the category pages. Then if you do have a duplicate content problem you can remove from the index the pages that bring in the least amount of traffic."
So is the key, then, ensuring the category pages contain unique content in addition to whatever else is on them? I would have thought the unique combination created by grouping excerpts from identically tagged posts might have been enough. That content would also get updated each time a new post is published.
I'd appreciate your thoughts on this Erica.
-
You can either choose to deindex pages one by one or deindex the whole subfolder.
Since category pages usually have the same content as, or a preview of, the content on your other pages, this doesn't affect your long-tail traffic; that traffic will go to the other pages instead. The usual problem with category pages is that their content is thin or duplicate. Alternatively, you can create content just for category pages and keep them indexed to drive traffic. I worked in e-commerce pre-Moz, and we wanted to rank and land people on category pages, such as women's shirts, so we made unique, solid content for those pages.
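For readers wondering how "deindex the whole subfolder" might be done in practice, here's one hedged sketch. Assuming the tag pages all live under a /tag/ path and the site runs on Apache 2.4+ with mod_headers enabled (none of which is stated in this thread), an X-Robots-Tag response header can cover the whole path at once:

```apache
# Hypothetical .htaccess sketch (Apache 2.4+, mod_headers required).
# Sends "X-Robots-Tag: noindex, follow" for every URL under /tag/,
# asking search engines to drop those URLs from the index while
# still following the links on them to the underlying posts.
<If "%{REQUEST_URI} =~ m#^/tag/#">
    Header set X-Robots-Tag "noindex, follow"
</If>
```

In WordPress specifically, an SEO plugin's archive settings usually achieve the same thing without touching server config.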
-
This is certainly what we've heard, and it's good to hear of a real case where you went from indexing to noindexing. My bet is that we would have the same result, my hope is that we would have an increase over time, and my fear is that we'll have a decrease.
-
We do update our blog twice a week and keep a pretty good spread across our categories and tags.
The hope is that we'll have the boost you mentioned, in long-tail and whatnot, but the fear is that it could hurt us (like Erica mentioned above).
Like Erica, we haven't seen any negative signals, but we wonder if we're being affected without even knowing it, and whether setting these pages to noindex could get us a boost. It's a dream; we just don't want the opposite to happen.
-
Erica, shouldn't the decision to noindex category pages be made on a case-by-case basis? If the blog has few posts, or if posts aren't updated frequently, then the chance of category pages being viewed as thin increases, and it would make sense to noindex them.
If, on the other hand:
- category pages have different content from that of the main blog page;
- the main blog and category pages use excerpts;
- tag, archive and author pages are noindexed;
- and the blog is updated frequently;
doesn't it then make a case to index category pages? They can be a rich source of long-tail keywords and therefore a good draw for new entrants to the site as explained in this earlier Q&A post.
-
I highly recommend that you noindex category, tag, archive, and author pages in WordPress. (I assume you're using WP, though there are many similar blogging platforms out there.) The reason is that these pages come across as thin and/or duplicate content, and you risk getting hit by Panda. That doesn't always happen. My own personal blog had these pages indexed for a very long time, and I didn't have any problems. But I also didn't see any problems when I did deindex them. Then again, I don't get a ton of traffic, and I'm sure traffic levels, site popularity, and how competitive your niche is all factor into whether you land on Google's radar.
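For what that looks like in practice: a noindexed archive page serves a robots meta tag in its head. This is a generic sketch (the page title is made up, and in WordPress this tag is usually emitted by an SEO plugin rather than written by hand):

```html
<!-- Sketch of a tag/category/archive page head set to noindex.
     "noindex, follow" removes the page itself from the index but
     still lets crawlers follow its links to the underlying posts. -->
<head>
  <meta name="robots" content="noindex, follow">
  <title>Category: Widgets</title>
</head>
```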
-
Hi Bradjn, with duplicate content I would go for canonicals. Give the original page and its duplicates the same canonical URL, so search engines will know which is the original and (most of the time) won't treat the others as duplicates.
Here's some more about duplicate content and canonicals: http://moz.com/learn/seo/duplicate-content
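As a sketch of what Leonie is describing (the URLs here are invented for illustration): each duplicate view of a post points at the original via a canonical link in its head:

```html
<!-- On the duplicate page, e.g. a tag or paginated view that
     repeats the post content, point to the original post: -->
<link rel="canonical" href="https://www.example.com/blog/original-post/">
<!-- The original post either omits the tag or points to itself. -->
```

Search engines then consolidate ranking signals onto the canonical URL, though they treat it as a hint rather than a directive.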
I can't give you a good answer about whether using noindex will be a positive change. Did you check Webmaster Tools for duplicates, or only Moz?
Grtz, Leonie