Are all duplicate content issues bad? (Blog article Tags)
-
If so, how bad?
We use tags on our blog and this causes duplicate content issues.
We don't use WordPress, but with such a widely used CMS having the same issue, it seems quite plausible that Google is smart enough to deal with duplicate content caused by blog article tags and not penalise it at all.
It has been discussed here, and I'm ready to remove tags from our blog articles, or to monitor them closely and see how they affect our rankings.
Before I do, can you give me some advice around this?
Thanks,
Daniel.
-
Thanks David. At this stage I have set the tag pages to noindex, follow. Thanks for the blog usability feedback. Related posts are next on the to-do list for the blog.
-
I don't think standard tags get used much by visitors. Related posts, especially if accompanied by thumbnail images, perform much better in my experience.
-
OK, thanks guys. At this stage I think this is the way I'll go, or, as David says, I'll just use tags to organise content and not display them.
Has anyone else found anything out there (articles, videos, anything on Moz) that says Google is smart enough to deal with this, making it a non-issue?
Also, any thoughts on how important blog tags are these days for usability?
-
Agree 100% with David and Fredrico. Noindex, follow your tag pages.
-
Had the same issue myself: duplicate content was constantly reported on tag pages, since different tags could sometimes surface the same content on a page (especially when paginated).
We decided to noindex the tag pages, not only because of the duplicate content issue, but also because they don't provide anything extra to search engines. They are intended for users, so why have search engines index them? We added a noindex but NOT a nofollow, because we WANT the pages they link to (the posts) to be indexed.
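In case it helps anyone, here's a minimal sketch of what that looks like in the head of each tag page (where the template lives depends on your CMS):

```html
<!-- Tag/archive pages only: keep the page itself out of the index,
     but let crawlers follow its links through to the posts. -->
<meta name="robots" content="noindex, follow">
```

The follow part is what keeps link equity flowing through the tag page to the posts it lists.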
Sure, we lost about 7K indexed pages, but the ones that remain are actually the ones that deserve to be there.
I'm with David on this one.
-
If you're using tags internally to help organise content, you could just stop them from appearing on the front end of the site.
The alternative is to keep them on the front end, but to noindex the tag pages.
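If editing page templates is awkward, the same directive can also be sent as an HTTP header at the server level. A sketch for nginx, assuming (hypothetically) that your tag pages all live under a /tag/ path:

```nginx
# Applies only to URLs under /tag/; adjust to match your own tag page URLs.
location /tag/ {
    add_header X-Robots-Tag "noindex, follow";
}
```

Google treats the X-Robots-Tag header the same as the equivalent meta robots tag, so either approach works.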