Are all duplicate content issues bad? (Blog article Tags)
-
If so, how bad?
We use tags on our blog, and this causes duplicate content issues.
We don't use WordPress, but with such a widely used CMS having the same issue, it seems quite plausible that Google would be smart enough to deal with duplicate content caused by blog article tags and not penalise sites at all.
It has been discussed here, and I'm ready to remove tags from our blog articles, or to monitor them closely to see how they affect our rankings.
Before I do, can you give me some advice around this?
Thanks,
Daniel.
-
Thanks David, at this stage I have set the site's tag pages to noindex, follow. Thanks for the blog usability feedback. Related posts are next on the to-do list for the blog.
-
I don't think standard tags get used much by visitors. Related posts, especially if accompanied by thumbnail images, perform much better in my experience.
-
Ok, thanks guys. At this stage I think this is the way I'll go, or, as David says, just use tags to organise content and not display them.
Has anyone else found anything out there (articles, videos, anything on Moz) that says Google is smart enough to deal with this, making it a non-issue?
Also, any thoughts on how important blog tags are these days for usability?
-
Agree 100% with David and Fredrico. Noindex, follow your tag pages.
-
Had the same issue myself: duplicate content constantly reported on tag pages, since a tag could sometimes show the same content across pages (when paginated).
We decided to noindex the tag pages, not only because of the duplicate content issue, but also because they don't provide anything extra to search engines; they are intended for users, so why have search engines index them? We added a noindex but NOT a nofollow, as we WANT the pages they link to (the posts) to be indexed.
Sure, we lost about 7K indexed pages, but those that remain are actually the ones that deserve to be there.
I'm with David on this one.
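For reference, the noindex, follow approach discussed above is usually implemented with a robots meta tag in the tag-page template (a minimal sketch; the exact template hook depends on your CMS):

```html
<!-- In the <head> of every tag archive page: keep the tag page
     itself out of the index, but let crawlers follow its links
     so the posts it lists stay indexed -->
<meta name="robots" content="noindex, follow">
```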
-
If you're using tags internally to help organise content, you could just stop them from appearing on the front end of the site.
The alternative is to keep them on the front end, but to no-index the tag pages.
Related Questions
-
SEO issues with masking blog domain?
We have a client who would like to move their Wordpress blog into a different server from their main site's server for security reasons. However, the blog is almost 10 years old with good traffic and rankings and we'd rather not have them change the domain. The developer has come back with a URL "masking" rule in .htaccess that will display the contents of the blog placed in the new server under a subdomain but still show the blog's original URL. If we block the new subdomain from indexing to avoid duplicate content - are there any SEO implications for doing this? Will Google see it as a deceptive practice and tank the blog's rankings? Any advice is greatly appreciated.
Intermediate & Advanced SEO | roundabout0 -
Duplicate Content: Organic vs Local SEO
Does Google treat them differently? I found something interesting just now and decided to post it up http://www.daviddischler.com/is-duplicate-content-treated-differently-when-local-seo-comes-into-play/
Intermediate & Advanced SEO | daviddischler0 -
Duplicate Content Question
Brief question - SEOmoz is telling me that I have duplicate content on the following two pages: http://www.passportsandvisas.com/visas/ and http://www.passportsandvisas.com/visas/index.asp. The default page for the /visas/ directory is index.asp, so it's effectively the same page, but apparently SEOmoz, and more importantly Google, treat these as two different pages. I read about 301 redirects, etc., but in this case there aren't two physical HTML pages, so how do I fix this?
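One server-agnostic option for a case like this (a sketch, not a definitive fix; since the site serves .asp pages it may not be running Apache, so a .htaccess 301 might not be available) is to declare the directory URL as canonical from within the page itself:

```html
<!-- In the <head> of the page that is served for both /visas/ and
     /visas/index.asp: consolidate both URLs onto one canonical version -->
<link rel="canonical" href="http://www.passportsandvisas.com/visas/">
```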
Intermediate & Advanced SEO | santiago230 -
News section of the website (Duplicate Content)
Hi Mozers. One of our clients wants to add a NEWS section to their website, where they want to share the latest industry news from other news websites. I tried my best to explain the duplicate content issues to them, but they want it badly. What I am planning is to add rel=canonical from each single news post to the original source website. What do you guys think? Does that affect us in any way?
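The cross-domain canonical idea described above would look roughly like this in each republished news post (a sketch; the source URL below is a hypothetical placeholder):

```html
<!-- In the <head> of the republished news post, pointing at the
     original article on the source news site (hypothetical URL) -->
<link rel="canonical" href="http://www.example-news-site.com/original-article/">
```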
Intermediate & Advanced SEO | riyas_heych0 -
Duplicate Content Warning For Pages That Do Not Exist
Hi guys, I am hoping someone can help me out here. I have had a new site built with a unique theme, using WordPress as the CMS. Everything was going fine, but after checking Webmaster Tools today I noticed something that I just cannot get my head around. Basically, I am getting duplicate page warnings on a couple of things, one of which I think I can understand, but I do not know how to make the warning go away.
Firstly, I get a duplicate meta description warning for url 1: / and url 2: /about/who-we-are. I understand this, as the who-we-are page is set as the homepage through the WordPress reading settings, but is there a way to make the duplicate meta description warning disappear?
The second one I am getting is for /services/57/ and /services/. Both URLs lead to the same place, although I have never created the services/57/ page. The services/57/ page does not show in the XML sitemap, but Google obviously sees it because it is a warning in Webmaster Tools. If I press edit on the services/57/ page, it just goes to editing the /services/ page. Is there a way I can remove the /57/ page safely, or a method to ensure Google at least does not see it? Probably a silly question, but I cannot find a real comprehensive answer to sorting this. Thanks in advance
Intermediate & Advanced SEO | southcoasthost0 -
Mobile site version - Is it a duplication issue?
There is a blog, www.blogname.com, and someone creates 2 mobile versions: iphone.blogname.com and mobile.blogname.com. They are perfect copies of www.blogname.com (articles, tags, links, etc.). How will Google manage them? Right now, my article gets backlinks from three sites: www.blogname.com, iphone.blogname.com, and mobile.blogname.com
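One common way to consolidate the three copies (a sketch, assuming the mobile templates can be edited; blogname.com and the post path are the thread's placeholders) is a canonical tag on each mobile page pointing back to its www equivalent:

```html
<!-- In the <head> of iphone.blogname.com/some-post/ and
     mobile.blogname.com/some-post/: credit the desktop original
     so ranking signals consolidate on one URL -->
<link rel="canonical" href="http://www.blogname.com/some-post/">
```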
Intermediate & Advanced SEO | Greenman0 -
Accepting RSS feeds. Does it = duplicate content?
Hi everyone, for a few years now I've allowed school clients to pipe their news RSS feeds to their public accounts on my site. The result is a daily display of the most recent news happening on their campuses that my site visitors can browse. We don't republish the entire news item; just the headline and the first 150 characters of the article, along with a "Read more" link for folks to click if they want the full story over on the school's site. Each item has its own permanent URL on my site. I'm wondering if this is a wise practice. Does this fall into the territory of duplicate content, even though we're essentially providing a teaser for the school? What do you think?
Intermediate & Advanced SEO | peterdbaron0 -
SEO issues with IP based content delivery
Hi, I have two websites, say website A and website B. Website A is set up for the UK audience and website B is set up for the US audience. Both websites sell the same products, with some products and offers not available in either country. Website A can't be accessed if you are in the US; similarly, website B can't be accessed if you are in the UK. This was a decision made by the client a long time ago, as they don't want to offer promotions etc. in the US, and therefore don't want the US audience to be able to purchase items from the UK site.
Now the problem is that both websites have the same descriptions for the common products they sell. Search engine spiders tend to enter a site from a variety of different IP addresses/locations, so while a UK visitor will not be able to access the US version of the site and vice versa, a crawler can.
Now I have the following options:
1. Write different product descriptions for the US website to keep both the US and UK versions of the site in the Google index for the foreseeable future. But this is going to be a time-consuming and expensive option, as there are several hundred products common to both sites.
2. Use a single website to target both the US and UK audiences and make the promotions available only to the UK audience. There is one issue here: website A's address ends with '.co.uk' and website B has a different name ending with '.com', so website A can't be used for the US audience. Also, website A is older and more authoritative than the new website B, and is pretty popular among the UK audience with its .co.uk address, so website B can't be used to target the UK audience.
3. You tell me
Intermediate & Advanced SEO | DevakiPhatak2