Are tags creating a duplicate content issue?
-
Hello, I believe a lot of us use tags in our blogs as a way to categorize content and make it easily searchable, but this usually (at least in my case) causes duplicate content.
For example, if one article has two tags like "SEO" & "Marketing", then this article will be visible and listed at two different URLs inside the blog.
In the case of a blog with 300+ posts and dozens of different tags, this creates a huge issue.
My questions are: 1. Is this really bad? 2. If yes, how do I fix it without removing tags?
-
I have had different meta content for a long time, but the pages still show as duplicates, and just looking at the body content, it is identical. Is there any quick way I can manually add something to the robots file to take the duplicates away? Canonical is not working for me, as it just points to the same URL, not the MAIN one you want. There is nothing as good as Yoast for Joomla; they should make that and make a lot of money! Out of the box, Joomla is poor at SEO: if you don't know how to make menus in Joomla, your site can have massive issues. Without a tool like Moz you may never know why your quality content can't rank. Gee, thanks, Joomla.
-
Hi
Ahhhh... gotcha, I thought it was WordPress.
Your best bet is to have a unique description generated in Joomla for each tag archive. Robots.txt won't necessarily remove the URLs from Google; if you want to deindex them, you need to use the meta robots tag.
Anyhow, hope that points you in the right direction!
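For reference, a tag archive you want deindexed would carry something like this in its `<head>` (a generic snippet; exactly how you add it depends on your CMS or template):

```html
<!-- In the <head> of each tag archive page you want out of the index. -->
<!-- "noindex, follow" removes the page from search results but still -->
<!-- lets crawlers follow the links to the posts it lists. -->
<meta name="robots" content="noindex, follow">
```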
-Dan
-
Dear Dan,
Thank you so much for spending time on our issue and for the advice. I'm looking forward to reading your article.
Unfortunately, our blog, for technical reasons, is not in WordPress but in Joomla, so I will look for a similar solution there. The desperate solution, I guess, is to disallow tag URLs in robots.txt, but I would try to avoid that. On the other hand, since I also use categories to index the content, I assume this will not create any issue of hiding content.
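For what it's worth, that "desperate" robots.txt route would look something like the snippet below. The path is an assumption based on Joomla's default tag component URLs, so adjust it to match your actual URLs, and keep in mind that Disallow only blocks crawling; it does not remove URLs that are already indexed:

```txt
# robots.txt - block crawling of Joomla tag component URLs
# (path assumes Joomla's default /component/tags/ routing;
#  adjust to whatever your tag URLs actually look like)
User-agent: *
Disallow: /component/tags/
```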
-
Hey Guys
Again, whether full posts or excerpts are being shown for tag archives is important (I would vote for excerpts), but see my answer above. The tag archives all have the same description. That's where Moz is likely getting the duplicate errors from, and not likely because the tag pages are similar to post or category pages.
The quick fix for this is to use an SEO plugin like Yoast and create a description template for the tag archives.
But the BEST-case scenario, in a perfect setup, would be to have tags totally unique from categories and to not index tag archives at all.
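In Yoast, that template lives in the taxonomy settings (Titles & Metas, though the exact location varies by version) and uses Yoast's snippet variables; the wording below is just an example:

```txt
# Yoast SEO meta description template for tag archives.
# %%term_title%% and %%sitename%% are Yoast snippet variables;
# every tag archive renders its own unique description from this one template.
Posts tagged "%%term_title%%" on %%sitename%%: articles, tips, and guides about %%term_title%%.
```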
Canonicals should only be used sparingly and when no other measure can be taken.
It also seems this is not the best theme, so there are other issues at play as well, too many to go through in just a Q&A format.
-Dan
-
Hi
Just want to add my two cents to this... a canonical should really be the last resort, used only when the issue can't be resolved with robots meta, URL structure, or content.
The issue here is that Moz is bringing back duplicate content errors because the tags all have the same description. This can be fixed (as noted in my full answer) by creating a description template for tag archives with a plugin like Yoast SEO.
The canonical may not resolve anything because the tag pages at best shouldn't be indexed to begin with - and if they are indexed, the descriptions should be unique.
-Dan
Edit - just realized they are using Joomla. The same can apply, but I'm not as familiar with Joomla, so if there's a way to create descriptions for the tags with Joomla that's the best bet still.
-
Hi!
Just need to clear things up here; sorry I'm a little late in responding!
1. Quick Fix - Create a description template for tag archives
You're getting duplicate errors because your tag archives all have the same meta description. Use an SEO plugin (Yoast SEO for WordPress, or something equivalent for Joomla) to create a template for your tag descriptions. This will give each tag archive a unique description and eliminate the duplicate errors.
2. Long Term Fix - Root of The Problem
The real ROOT of the issue is a combination of a possibly poor theme, no SEO plugin (that I can see), and tag pages being used incorrectly.
-
Tags should be completely different from any categories
-
And as standard practice, I NOINDEX tags, because their content is so similar to other pages, and it also may not be the best user experience. There may be exceptions to this, but it's a general rule I follow.
Now, with that said, don't just go deindexing your tag archives.
Tomorrow (May 8th, 2012), I have an extensive article going up on the Moz blog about WordPress and duplicate content. I suggest reading that article to get a good understanding of how all the elements work together. Perhaps in the long term you can work towards a more robust WordPress setup, but for now, no harm is done the way it is.
Hope that helps!
-Dan
Edit - Realized they are using Joomla. The same concepts apply, but with a technical implementation that works with Joomla (which I am not as familiar with).
-
I think you should be good leaving it alone, then.
You could put rel=canonical on the post page only (don't put it on the tags or category pages) but that might be more trouble than it's worth, depending on the restrictions imposed by the CMS.
-
I don't believe the actual tag pages are the issue here. It's the fact that the same page can be accessed via three different URLs because of the tags it's under. Canonical links will take care of this.
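Concretely, every URL variant of the same post would point at the one preferred URL with a canonical link in the `<head>`; the URL below is a placeholder in the style of the examples in this thread:

```html
<!-- On every URL variant of the same post, declare one preferred URL. -->
<!-- The href is a placeholder; use the post's single canonical URL. -->
<link rel="canonical" href="https://domain.com/blog/great-post-1">
```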
-
I am not sure if it's possible with the publishing system you are working with, but there are CMS systems on the market that have solved this issue.
They take the following approach:
Create your main article, blog post, etc., tag it with your keywords, and on each keyword page show the article content only as a teaser with a "Read More" link to the full content page.
This is not considered duplicate content!
Hope this helps!
-
Hi Pantelis,
I think that whether or not this is a problem, and how it should be fixed, depends on how your blog is set up.
The guide Justin mentioned is a good resource. Before you jump in, I think you should consider these questions:
When you go to domain.com/blog/seo etc. are the posts excerpted, or are full posts being displayed?
When someone clicks on the title of a blog post having found it under a tag (e.g. going to domain.com/blog/marketing and clicking on one of the posts) what URL is being displayed for the individual post?
e.g. is it domain.com/blog/seo/great-post-1 or is it domain.com/blog/great-post-1 ?
What really matters for duplicate content and canonicalization is whether the URL for the individual blog post is unique.
If the blog post has one unique URL, no matter how you get to it, and the tag pages display only excerpts, then the only place you should be using rel=canonical is on the blog post itself. I think putting rel=canonical on a tag page that's only displaying titles and excerpts is asking for trouble: I don't like the idea of the search engines potentially treating your tag page, which has partials of many posts, as the original source.
If you're displaying full blog posts on the tag pages, then the solution is probably to switch them to excerpts and canonicalize only the individual blog posts.
Reference the SEOmoz blog: The SEOmoz.org/blog page doesn't use rel=canonical, and only displays excerpts, while seomoz.org/blog/post-title uses rel=canonical and displays the full post.
-
It's not really bad, but there is every chance it will affect your rankings, as Google will not know which page is dominant and, in turn, will not know which version it should show to searchers.
The best method of resolving the issue is to use the rel=canonical tag, as this allows you to tell Google which page is the dominant version.
see article here for more details: