WordPress Duplicate Content Due to Assigning Posts to Two Categories
-
It looks like Google has done a pretty deep crawl of my site and is now showing around 40 duplicate content issues for posts that I have tagged in two separate categories, for example:
http://www.musicliveuk.com/latest-news/live-music-boosts-australian-economy
http://www.musicliveuk.com/live-music/live-music-boosts-australian-economy
I use the All in One SEO Pack and have checked the noindex boxes for categories, archives, and tag archives, so Google shouldn't even crawl this content, should it?
I guess the obvious answer is to only put each post in one category, but I shouldn't have to, should I? Some posts are relevant in more than one category.
-
I have just gone to make sure that each post has only one category and to add redirects, and I've noticed that all the duplicate title issues Google has notified me about appear to redirect anyway. For example:
http://www.musicliveuk.com/latest-news/who-are-the-most-expensive-wedding-bands
and
http://www.musicliveuk.com/music-news/who-are-the-most-expensive-wedding-bands
have duplicate titles, apparently, but the first URL redirects to the second one. I use the Redirection plugin but have no redirect set up for that URL, so I'm a bit confused. And if they're redirecting anyway, why is Google flagging duplicate titles?
-
Apparently there is a 301 redirect that forwards the /latest-news/ version of the link to the /live-music/ URL. As long as that redirect is in place, there is no duplicate content issue.
If you do wish to keep a post listed in multiple categories, a canonical tag pointing from the duplicate page to the main URL will resolve the issue for search engines.
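For what it's worth, the canonical doesn't need anything fancy. Here is a rough sketch of the idea in a theme's functions.php - this is not the All in One SEO implementation, just an illustration of what the tag amounts to on single posts:

<?php
// Rough sketch only - an SEO plugin can usually output this for you.
// On single posts, print a canonical link pointing at the post's primary
// permalink, so the other category-path versions defer to one preferred URL.
add_action( 'wp_head', function () {
    if ( is_singular( 'post' ) ) {
        printf(
            '<link rel="canonical" href="%s" />' . "\n",
            esc_url( get_permalink() ) // the permalink WordPress treats as the post's main URL
        );
    }
} );

With that in place, the /latest-news/ version of a post tells search engines that the /live-music/ version (or whichever permalink WordPress treats as primary) is the one to index.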
-
Hi Samuel,
As I see it, you've got a couple of options: a) only post in one category and use tags to drive navigation to the other places you want your content to appear, or b) change your permalink structure so it doesn't include the category in the URL, which is probably not ideal. Either way, you're looking at a few 301 redirects to sort out the problems you've currently got - the Redirection plugin for WordPress is probably your best bet here.
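The Redirection plugin handles all of this through its UI, but just to illustrate the kind of rule you'd be setting up, here's a rough sketch using the /latest-news/ and /live-music/ slugs from your example (adjust to whichever category you decide to keep):

<?php
// Rough sketch only - the Redirection plugin's admin screen does the same job
// without code. 301 anything requested under /latest-news/ to the same path
// under /live-music/, so links and rankings pass to the URL you keep.
add_action( 'template_redirect', function () {
    $request = $_SERVER['REQUEST_URI'];
    if ( strpos( $request, '/latest-news/' ) === 0 ) {
        $target = str_replace( '/latest-news/', '/live-music/', $request );
        wp_redirect( home_url( $target ), 301 ); // permanent redirect
        exit;
    }
} );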
Noindexing the pages as you have won't stop the post URLs within the sub-categories from appearing in the SERPs - that is, site.com/cat/sub-cat/article and site.com/cat/another-sub-cat/article will still appear and be duplicate content - it only means the category pages themselves won't be indexed, which is probably not ideal either if you want a targeted page around live music, for example.
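To be clear about what those checkboxes are actually doing, it boils down to something like this (a rough sketch, not the plugin's real code) - a robots meta on the archive pages only, which is why the individual post URLs are untouched:

<?php
// Rough sketch of what the "noindex categories/archives/tags" settings amount
// to - the robots meta is printed on archive pages only, so single post URLs
// under each category path can still be crawled and indexed as duplicates.
add_action( 'wp_head', function () {
    if ( is_category() || is_tag() || is_date() ) {
        echo '<meta name="robots" content="noindex,follow" />' . "\n";
    }
} );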
So, to the fix: make your choice of one of the two options above. Personally I would go for a), then I would look at removing the noindex on category and tag pages. I would also look at migrating from All in One SEO to Yoast's WordPress SEO plugin.
Hope this helps
James