Masses of duplicate content (5,168 issues found)
-
Hi Mozzers,
I have a site that has returned 5,168 issues with duplicate content.
Where would you start?
I started by sorting on Page Authority, from the highest (28) all the way down to 1. I did want to use the rel=canonical tag, since the site already has many redirects.
The duplicates are caused by various category and cross category pages and search results such as ....page/1?show=2&sort=rand.
I was already thinking of doing a URL rewrite and changing the search anyway. Is it worth redirecting everything, weighing the results against the effort of fixing all 5,168 issues?
Thanks
sm
-
Hi Guys,
Thanks for the responses. I'm going to look at the issue again with your suggestions in mind, and I'll keep you posted. Thanks again.
-
Don't look at individual URLs - at the scale of 5K+, look at your site architecture and the kinds of variants you're creating. For example, if you know that the show= and sort= parameters are a possible issue, you could go to Google and enter something like:
site:example.com inurl:show=
(warning: it will return pages with the word "show" anywhere in the URL, like "example.com/show-times" - not usually an issue, but it can be on rare occasions).
That'll give you a sense of how many cases that one parameter is creating. Odds are, you'll find a couple that are causing 500+ of the 5K duplicates, so start with those.
Search pagination is very tricky - you could canonicalize to "View All" as Chris Hill said, you could NOINDEX pages 2+, or you could try Google's new (but very complicated) way:
http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
Problem is, that doesn't work on Bing and it's pretty easy to mess up.
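As a rough sketch, the NOINDEX and rel="next"/"prev" options above would look something like this in the head of page 2 of a paginated series (the URLs here are made up for illustration):

```html
<!-- Hypothetical page 2 of a paginated category, e.g. example.com/category/widgets?page=2 -->

<!-- Option A: keep page 1 indexable, NOINDEX pages 2+
     ("follow" so links on deep pages are still crawled) -->
<meta name="robots" content="noindex, follow">

<!-- Option B: the rel="next"/"prev" scheme from the Google post
     (pages stay indexable; prev is omitted on page 1, next on the last page) -->
<link rel="prev" href="http://example.com/category/widgets?page=1">
<link rel="next" href="http://example.com/category/widgets?page=3">
```

You'd pick one approach or the other, not both - noindexing pages 2+ defeats the point of declaring the series with rel="next"/"prev".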
The rel-canonical tag can scoop up sorts pretty well. You can also tell Google in Google Webmaster Tools what those parameters do, and whether to index them, but I've had mixed luck with that. If you're not having any serious problems, GWT is easy and worth a shot.
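A minimal sketch of how rel-canonical scoops up sort variants, again with hypothetical URLs: every sorted or filtered variant of a category page points back at the clean, unparameterized URL.

```html
<!-- Served on http://example.com/category/widgets?show=20&sort=rand -->
<link rel="canonical" href="http://example.com/category/widgets">
```

With that in place, all the show=/sort= permutations consolidate to one indexed page rather than counting as duplicates.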
-
Have a look at your pagination too. If you haven't got a 'show all' link, it might be worth putting one in and making that the canonical. That should eliminate some of your duplicate content issues.
-
The last time I came across such an issue, I mostly started with the 'easy' changes that reduced the number the most.
In that case, it was implementing a 301 to the www version of the site (cutting the errors in half) and putting a canonical on one search page.
This got the number down to the point where it was easier to decide whether it was worth making friendlier URLs, and to discover more interesting places where duplicate content was being generated.
It's one of those things where I would always aim for zero where I can. It usually means either that the URL or site structure can be improved significantly, or that the fix is so easy it's hard to justify not doing it.
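The 301-to-www step described above might look like this in an Apache .htaccess file (hypothetical domain; the exact syntax varies by server):

```apache
RewriteEngine On
# Redirect any non-www request to the www host with a permanent (301) redirect,
# preserving the requested path and query string
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

One rule like this collapses every non-www duplicate in one shot, which is why it can cut an error count in half.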
-
If it really is a URL issue, then you should be able to simply canonicalize the root pages and the rest should sort itself out. Start there and let the next crawl tell you where you stand.