How to publish duplicate content legitimately without Panda problems
-
Let's imagine that you own a successful website that publishes a lot of syndicated news articles and syndicated columns.
Your visitors love these articles and columns but the search engines see them as duplicate content.
You worry about being viewed as a "content farm" because of this duplicate content and getting the Panda penalty.
So, you decide to continue publishing the content and use...
<meta name="robots" content="noindex, follow">
This allows you to display the content for your visitors, but it should stop the search engines from indexing any pages that carry this code. It should also allow robots to crawl the pages and pass link value through them.
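As a sketch, the tag just goes in the head of each syndicated article page (the page title below is a placeholder):

```html
<!-- Head of a syndicated article page: kept out of the index,
     but links on the page are still crawled and followed -->
<head>
  <title>Syndicated Column Title (placeholder)</title>
  <meta name="robots" content="noindex, follow">
</head>
```

For PDFs or other non-HTML syndicated files, the equivalent is the `X-Robots-Tag: noindex` HTTP response header, which Google also supports.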
I have two questions.....
-
If you use "noindex" will that be enough to prevent your site from being considered as a content farm?
-
Is there a better way to continue publication of syndicated content but protect the site from duplicate content problems?
-
-
Good idea about attributing with rel=canonical.
Thanks!
-
Noindexing the syndicated articles should, in theory, minimize the likelihood of having a Panda problem, but Panda seems to be constantly evolving. You will probably see some kind of drop in rankings as the number of indexed pages for your site decreases. If you have, say, 1,000 pages total on the site and suddenly 900 are taken out of the index, this might be a problem. If it is a much smaller percentage of the site, you might not have a problem at all. Other than the number of indexed pages, I don't think you will have a problem once the syndicated stuff is noindexed.
It will probably take Google a while to re-index/un-index the pages, so hopefully it won't be a fast drop if there is one. In the long run, it is probably better to at least have the appearance of trying to do the right thing. Linking to the source, and maybe using rel=canonical tags pointing to the original articles, would also be good practice.
-
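A hedged sketch of that attribution (the URLs below are hypothetical): a cross-domain canonical in the head plus a visible source link in the body.

```html
<!-- On your copy of the syndicated article -->
<head>
  <link rel="canonical"
        href="https://www.original-publisher.example/articles/original-story">
</head>
<body>
  <article>
    <!-- syndicated article text here -->
    <p>Originally published at
      <a href="https://www.original-publisher.example/articles/original-story">
        Original Publisher</a>.
    </p>
  </article>
</body>
```

Note that Google treats cross-domain rel=canonical as a hint rather than a directive, so it consolidates signals to the original when it chooses to honor it.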
Thank you, Nick.
We will be using the "noindex" only on the pages with syndicated content. This is a Dreamweaver site, so it is easy to place the code on specific pages; the site does not use excerpts.
Do you still see a potential problem?
The question really is... "Could a site that contains a lot of syndicated content have a Panda problem if the pages that contain that content are noindexed?"
-
I am assuming you intend to use noindex only on the duplicate-content articles. Using noindex on everything would also prevent your own content from being indexed and found through Google.
If you are using WordPress or something else that can show excerpts, you could try making the article pages noindex and showing only excerpts on the main page and category pages, which would remain indexed and followed. I think that would keep the full articles out of search results and avoid duplicate content penalties, while still allowing the pages that show the excerpts to be indexed and rank OK.
The idea here is that the pages showing the excerpts would have enough text to help the home and category pages rank for the subject matter, and hopefully the content would not be seen for what it is: copied content.
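A rough sketch of that split, with placeholder URLs and excerpt text. The listing page stays indexable, while each full article page carries the noindex:

```html
<!-- Category or home page (indexable): excerpts only,
     linking through to the full articles -->
<ul>
  <li>
    <h3><a href="/articles/syndicated-story-1">Syndicated Story 1</a></h3>
    <p>First few sentences of the article shown as an excerpt...</p>
  </li>
</ul>

<!-- Head of each full article page: noindexed, links still followed -->
<head>
  <meta name="robots" content="noindex, follow">
</head>
```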
You will probably get caught by Panda eventually, but this may work as a temporary solution until you can get some original content mixed in.