Duplicate Content & www.3quarksdaily.com, why no penalty?
-
Does anyone have a theory as to why this site does not get hit with a DC penalty?
The site is great and the information is good, but I just cannot understand why this site does not get hit with a duplicate content penalty, as all of its articles are posted elsewhere.
Any theories would be greatly appreciated!
-
Thank you for taking the time to respond, and with such a well-thought-out answer.
I suppose the original author would not be so bothered about 3 Quarks Daily, as at least they link to the original site and ask readers to visit it for the full article, which is obviously more than The New Dawn Liberia site does.
Do you feel that creating such a site (3 Quarks Daily) as a readers' resource of the best articles on a specific topic from across the web is a legitimate way to build a website (for personal pleasure, not profit)? And what are your thoughts on the copyright issues?
How would you feel if others re-posted your content in this way?
It is interesting that Google does not penalize duplicate-content websites, and in this specific example it is surprising that those re-posting others' content can rank higher.
(sorry for asking so many questions)
-
Hi Kevin,
Before getting into your question, it is worth clarifying that duplicate content is not, by itself, a cause of a penalty. We talk about it in "penalization" terms because Google tends to filter pages with duplicate content when they live on the same site, and because duplicate content wastes the so-called crawl budget. When it comes to content duplicated across several sites, though, there is no hard rule, even if the Scraper update was meant to bring some order to this kind of situation.
In the case of 3quarksdaily.com, you have to notice:
- it is a clearly stated curation content website (see http://www.3quarksdaily.com/3quarksdaily/aboutus.html )
- it references the original source correctly with an attribution link in the author name
The same could be said about the http://www.thenewdawnliberia.com site, an online newspaper, which also published the same article here.
Personally, I don't think that this kind of content syndication has to be penalized.
But the most important thing to notice is that it is the original source that doesn't rank first (it is 4th) for that same query! If I were its SEO, I would start investigating why.
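To make the attribution point concrete, here is a minimal sketch (my own illustration, with hypothetical URLs, not something either site actually runs) of how you could check whether a syndicated copy hands credit back to the original, either via a cross-domain rel=canonical tag, the strongest signal, or via a plain attribution link like the one 3 Quarks Daily uses:

```python
# Checks a syndicated copy for attribution back to the original article.
# All URLs passed in are hypothetical placeholders.
from html.parser import HTMLParser
from urllib.request import urlopen


class AttributionParser(HTMLParser):
    """Collects the rel=canonical target and anchor hrefs from a page."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")
        elif tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])


def check_attribution(copy_url, original_url):
    """Report how (or whether) the copy credits the original."""
    html = urlopen(copy_url).read().decode("utf-8", errors="replace")
    parser = AttributionParser()
    parser.feed(html)
    if parser.canonical == original_url:
        return "cross-domain rel=canonical -> original (strongest signal)"
    if original_url in parser.links:
        return "attribution link -> original (weaker, but fair syndication)"
    return "no attribution found (scraper-like behaviour)"


# Hypothetical usage:
# print(check_attribution("https://syndicator.example/repost",
#                         "https://original.example/article"))
```

A cross-domain rel=canonical would go further than an attribution link: it asks Google to consolidate indexing and ranking signals on the original URL, which would also speak to your observation about the original only ranking 4th.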
Related Questions
-
Rel canonical pointing at a different page instead of the duplicate page: how does Google respond?
Hi all, We have 3 pages on the same topic. We decided to use rel canonical and remove the old pages from search to avoid duplicate content. Of these 3 pages, pages 1 and 2 have very similar content, while page 3 does not. Normally we would canonical between 1 and 2. But I am wondering what happens if I canonical between 1 and 3 while 2 has the more similar content? Will Google respect it, or will we have a problem because we skipped the most similar page and pointed the canonical at another one? Thanks
Algorithm Updates | vtmoz
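No answer appears in this excerpt, but one point of mechanics helps: rel=canonical is a hint, not a directive, so an "unexpected" canonical choice is far more likely to be ignored than penalized when the pages are too dissimilar. A minimal sketch, with hypothetical URLs, of what a consistent canonical cluster for the three pages would look like:

```python
# Hypothetical URLs: pages 1 and 2 are the near-duplicates, page 3 is the
# canonical target chosen in the question. If page 3's content diverges
# too much from 1 and 2, Google will likely ignore the hint, not penalize.
pages = {
    "https://example.com/page-1": "https://example.com/page-3",
    "https://example.com/page-2": "https://example.com/page-3",
    "https://example.com/page-3": "https://example.com/page-3",  # self-canonical
}


def validate_cluster(declarations):
    """A cluster is consistent when every page points at a single target
    and that target declares itself canonical."""
    targets = set(declarations.values())
    if len(targets) != 1:
        return "inconsistent: pages point at different canonical targets"
    target = targets.pop()
    if declarations.get(target) != target:
        return "target should be self-canonical (or omit the tag entirely)"
    return f"consistent cluster; canonical target: {target}"


print(validate_cluster(pages))
```
-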
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat and the geek when you truly need them? So SEL (Search Engine Land) said recently that there's no such thing as "duplicate content" penalties: http://searchengineland.com/myth-duplicate-content-penalty-259657 By the way, I'd love to get Rand or Eric or other Mozzers aka TAGFEE'ers to weigh in on this if possible. The reason for this question is to double-check a possible "duplicate content" type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages, all hand coded. Things do very well in SEO because we designed it to do so. The site is about 6 years old in the current incarnation, with a very simple e-commerce cart (again, basically hand coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assume we have 500 products and 100 categories; that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many.
4 - In ScreamingFrog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. ScreamingFrog has also been known to crash while spidering, and we've discovered thousands of URLs on live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site and out in the wild (in Google's Supplemental Index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation? Like burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences? (See the sketch below.)
6 - Is it also possible that bumping a site that ranks well at 100 pages up to 10,000 pages or more might very well incur a link juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal page link juice, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
Algorithm Updates | seo_plus
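Point 5 above asks about crawl-budget risk from infinite page generation; that is easy to quantify from a crawl export. A minimal sketch, with hypothetical URLs, that groups crawled URLs by path so faceted parameter permutations stand out:

```python
# Groups a list of crawled URLs (e.g. from a crawler's CSV export) by path.
# Many crawlable URLs per path suggests faceted-navigation parameter
# permutations generating near-duplicate pages that compete for crawl budget.
from collections import Counter
from urllib.parse import urlsplit

crawled_urls = [
    "https://shop.example/widgets?color=red",
    "https://shop.example/widgets?color=red&size=xl",
    "https://shop.example/widgets?size=xl&color=red",  # same facets, new URL
    "https://shop.example/widgets?sort=price&color=red",
    "https://shop.example/gadgets",
]

variants_per_path = Counter(urlsplit(url).path for url in crawled_urls)
for path, count in variants_per_path.most_common():
    flag = "  <-- possible parameter explosion" if count > 2 else ""
    print(f"{count:4d} crawlable URL(s) for {path}{flag}")
```
-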
Is Having Content 'Above The Fold' Still Relevant for Website Design and SEO?
Hey there, So I have a client who recently 're-skinned' their website, and now there is little to no content above the fold. Likewise, I've noticed that since the transition to this new front-end design there has been a drop in rankings for a number of keywords related to one of the topics we are targeting. Is there any correlation here? Is having content 'above the fold' still a relevant factor in determining a website's searchability? I appreciate you reading and look forward to hearing from all of you. Have a great day!
Algorithm Updates | maxcarnage
-
Is all duplicate content bad?
We were badly hit by Panda back in January 2012. Unfortunately, it is only now that we are trying to recover.
CASE 1: We develop software products. We send out a 500-1000 word description of each product to various download sites so that they can add it to their product listings. So there are several hundred download sites with the same content. How does Google view this? Did Google penalize us for this?
CASE 2: In the above case the product description does not match any content on our website. However, there are several software download sites that copy and paste content from our website as the product description. So in this case the duplicate content matches our website. How does Google view this? Did Google penalize us for this?
Along with all the download sites, there are also software piracy & crack sites carrying the duplicate content. So, should I remove duplicate content only from the piracy & crack sites, or also from genuine download sites? Does Google reject all kinds of duplicate content, or does it depend on who hosts it? Confused 😞 Please help.
Algorithm Updates | Gautam.Jain
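No answer appears in this excerpt, but the mechanics are worth sketching: search engines detect near-duplicates by comparing overlapping word n-grams ("shingles"), and they typically filter duplicates out of results rather than penalizing the sites hosting them. A minimal illustration of the idea (my own, not Google's actual implementation):

```python
# Shingle-based near-duplicate detection: two texts are near-duplicates
# when their sets of overlapping word n-grams are mostly shared.
def shingles(text, n=4):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}


def jaccard(a, b):
    """Similarity of two shingle sets: |intersection| / |union|."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0


original = "Our tool converts legacy spreadsheets into clean reports in seconds."
syndicated = "Our tool converts legacy spreadsheets into clean reports."

# A high score suggests the pages would be folded together as duplicates,
# with Google picking one version to show, rather than anyone being penalized.
print(f"similarity: {jaccard(original, syndicated):.2f}")
```
-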
Moving content into tabs
Hi, I'm kind of an SEO noobie, so please bear with me 🙂 On one of the sites I'm working on I got a request to move large blocks of content, currently just placed on the page, into tabs. This makes sense. We tried it and it makes navigating through the information much easier for visitors. My question is: will Google consider this as hiding information? It's not loaded dynamically. It's all there when the page is loaded, in the source, but not displayed until the visitor clicks the tab. Will this cause SEO issues? Thank you!
Algorithm Updates | eladlachmi
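For what it's worth, the key fact in this question ("it's all there when the page is loaded, in the source") is checkable. A minimal sketch, with a hypothetical URL and phrase, that verifies tabbed content is present in the raw, unrendered HTML, which is what crawlers receive regardless of what is visible on first paint:

```python
# Fetches the raw server response and looks for a distinctive phrase from
# a hidden tab. If the phrase is present, the tab content is server-rendered
# (not injected later by JavaScript) and crawlers receive it.
from urllib.request import urlopen


def tab_content_in_source(url, phrase):
    raw_html = urlopen(url).read().decode("utf-8", errors="replace")
    return phrase in raw_html


# Hypothetical usage:
# print(tab_content_in_source(
#     "https://example.com/product",
#     "a distinctive sentence from the second tab",
# ))
```
-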
What's better: a .NET or a hyphenated .COM domain?
I know this is simple, but in selecting a domain for my current project I only have two options: firstname-lastname.COM or firstnamelastname.NET. I'm leaning toward the .COM after reading the how-to-choose-a-domain-name post: http://www.seomoz.org/blog/how-to-choose-the-right-domain-name Thanks
Algorithm Updates | RonSparks
-
Does this mean that exact keyword phrase anchor text is no longer the dominating ranking factor for SERPs?
http://insidesearch.blogspot.com/2011/11/ten-recent-algorithm-changes.html If so, what is the new most important factor?
Algorithm Updates | AndrewSEO
-
Is a slash just as good as buying a country-specific domain? .com/de vs .de
I guess this question comes in a few parts: 1. Would Google read a 2-letter country code that comes after the domain name (after the slash) and recognize it as a location (targeting that country)? Or does it just read it as it would a word? E.g. www.marketing.com/de for a microsite for the Germans, www.marketing.com/fr for a microsite for the French. Or would it read the de and fr as words (not locations) in the URL? In which case, would it have worse SEO (as people would tend to search "marketing france", not "marketing fr")? 2. Which is better for SEO and rankings: separate country-specific domains (www.marketing.de and www.marketing.fr) or subfolders (www.marketing.com/de and www.marketing.com/fr)?
Algorithm Updates | richardstrange
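No answer appears in this excerpt, but the missing piece is that a /de or /fr subfolder is just a string to Google unless you mark it up: geotargeting the folder (e.g. in Search Console's international targeting settings) and hreflang annotations are what declare country/language targeting. A minimal sketch, with a hypothetical domain, that prints the alternate link tags each language version should carry:

```python
# Emits the hreflang alternate tags for the subfolder setup described in
# the question. The domain is a hypothetical placeholder.
BASE = "https://www.marketing.example"

versions = {
    "de": f"{BASE}/de/",          # German-language version
    "fr": f"{BASE}/fr/",          # French-language version
    "x-default": f"{BASE}/",      # fallback for everyone else
}

for lang, url in versions.items():
    print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
```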