Are all duplicate contents bad?
-
We were hit hard by Panda back in January 2012. Unfortunately, it is only now that we are trying to recover.
CASE 1:
We develop software products. We send out a 500-1,000 word description of each product to various download sites so that they can add it to their product listings. As a result, several hundred download sites carry the same content. How does Google view this? Did Google penalize us for this reason?
CASE 2:
In the case above, the product description does not match any content on our website. However, there are also several software download sites that copy and paste content from our website to use as the product description. So in this case, the duplicate content does match our website.
How does Google view this? Did Google penalize us due to this reason?
Along with all the download sites, there are also software piracy and crack sites that host the duplicate content.
So, should I pursue removal of the duplicate content only from the software piracy and crack sites, or also from genuine download sites?
Does Google reject all kinds of duplicate content, or does it depend on who hosts the duplicate content?
Confused. Please help.
-
It is tricky. As Michael said it is important to get your content indexed first, which can help identify you as the source. Google doesn't always do a great job of that. Generally, I don't worry too much about Case 1, but in your case, it can be tougher. The problem is that many download sites can have very high authority and could start outranking you for these product descriptions. If that happens, it's unlikely you'd be penalized, but you could be filtered out or knocked down the rankings, which might feel like a penalty.
Here's the thing with Case 1, though: if these download sites are simply outranking you but are also distributing your product, is it so awful? I think you have to look at the trade-off through the lens of your broader business goals.
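One partial safeguard for Case 1, if any of the download sites will cooperate, is asking them to add a cross-domain rel=canonical on their listing page pointing back at your own product page, which tells Google which copy is the original. A hypothetical sketch of what would go in the download site's page head (the domain and path are made up for illustration):

```html
<!-- On the download site's product listing page, pointing back to the
     original description on the software vendor's site.
     The URL below is a hypothetical example. -->
<link rel="canonical" href="http://www.example-vendor.com/products/widget-pro/" />
```

Even a plain link back to your product page helps Google associate the syndicated copies with the source, if a canonical tag is too much to ask of the download sites.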
Case 2 is tougher, since there's not a lot you can do about it, short of DMCA takedowns. You've got to hope Google sorts it out. Again, getting in front of it and getting your content in the index quickly is critical.
If you were hit by Panda, I'd take a hard look at anything on your own site that could be harming you. Are you spinning out variations of your own content? Are you creating potentially duplicate URLs? Are you indexing a ton of paginated content (internal search results, for example)? You may find that the external duplicates are only part of your Panda problem; if you can clean up what you control, you'll be much better off. I have an extensive duplicate content write-up here:
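To make the "clean up what you control" point concrete, here is a hedged sketch of the two most common on-site fixes: a meta robots tag to keep internal search result pages out of the index, and a rel=canonical to consolidate duplicate URL variants of the same page. The URLs and parameters are illustrative, not taken from the asker's site:

```html
<!-- On internal search result pages (e.g. /search?q=...): keep them
     out of the index but let Google follow the links on them. -->
<meta name="robots" content="noindex, follow" />

<!-- On duplicate URL variants of a page (e.g. /product?sort=price):
     point Google at the one preferred URL.
     The URL below is a hypothetical example. -->
<link rel="canonical" href="http://www.example.com/product/" />
</link>
```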
-
For all new content, it is important to get indexed fast. If your site is crawled infrequently, another site may get its copy of your content indexed first, and by default that copy is viewed as theirs. So with any new content, I would post on social media as quickly as possible (Google+, Twitter, etc.) to get it noticed and to mark it as yours. Google+ authorship attribution will help.
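Alongside social links, one concrete way to speed up discovery is keeping an XML sitemap with accurate lastmod dates and submitting it in Google Webmaster Tools, so new pages are crawled soon after publication. A minimal sketch following the sitemaps.org protocol (the URL and date are made-up examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Hypothetical new product page; update lastmod when content changes -->
    <loc>http://www.example.com/products/new-product/</loc>
    <lastmod>2013-04-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```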
-
Hi Gautam,
Good questions. It's really hard to say what Google determines to be duplicate content, so this is just my hunch on your issue. In my experience, Google won't 'penalize' you: you're the owner of the content, and you shouldn't suffer for other people stealing or copying it. The same applies where you have provided these sites with your content yourself, mostly because you're often not in charge of content management on somebody else's site.
Hope this helps a bit!