Is all duplicate content bad?
-
We were badly hit by Panda back in January 2012. Unfortunately, it is only now that we are trying to recover.
CASE 1:
We develop software products. We send out a 500-1,000-word description of each product to various download sites so that they can add it to their product listings. As a result, there are several hundred download sites with the same content. How does Google view this? Did Google penalize us for this reason?
CASE 2:
In the above case, the product description does not match any content on our website. However, there are several software download sites that copy and paste content from our website as the product description. So in this case, the duplicate content matches content on our website.
How does Google view this? Did Google penalize us due to this reason?
Along with all the download sites, there are also software piracy & crack sites that host the duplicate content.
So, should I remove duplicate content only from the software piracy & crack sites, or also from genuine download sites?
Does Google reject all kinds of duplicate content? Or does it depend on who hosts the duplicate content?
Confused. Please help.
-
It is tricky. As Michael said, it is important to get your content indexed first, which can help identify you as the source. Google doesn't always do a great job of that. Generally, I don't worry too much about Case 1, but in your situation it can be tougher. The problem is that many download sites have very high authority and could start outranking you for these product descriptions. If that happens, it's unlikely you'd be penalized, but you could be filtered out or knocked down the rankings, which might feel like a penalty.
Here's the thing with Case 1, though. If these download sites are simply outranking you, but your product is still getting distributed, is it so awful? I think you have to look at the trade-off through the lens of your broader business goals.
Case 2 is tougher, since there's not a lot you can do about it, short of DMCA takedowns. You've got to hope Google sorts it out. Again, getting in front of it and getting your content in the index quickly is critical.
If you were hit by Panda, I'd take a hard look at anything on your own site that could be harming you. Are you spinning out variations of your own content? Are you creating potentially duplicate URLs? Are you indexing a ton of paginated content (internal search results, for example)? You may find that the external duplicates are only part of your Panda problem; if you can clean up what you control, you'll be much better off. I have an extensive duplicate content write-up here:
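That in-house audit can be started without any special tooling. As a rough sketch (the URL paths and page texts below are made up for illustration; in practice you would fetch your own live pages), you can flag near-duplicate pages on your own site by comparing normalized body text:

```python
# Sketch: flag near-duplicate pages by text similarity.
# Stand-in content; replace with text fetched from your own URLs.
from difflib import SequenceMatcher

pages = {
    "/product-a": "Acme Widget is a fast, lightweight download manager.",
    "/product-a?ref=footer": "Acme Widget is a fast,  lightweight download manager.",
    "/product-b": "Acme Reporter builds charts and reports from CSV files.",
}

def normalize(text: str) -> str:
    # Lowercase and collapse whitespace so trivial differences don't mask duplicates.
    return " ".join(text.lower().split())

urls = list(pages)
for i, a in enumerate(urls):
    for b in urls[i + 1:]:
        ratio = SequenceMatcher(None, normalize(pages[a]), normalize(pages[b])).ratio()
        if ratio > 0.9:  # threshold is a judgment call; tune for your content
            print(f"possible duplicate: {a} <-> {b} ({ratio:.2f})")
```

A crawl report (Moz's included) will surface the same URL-level duplicates, but a quick script like this helps you spot which parameterized or paginated URLs are serving essentially identical text.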
-
For all new content, it is important to get indexed fast. There is a scenario where, if your site is crawled infrequently, another site gets its copy indexed first and the content is viewed as theirs by default. So with any new content, I would post on social media as quickly as possible (Google+, Twitter, etc.) to get noticed and mark the content as yours. Google+ authorship markup will help.
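Alongside social sharing, one low-effort way to nudge a faster crawl is the sitemap "ping" endpoint the search engines have exposed (submitting the sitemap in Webmaster Tools achieves the same thing). A minimal sketch, with a placeholder domain, just builds the ping URL:

```python
# Sketch: construct the sitemap ping URL used to request a recrawl
# after publishing new content. The domain below is a placeholder.
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url: str) -> str:
    # The sitemap URL must be percent-encoded as a query parameter.
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(sitemap_ping_url("https://www.example.com/sitemap.xml"))
```

Fetching that URL (or submitting the sitemap directly in Webmaster Tools) tells the crawler your sitemap has changed, which shortens the window in which a scraper's copy could be indexed before yours.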
-
Hi Gautam,
Good questions. It's really hard to say what Google determines to be duplicate content, so this is just my hunch on your issue. In my experience, Google won't 'penalize' you, since you're the owner of the content and you can't be blamed when other people steal or copy it. The same goes for sites you provided the content to yourself, mostly because you're often not in charge of the content management on somebody else's site.
Hope this helps a bit!