Is there a way to make Google realize/detect scraper content?
-
Good morning,
Theory states that duplicated content reduces certain keywords' positions in Google. It also says that a site that copies content will be penalized. Furthermore, we have spam report tools and the scraper report to report these bad practices.
In my case, the website both sells content to other sites and writes its own content, which is not for sale. However, other sites copy this original content and publish it, and Google does not penalize their positions in results (neither in organic results nor in Google News), even though they have been reported using Google's tools for that purpose.
Could someone explain this to me? Is there a way to make Google realize/detect these bad practices?
Thanks
-
I've found backlinks on scraper websites pointing to the scraped website I am taking care of.
They appear in CSS, images, and forms.
What's the point of doing it on their side?
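One likely explanation for those stray backlinks is that scrapers copy pages wholesale, so absolute URLs in `href`, `src`, form `action` attributes, and inline CSS survive the copy unchanged. As a rough illustration (the domain and markup below are made up, and this is only a sketch, not a complete audit tool), a stdlib-only Python script can scan a scraped page for references that still point back at the origin:

```python
import re
from html.parser import HTMLParser

# Hypothetical domain of the site being scraped.
ORIGIN = "example-original-site.com"

class LeftoverLinkFinder(HTMLParser):
    """Collect tag attributes that still point at the original domain."""

    def __init__(self):
        super().__init__()
        self.hits = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            # href (links), src (images/scripts), action (forms)
            if name in ("href", "src", "action") and value and ORIGIN in value:
                self.hits.append((tag, name, value))

def find_leftover_links(html_text):
    finder = LeftoverLinkFinder()
    finder.feed(html_text)
    # Inline CSS such as url(https://origin/...) also survives wholesale copies.
    pattern = r"url\(['\"]?(https?://[^'\")]*%s[^'\")]*)" % re.escape(ORIGIN)
    css_refs = re.findall(pattern, html_text)
    return finder.hits + [("style", "url", u) for u in css_refs]

scraped = """
<img src="https://example-original-site.com/logo.png">
<form action="https://example-original-site.com/subscribe"></form>
<div style="background: url(https://example-original-site.com/bg.jpg)"></div>
"""
for tag, attr, url in find_leftover_links(scraped):
    print(tag, attr, url)
```

Run against a real scraped copy, each hit is a leftover reference the scraper never rewrote, which is why these accidental backlinks show up pointing at your site.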
-
Stolen content is a big issue today, and recent reports have shown that people who steal your content can knock you out of your search engine position, no matter what your authority, backlink, or social share profiles look like.
This great presentation given by Jon Earnshaw at Brighton SEO last week gives a better idea of how it has affected other websites: http://www.slideshare.net/jonathanearnshaw/is-your-content-working-better-for-someone-else
Google used to have a Scraper Report tool where you could report the offending site and get it removed from the SERPs, but they have since removed it.
I found a similar way to report stolen content in this blog post:
http://www.techng.info/removing-your-stolen-content-from-google-search-using-dmca/
Hope this answers your question, even if it comes a while after the original post.
-
Hello,
The reporting tools are not particularly useful in this scenario, as duplicate content is not a penalty-worthy situation. While Panda targets spam-oriented content, duplicate content is treated more as a null/void situation than as a penalty.
For example, when you publish newly created original content and it is crawled and indexed, Google attributes your domain as the origin of that content. If another website showcases this content, it is recognized as duplicate by Google (which has compared it to your indexed version) and given no benefit or penalty. In effect, using duplicate content is merely a neutral practice; it's the spam that Google is really after.
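Google's actual duplicate-detection system is not public, but the origin-attribution idea above rests on being able to recognize near-duplicate text at all. A toy version of that (all sample strings below are made up, and real systems use far more sophisticated fingerprinting) compares overlapping word "shingles":

```python
def shingles(text, k=3):
    """Set of k-word windows ("shingles") extracted from a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Shingle-set overlap between two texts: 1.0 = identical, 0.0 = disjoint."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "our guide to fixing duplicate content issues on large sites"
scraped = "our guide to fixing duplicate content issues on large sites"
rewrite = "a short article about something entirely different"

print(jaccard(original, scraped))  # 1.0: flagged as a duplicate
print(jaccard(original, rewrite))  # 0.0: treated as distinct
```

A high overlap score lets an indexer cluster the two pages together and keep only the copy attributed as the origin, which matches the "no benefit, no penalty" behaviour described above.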
Here's a beginner's report on duplicate content that spells it out quite nicely:
https://moz.com/learn/seo/duplicate-content
As Charles mentioned, copied content is not an automatic ban sentence. If it is within "acceptable limits" there is no detrimental impact on the website. However, if the website is made up purely of content copied from multiple sources, and spams links or keyword stuffs, it will be dealt with accordingly.
In short, this website will not be penalized in the fashion you desire unless they are spamming or keyword stuffing (among other penalty-worthy offences). Your best bet is to beat them out by building up your link profile and continuing to post valuable, original content.
Let me know if there is anything else I can help with.
Rob
-
"Theory states that duplicated content reduces certain keywords' positions in Google."
Wrong. Google might omit duplicate results or ban sites that overdo it, but it doesn't lower rankings based on the number of duplicates out there. Otherwise Wikipedia, or aggregator websites like car-listing sites, would be nowhere to be found.
"It also says that a site that copies content will be penalized."
Semi-wrong. It will only be penalized if it's spammy and overdoing it.
Watch this video of Matt Cutts on duplicate content - https://www.youtube.com/watch?v=mQZY7EmjbMA
So, my understanding is that there is no 100% reliable way of taking down scrapers, because some of them are actually "good" scrapers. Like Facebook, the biggest scraper in the world.
So, to beat them in the rankings, just make sure that you are an authority in your industry, have an awesome backlink profile, and have all aspects of SEO properly implemented. And yes, sometimes those reporting tools can help.