ROI on Policing Scraped Content
-
Over the years, tons of original content from my website (written by me) has been scraped by 200-300 external sites. I've been using Copyscape to identify the offenders. It is EXTREMELY time consuming to identify the site owners, prepare an email with supporting evidence (screenshots), and follow up two, three, sometimes fifteen times until they remove the scraped content. Filing a DMCA takedown is a last resort for sites hosted in the US, but quite a few of the offenders are in China, India, Nigeria, and other places not subject to the DMCA. Sometimes, when a site owner takes down scraped content, it reappears a few months or years later. It's exasperating.
My site already performs well in the SERPs - I'm not aware of a third party site's scraped content outperforming my site for any search phrase.
Given my circumstances, how much effort do you think I should continue to put into policing scraped content?
-
I watch my traffic's increases and decreases. You can do that with Google Analytics; I do it with Clicky. When I see an important page showing traffic losses, I go looking.
One of my retail sites suddenly stopped selling a certain product category very well. I looked into it and found that hundreds of "made in China" blogs had scraped my content.
I also have images that are often grabbed, so I watch my image search traffic for them too.
I have tens of thousands of pages on the web. It's hard to monitor all of them, but it is easy when you can download a traffic spreadsheet that has % up and % down, sort it, and then investigate. So I am being reactive instead of proactive. And really, I don't look at it as ROI; it's loss prevention.
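That sort-and-investigate workflow can be scripted. Here's a minimal sketch in Python that flags the worst-hit pages from an analytics export. The column names (`page`, `pct_change`) and the -20% threshold are assumptions for illustration; adjust them to whatever your tool's CSV actually uses.

```python
import csv

def pages_to_investigate(csv_path, drop_threshold=-20.0):
    """Flag pages whose period-over-period traffic fell past the threshold.

    Assumes an analytics export with 'page' and 'pct_change' columns;
    rename these to match your tool's export format.
    """
    flagged = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            change = float(row["pct_change"])
            if change <= drop_threshold:
                flagged.append((row["page"], change))
    # Worst drops first, so the biggest losses get investigated first.
    return sorted(flagged, key=lambda item: item[1])
```

Running this weekly against a fresh export gives you a short list of pages worth checking in Copyscape, instead of monitoring tens of thousands of pages by hand.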
-
Thanks for the detailed suggestions!
As a follow-up: what metric do you use to decide which offenders to go after and which to ignore? I simply don't have time to go after everyone who has copied my content, so I need a way to prioritize.
There are two obvious situations where action is warranted: first, when the infringement is committed by a competitor in my industry, and second, when the infringing content outperforms my own site in the SERPs. What else would you suggest?
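For concreteness, the kind of triage I have in mind could look something like this rough scoring sketch. The signals and weights here are entirely hypothetical illustrations, not a tested formula:

```python
def priority_score(offender):
    """Rough triage score for a scraping offender.

    `offender` is a dict of signals you already collect; the weights
    below are arbitrary and should be tuned to your own situation.
    """
    score = 0
    if offender.get("is_competitor"):
        score += 50  # direct business threat
    if offender.get("outranks_me"):
        score += 40  # infringing copy beating the original in the SERPs
    if offender.get("us_hosted"):
        score += 20  # DMCA is a realistic escalation path
    # Cap the traffic signal so one big site doesn't dominate the score.
    score += min(offender.get("monthly_visits", 0) // 1000, 30)
    return score
```

Sorting offenders by score, highest first, and working down the list until time runs out would at least make the prioritization explicit.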
Thanks again.
-
Over the years, tons of original content from my website (written by me) has been scraped by 200-300 external sites.
I have the same problem on multiple sites. Most of the time the scraping is not harmful. But on several occasions it has cost me thousands of dollars and forced me to abandon product lines and donate thousands of dollars' worth of inventory to Goodwill. Infringers have included the websites of many law firms, a state supreme court, a presidential candidate, an Ivy League law school, and many others. Infringers may be using images, video, or text.
It is EXTREMELY time consuming to identify the site owners, prepare an email with supporting evidence (screenshots), and follow up two, three, sometimes fifteen times until they remove the scraped content. Filing a DMCA takedown is a last resort for sites hosted in the US,....
I am not an expert in intellectual property law, so what I do or say is not advice. Filing a DMCA notice can get you sued even if you are in the right. If you file one, all of the details, including your name and why you filed, will be easily available to the person or company you complained about. They can retaliate against you, call begging you to retract the DMCA, or do anything else they want.
If I contact someone two or three times without results, I go straight to a DMCA notice. One thing I can say about Google is that they generally respond promptly about removing infringing content from their web SERPs and image SERPs. They also generally respond promptly to infringing content on Blogspot and YouTube. eBay will shut down auctions en masse in response to a DMCA notice if a seller or group of sellers is using your images or other property.
When infringing content is on the website of a university, government agency, or prominent company, they usually respond immediately to notification. I usually contact a provost, legal department, or internal manager instead of writing to "webmaster" - who was probably involved in the problem and simply does not understand intellectual property. I usually don't prepare a big document. An email pointing out the infringing work and offering a resolution of "take it down right away" will usually get fast results.
quite a few of the offenders are in China, India, Nigeria, and other places not subject to DMCA.
If you can't identify the owner of the website, or if they are outside of the USA, you can still file a DMCA notice to have the content removed from search engines or from sites like YouTube or Blogspot, which have an international user community but are owned by a US company. Some of them will insist that you deal with their infringing member; having an attorney contact them might yield quick results.
A lot of the professional spam is done from outside of the USA, but there are a few spammers and simply arrogant cowboys in the USA. DMCA is the route to take, but you do risk retaliation from some of them.
Sometimes, when a site owner takes down scraped content, it reappears a few months or years later. It's exasperating.
Yep.
I spend a good amount of time protecting my content. The problem is so big that I can usually only afford to act in situations where the scraping or infringement is costing me money, or where my content is appearing on the website of an established business or organization whose leadership would not want that happening.
I watch my analytics for traffic drops, etc. Occasionally I go out looking for infringement. The cost of policing can be astronomical. I could have a full-time employee working on this if I went after everyone, and it's not cost-effective. Most of the people who are grabbing your stuff are putting it on domains that can't damage your rankings.
A greater problem than verbatim theft, in my opinion, is the people who grab your articles and simply rewrite them. You spent tons of time doing the research and preparing the presentation. They simply do a paragraph-by-paragraph rewrite into something that is not detectable or recognizable beyond its structure.
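A quick sketch shows why rewrites slip past the tools: string-similarity checks of the kind Copyscape-style detectors rely on score a verbatim copy near 1.0, while a thorough rewrite that preserves the structure and facts scores far lower. This uses Python's standard `difflib`; the sample sentences are made up for illustration.

```python
from difflib import SequenceMatcher

def text_similarity(a, b):
    """Character-level similarity ratio between 0.0 and 1.0.

    Verbatim copies score near 1.0; a careful paragraph-by-paragraph
    rewrite shares few exact strings and can score low enough to slip
    past this kind of check entirely.
    """
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()
```

For example, comparing "We tested five budget tripods over three months" against itself returns 1.0, while comparing it against a rewrite like "Over a quarter of field use, our team reviewed 5 cheap tripod models" returns a much lower ratio, even though the second sentence steals all the substance of the first.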
Good luck.