Website content has been scraped - recommended action
-
So whilst searching for link opportunities, I found a website that has scraped content from one of our websites. The website looks pretty low quality and doesn't link back. What would be the recommended course of action?
-
Email them and ask for a link back. I've got a feeling this might not be the best idea: the website does not have much authority (yet), and a link might look a bit dodgy considering the duplicate content.
-
Ask them to remove the content. It is duplicate content and could hurt our website.
-
Do nothing. I don't think our website will get penalised for it, since our content was here first and is on the better-quality website. Possibly report them to Google for scraping?
What do you guys think?
-
It's good to be aware of the scrapers to see what they are trying to do with your content, and it can't hurt to ask them to remove it.
Don't ask for a link; you never want links from sites that rely on bad practices like that, as they can hurt you.
This will most likely not affect you if left alone. If the scraper is grabbing your source code, then implementing a canonical tag on your content will help Google know where the content came from (but they probably already know).
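For reference, the canonical tag is a single line in the page's <head>; a minimal example (the URL is a placeholder):

```html
<!-- In the <head> of the original page on your own site -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```

A scraper that copies your source code wholesale will copy this tag too, which points Google back at your URL as the original.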
-
Most of the time, contacting them is wasted effort. Being a weasel is their business model, and weasels usually hide their domain registration data, so finding their contact information is really hard.
If they have republished my content on Blogspot, YouTube, Facebook, or other community sites, I simply file a DMCA notice and the content is usually taken down quickly.
I don't want duplicates of my content on the web, especially not on powerful sites. Powerful sites are generally more responsible than Joe Schmoe working in his basement. Often just an email telling them there is "copyright infringement on yourdamndomain.com" will get your content taken down. I've called people on the phone to tell them that they have my stuff on their site, and that is faster than filling out forms. Be nice, not threatening, and they usually comply if you get them on the phone.
I don't ask for links because I don't want weasels linking to me.
-
Was your site's scraped content already indexed in Google?
Related Questions
-
Purchasing duplicate content
Morning all, I have a client who is planning to expand their product range (online dictionary sites) into new markets and is considering the acquisition of data sets from low-ranked competitors to supplement their own original data. These are quite large content sets, and they would mean a very high percentage of the site (hosted on a new subdomain) would be made up of duplicate content. Just to clarify, the competitors' content would stay online as well. I need to lay out the pros and cons of taking this approach so that they can move forward knowing the full facts. As I see it, this approach would mean forgoing rankings for most of the site and would need a heavy dose of original content as well as supplementary on-page content built around the data. My main concern is that launching with this level of duplicate data would end up damaging the authority of the site and subsequently the overall domain. I'd love to hear your thoughts!
Technical SEO | | BackPack851 -
Sitemap international websites
Hey Mozzers, here is the case I would appreciate your reply on: I will build a sitemap for a .com domain which has sister domains for other countries (Italy, Germany, etc.). The first question is whether I can put the hreflang annotations in sitemap 1 only, have a sitemap 2 with all URLs for the EN/default .com version, and then put both sitemaps in a sitemap index (a sketch of this split structure is below). The issue is that there are localised pages that go away quickly (within 1-2 days), and I prefer not to give annotations for them; I want to keep the lang annotations in sitemap 1 clean. That way I can replace only sitemap 2 and keep sitemap 1 intact. Would it work, or am I better off putting everything in one sitemap?
The second question is whether you recommend doing the same exercise for all subdomains and other domains. I have read a lot on the topic but am not sure whether it's worth the effort.
The third question: if I have www.example.it and it.example.com, should I include both in the sitemap with hreflang annotations (the sitemap on www.example.com), using "it" for the subdomain and "it-IT" for the .it domain (to specify lang and lang + country)?
Thanks a lot for your time and have a great day, Ani
Technical SEO | | SBTech0 -
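For illustration, a minimal sketch of the split-sitemap structure described in that question. Domains and file names are placeholders; note that per the sitemaps protocol, each <url> carrying hreflang must list all of its alternates, including itself:

```xml
<!-- sitemap-index.xml: references both child sitemaps -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap1.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap2.xml</loc></sitemap>
</sitemapindex>

<!-- sitemap1.xml: stable pages with their hreflang annotations -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/page/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/page/"/>
    <xhtml:link rel="alternate" hreflang="it-IT" href="https://www.example.it/page/"/>
  </url>
</urlset>

<!-- sitemap2.xml would then be a plain <urlset> of the short-lived
     EN/default URLs with no hreflang entries, replaced as needed -->
```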
Duplicate content issues arise 6 months after creation of website?!
Hi, I've had the same website for 6 months and fixed all the original on-site issues a long time ago. Now this week I woke up and found 3 new errors: 3 of my pages have missing-title issues and missing-meta-description issues, and the Moz crawl says they all have duplicate content issues. All my rankings went down a lot as well. This site is static and doesn't even have a blog; everything is rel canonical and non-indexed. It's 100% original content as well. So how can those issues arise 6 months later? All my titles and descriptions are there and non-duplicate, and the content is original, not duplicate. Is this a WordPress bug or a virus? Has anyone had this happen to them, and how do you fix it? Thanks a lot for your help! -Marc
Technical SEO | | marcandre0 -
Mirrored content/images
We are currently in the process of creating a new website in place of our old site (same URL etc.). We've recently created another website which has the same design/layout/pictures and general site architecture as our new site will have. If I were to add alt text to images on only one site, would we still be penalised by Google because the sites 'look' the same, even though they will have completely different URLs and different focuses on a similar topic? Content will be different also, but both sites will focus on a similar subject. Thanks
Technical SEO | | onlinechester0 -
Duplicate Footer Content
A client I just took over is having some duplicate content issues. At the top of each page he has about 200 words of unique content. Below this are three big tables of text that talk about his services, history, etc. These tables are pulled into the middle of every page using PHP, so he has the exact same three big tables of text across every page (see the sketch below). What should I do to eliminate the duplicate content? I thought about removing the script and just rewriting the tables of text on every page... Is there a better solution? Any ideas would be greatly appreciated. Thanks!
Technical SEO | | BigStereo0 -
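To make the mechanics concrete, a rough sketch of the kind of template being described there (file and path names are assumptions, not the client's actual code):

```php
<?php
// page.php -- each page opens with ~200 words of unique copy...
echo '<p>Unique, page-specific introduction goes here.</p>';

// ...then pulls the same three large text blocks into every page,
// so every URL on the site shares one big identical chunk.
include 'tables/services.php';
include 'tables/history.php';
include 'tables/about.php';
```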
Displaying static content - risky?
In an attempt to improve the speed of our site, we have installed the Cache_Lite extension for PHP. It's a PEAR-based system which converts dynamic pages into static pages (a sketch of the pattern is below). The system is set to delete the temp files every 15 minutes, at which point any changes/new content will appear on the site. I don't see any risk in doing this, but thought it safe to double-check whether it could have any impact on Google.
Technical SEO | | ukss19840 -
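For context, the typical Cache_Lite pattern looks roughly like this (a sketch; render_page() is an assumed stand-in for the site's real page-building code, and the cache directory is a placeholder):

```php
<?php
require_once 'Cache/Lite.php';

$options = array(
    'cacheDir' => '/tmp/cache/',   // where the static temp files live
    'lifeTime' => 900              // 15 minutes, matching the setup described
);
$cache = new Cache_Lite($options);

$id = md5($_SERVER['REQUEST_URI']);   // one cache entry per URL

if ($page = $cache->get($id)) {
    echo $page;                        // serve the cached static copy
} else {
    $page = render_page();             // build the dynamic page
    $cache->save($page, $id);          // store it for the next 15 minutes
    echo $page;
}

// Stand-in for the site's real page-building code.
function render_page() {
    return '<html><body>Dynamic page rendered at ' . date('c') . '</body></html>';
}
```

Either branch returns the same HTML for a given URL, which is why the question is essentially about freshness rather than content differences.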
How do you measure content on a website?
I never thought of this question before. Maybe because I didn't focus on content myself, only on optimizing clients' existing content. So how do you measure the content on a specific page?
Technical SEO | | mosaicpro0 -
Duplicate Content
We have a main sales page and then a country-specific sales page for about 250 countries. The country-specific pages are identical to the main sales page, with the small addition of a country flag and the country name in the h1. I have added a rel canonical tag to all country pages to send the link juice and authority to the main page, because they would all be competing for rankings. I was wondering if having the 250+ indexed pages of duplicate content will affect the ranking of the main page even though they have the rel canonical tag. We get some traffic to the country pages, but not as much as to the main page. I'm worried that if we remove those pages and redirect them all to the main page (see the sketch below), we will lose 250-plus indexed pages through which we can get traffic for odd country-specific terms, e.g. searching for "uk mobile phone" brings up the country-specific page instead of the main sales page, even though the UK sales page is not optimized for UK terms other than having a flag and the country name in the h1. Any advice?
Technical SEO | | -Al-0
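If the removal-and-redirect route mentioned in that question were taken, each country page would 301 to the main page. A minimal PHP sketch (the URL is a placeholder):

```php
<?php
// country-page.php -- retire a country page with a permanent redirect
// so its traffic and link equity consolidate on the main sales page.
header('Location: https://www.example.com/mobile-phones/', true, 301);
exit;
```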