Lots of websites have copied original content from my website. What should I do?
-
1. Should I ask them to remove the content and replace it with their own unique, original content?
2. Should I ask them to link to the URL where the original content is located?
3. Should I use a tool to easily track these "copycat" sites and automatically get links from their sites back to mine?
Thanks in advance!
-
Hi,
-
Yes, you can contact the site owner and ask for removal. It may or may not be worth your time.
-
You can do that too. Actually, the best way to handle this proactively going forward, if you are using WordPress, is to use an RSS footer plugin (or set this up manually) so that the full link to the original article is appended to every item in your feed. That way, the link back to your site is automatically included whenever someone scrapes your RSS feed. This would be my top proactive suggestion.
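If you'd rather set this up manually than use a plugin, here is a minimal sketch of the idea: a filter in your theme's functions.php that appends an attribution line with the canonical post URL to each feed item. The filter hooks (`the_content_feed`, `the_excerpt_rss`) are standard WordPress ones; the function name and footer wording are just examples you would adapt.

```php
// Append an attribution line with the canonical post URL to each
// item in the RSS feed. Scrapers that republish the feed verbatim
// will then carry the link back to the original article.
function myprefix_add_source_link_to_feed( $content ) {
    $footer = '<p>This article, <a href="' . esc_url( get_permalink() ) . '">'
        . esc_html( get_the_title() ) . '</a>, first appeared on '
        . esc_html( get_bloginfo( 'name' ) ) . '.</p>';
    return $content . $footer;
}
add_filter( 'the_content_feed', 'myprefix_add_source_link_to_feed' );
add_filter( 'the_excerpt_rss', 'myprefix_add_source_link_to_feed' );
```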
-
You can do this manually with Google Webmaster Tools and/or Copyscape, or you can set up a Google Alert for each post you publish, which will notify you when the content is scraped.
For detailed info on how to do some of this, see this article (and make sure to read the comments, which are informative as well): http://blog.kissmetrics.com/content-scrapers/
As far as actively fighting this goes, many say it is not worth it, as long as you take precautionary steps to ensure you always outrank the scrapers, with methods such as the RSS footer on each article you publish.
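If you want to automate part of the manual checking, here is a rough sketch, runnable as a plain PHP command-line script, that fetches a list of suspect URLs and reports whether a distinctive sentence from your article appears on each page. The URLs and the sample sentence are placeholders; this only catches verbatim copies, not spun content.

```php
<?php
// Spot-check suspected scraper pages for a verbatim copy of a
// distinctive sentence from your original article.
$originalSnippet = 'a distinctive sentence copied verbatim from your article';
$suspectUrls = array(
    'http://example-scraper-one.com/some-post/',
    'http://example-scraper-two.com/another-post/',
);

foreach ( $suspectUrls as $url ) {
    $html = @file_get_contents( $url ); // suppress warnings on fetch failure
    if ( false === $html ) {
        echo "$url: fetch failed\n";
        continue;
    }
    // Case-insensitive substring search; a hit suggests a scraped copy.
    if ( false !== stripos( $html, $originalSnippet ) ) {
        echo "$url: snippet found -- likely a scraped copy\n";
    } else {
        echo "$url: snippet not found\n";
    }
}
```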
Hope this helps!