Implications of posting duplicate blog content on external domains?
-
I've had a few questions around the blog content on our site. Some of our vendors and partners have expressed interest in posting some of that content on their domains. What are the implications if we were to post copies of our blog posts on other domains? Should this be avoided, or are there circumstances in which this type of program would make sense?
-
**I'd like to think not ALL duplicate content is bad.**
I agree. However, more often than not, it is bad. It's best to approach each situation with caution.
**As long as your site has been crawled and that page is already in the SERPs, and the new website puts a link to the original article on your website, I don't think there should be any issue.**
I don't feel that confident in this statement. In theory, yes, the original article would get the attribution. However, in practice, that may not be the case. If the site posting the duplicate content has higher authority and traffic, it may get the attribution even though you published the content first. Google does a decent job of figuring this out, but it's an area they still need to improve drastically.
**I personally would re-work the content into a new article or ask them to summarize your article, to be on the extra safe side. But that's just me.**
Summaries are a good idea. Still, you can't beat new, unique content. Use the article for inspiration and create a new piece that takes a point the original article made even further. Definitely include a link back to the original article, if possible. This strategy would give you a better gain than simply reposting or summarizing the original article with a link back.
-
I'd like to think not ALL duplicate content is bad.
For example, in your situation the other party is posting content from your site that you already published.
As long as your site has been crawled and that page is already in the SERPs, and the new website puts a link to the original article on your website, I don't think there should be any issue.
If I go around and copy/paste other people's articles onto my blog, are those people going to be penalized? Highly doubtful. If anything, my website will be penalized because it's full of duplicate content I copy/pasted word-for-word.
I personally would re-work the content into a new article or ask them to summarize your article, to be on the extra safe side. But that's just me.
-
- You're putting your website at risk of being penalized by Google (Panda!). Any time duplicate content is associated with your domain, it's bad.
- Google may not understand that you're the original content creator and may give the authority to another domain instead.
- You're increasing competition in the SERPs for your article's target terms.
My biggest concerns are a penalty and the devaluing of your entire site's authority.
-
Thanks for the response, Ray.
What would the negative implications be of re-purposing exact content on other domains? Obviously we decrease the original post's ability to drive organic traffic to our site, but will this have a considerable impact on our page rankings and other SEO metrics?
-
Hi Visier,
A cost-benefit analysis needs to be done for each individual external domain to see whether it's worth allowing them to republish your blog content.
At first, I would say no. Keep your content on your site and request that people not simply duplicate it on their domains. Duplication can hurt both sites' performance in the SERPs and doesn't provide much value back to you.
However, in certain situations it may be a good idea, for example, if the external domain sends you a lot of customers. In those cases, I would suggest the following:
- Talk with each website about the content they would like to republish.
- Suggest using your article as inspiration for a new article on their site; maybe even work with them to create a new, unique piece of content.
- Be sure to have your article mentioned in the new article, preferably with a natural in-content link back to your site/article.
- If they insist on republishing the article, use a cross-domain canonical tag to indicate that your article is the original piece of content and deserves the authority, and to reduce the possibility of a duplicate content penalty (a minimal sketch follows this list).
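As a rough sketch of that last point, assuming a hypothetical URL for your original article, the markup the partner site would add to their republished copy looks like this:

```html
<!-- In the <head> of the partner's republished copy: a cross-domain
     canonical pointing at the original article (hypothetical URL) -->
<link rel="canonical" href="https://www.yourblog.com/original-article/" />

<!-- In the body: a visible attribution link back to the original -->
<p>
  This article was originally published on
  <a href="https://www.yourblog.com/original-article/">yourblog.com</a>.
</p>
```

Keep in mind that Google treats the canonical tag as a strong hint rather than a directive, so attribution to your domain still isn't guaranteed.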
In most situations, I would advise against republishing your content verbatim on another domain.