Duplicating an article I wrote on an external blog
-
Hi, I wrote a blog article on another site. I would like to add the article to my site as well and would like to know the best way to do it.
If I duplicate the article that I wrote would I then risk getting a penalty for duplicate content?
If so, then what is the best way for me to include the article on my site for the benefit of my readers, but not lead to the duplicate content problem?
Would it be better to use a canonical tag? Or to noindex the page?
If I use the canonical tag, am I helping to make the article on the external blog stronger? Whereas if I use the noindex tag, I am not helping my site or that article, is that right?
Last question: if I offer the copy of the article on my site and use the canonical or noindex tag, then my site does not receive any direct SEO benefit from the article. In other words, the article won't appear in the search index with a link to my site. What about the comments that people write on the article on my site? That is unique content which may contain great questions or points, and I want to ensure those can be indexed properly. If I noindex the page, I lose out. If I canonicalize (is that a word?) the page, then I don't know if search engines will send results based on those comments to the external blog, where that information (the comments from my site) does not exist.
Thank you for any help in better understanding this part of SEO.
-
I like the rel author option in this case. It doesn't really take care of the indexing issues, but I lean toward not worrying about it. In some cases, I let Google figure out which one they want to index given the two. They will probably choose the original posting, but if you get comments and discussion, I can see that bubbling to the top. It's more like news sites or aggregators at that point.
-
Ah, yes, if you use rel canonical on your blog then the whole page will not be indexed. I'm wondering if perhaps rel author is what you need here? But I haven't quite figured out enough about that yet!
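If it helps, the authorship markup itself is simple; I believe it looks roughly like this (the profile URL is just a placeholder, not a real profile):

```html
<!-- Hypothetical example: tying the article to an author profile via rel="author" -->
<!-- Either as a visible byline link in the article body: -->
<a href="https://plus.google.com/000000000000000000000" rel="author">About the Author</a>

<!-- Or as a link element in the page head: -->
<link rel="author" href="https://plus.google.com/000000000000000000000" />
```

Either form should let the search engine associate both copies of the article with the same author.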
-
Thank you all for the replies.
@Dunamis, my concern is that if I use the canonical tag for the article, how would a search engine understand that the canonical represents the article and not the comments? There can be great discussion within the comments. If the search engine canonicalizes the whole page and sends users to the target URL, then it will send traffic to that site for comments which do not exist on that site. Or if it discounts my page altogether, then the page won't get indexed even though there are some good comments and discussion which otherwise should be indexed.
@Ryan, thank you for reminding me about having a link in the article. That is something I had otherwise forgotten about but will do in the future.
@Theo, if I had an article on my site which is canonicalized to a URL on another site, and then someone links to the page on my site, do I get credit for the link? I would think the link credit passes through to the canonicalized URL, would it not?
-
This sounds like exactly the situation that the canonical tag was created for. Make the tag point to the article that you want indexed.
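For what it's worth, a cross-domain canonical is just a link element in the head of the duplicated page, something like this (the URLs are placeholders for your copy and the original article):

```html
<!-- On the copy of the article hosted on your own site -->
<head>
  <!-- Points search engines at the original article on the external blog -->
  <link rel="canonical" href="http://external-blog.example.com/original-article" />
</head>
```

Search engines should then consolidate indexing signals onto the original URL.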
Or, another option if you want both to be indexed, is to create a second version of the article with different wording.
-
If it's just for your users, and it's helpful, go ahead and just post the article. It's technically duplicate content, but Google has already determined that the article site had the original content up first, and yours may or may not ever get indexed. But you shouldn't care.
If you are looking at the SEO implications, that's a whole different question. I hope you have a link in that article back to your site, since you published it. If so, you would actually benefit more from the link value from the other site. If you boost the value of that blog article on the other site, and it has a link to you, that will hold more SEO value down the road than trying to figure out how to get around the duplicate content issue here.
-
"If I duplicate the article that I wrote would I then risk getting a penalty for duplicate content?"
Not likely a penalty, but no benefits either (unless people start linking to the version on your site, of course).
"If so, then what is the best way for me to include the article on my site for the benefit of my readers, but not lead to the duplicate content problem?
Would it be better to use a canonical tag? Or to noindex the page?"
Both would work, I think, though canonical would be the neater option (assuming it isn't harming you to help the other website).
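If you go the noindex route instead, it's just a robots meta tag in the head of your copy of the article, roughly:

```html
<head>
  <!-- "noindex" keeps this page out of the index; "follow" still lets
       crawlers follow the links on the page -->
  <meta name="robots" content="noindex, follow" />
</head>
```

Note that with noindex the page (comments included) never appears in search results at all, which matters for your comments question below.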
"If I use the canonical tag, am I helping to make the article on the external blog stronger? Whereas if I use the noindex tag, I am not helping my site or that article, is that right?"
Right and right
"What about the comments that people write on the article on my site?"
I think (this is the toughest one) that with the cross-domain canonical solution you will still get the visitors who search for phrases in your comments, since Google can't send those visitors to the other site when it doesn't contain those particular phrases. With the noindex solution, nobody gets those visitors.