Does posting a source to the original content avoid duplicate content risk?
-
A site I work with allows registered users to post blog posts (longer articles).
Often, the blog posts have been published earlier on the writer's own blog. Is posting a link to the original source a sufficient preventative solution to possibly getting dinged for duplicate content?
Thanks!
-
I don't know what Roger says, but I believe that followed links on noindex pages will pass PageRank, anchor text, and other link benefits. Your instruction is "noindex", but the page will still be crawled.
-
Hi EGOL.
If you noindex pages and other sites link to them, do you benefit from that or not?
Do you see any pagerank on those, that are old enough to show it?
What does Roger say about those?
-
I publish other people's content. That caused a Panda problem about a year ago - which I was able to recover from by noindexing those pages. Now I noindex / follow any content that I publish that appears on another website.
The articles that I write are published on my own site only.
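For reference, the noindex,follow approach described above is a single robots meta tag in the head of the republished page; a minimal sketch:

```html
<!-- In the <head> of the republished article page. -->
<!-- "noindex" keeps the page out of search results; -->
<!-- "follow" lets crawlers still follow (and pass value through) its links. -->
<meta name="robots" content="noindex, follow">
```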
-
I'm concerned about what's best for my site, and would therefore not post other people's content, so I've never had to deal with this.
I guess if I owned both sites I would prefer to cross-canonical the duped pages to my other site. If I didn't own the other site, I would probably just opt to noindex,follow that page.
-
The last question in the text is......
Can rel="canonical" be used to suggest a canonical URL on a completely different domain?
There are situations where it's not easily possible to set up redirects. This could be the case when you need to migrate to a new domain name using a web server that cannot create server-side redirects. In this case, you can use the rel="canonical" link element to specify the exact URL of the domain preferred for indexing. While the rel="canonical" link element is seen as a hint and not an absolute directive, we do try to follow it where possible.
-
Egol,
The Matt Cutts video seems to say you can't canonicalize between two totally different domains. So, we couldn't use a canonical for that.
-
Canonicalling them will give the benefit to the author's original page. It does not benefit you.
If you want them to rel=canonical for you, then it is good to do it for them.
-
If you want to avoid panda with content on your own site then you can noindex, follow those pages.
Your visitors will be able to use them but they will not appear in the search engines.
-
Hey Egol, what is the benefit of canonicalling to them over just putting meta noindex,follow on the page?
-
So, you're not saying rel canonical to their page?
What if we just no-follow pages on our site that the author originally published on their site? Right now we link to it as originally published on ....
I'm trying to avoid a Panda penalty for non-unique blog posts reposted on our site.
-
I have used rel=canonical to reduce duplicate content risk. More importantly, though, the rel=canonical gives credit to the page where it points.
One problem with guest posting is that to reduce duplicate content risk and transfer credit to your own site, you must have the site owner's cooperation.
Of course, you can get author credit by linking the post to your Google+ profile - if you think that has value.
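At the time, author credit was typically claimed by linking the post's byline to the writer's Google+ profile with a rel=author parameter; a sketch (the profile URL and ID here are placeholders, not a real account):

```html
<!-- In the guest post's byline; the Google+ profile ID is hypothetical. -->
<a href="https://plus.google.com/112345678901234567890?rel=author">About the Author</a>
```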
-
Hi,
Thanks, Egol
So, on a page of ours where someone re-posts their blog post on our site, we'd add a canonical tag on our page to point to their original page? That would be a canonical tag between two different domains. I didn't think that was okay.
And, if we did that, we wouldn't be risking some kind of Panda duplicate content penalty?
Thanks!
-
"Is posting a link to the original source a sufficient preventative solution to possibly getting dinged for duplicate content?"
No. To prevent that, you need to use rel=canonical.
See Matt Cutts video here....
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
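Per that help page, the cross-domain canonical is a link element placed in the head of the duplicate (republished) copy, pointing at the author's original URL; a sketch with hypothetical URLs:

```html
<!-- In the <head> of YOUR copy of the guest post. -->
<!-- The example.com URL stands in for the author's original post. -->
<link rel="canonical" href="http://example.com/original-post/">
```

Remember it is a hint, not a directive: Google says it tries to honor it where possible, but may not in every case.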