Duplicate Terms of Use and Privacy Policy, is it a problem?
-
Hi,
If I use the same terms of use and privacy policy content across my websites, does that amount to a duplicate content issue? Does it affect my websites in any way?
Regards
-
Duplicate content is one of many hundreds of factors. If you have a very well crafted site, highly optimized, and with a very strong inbound link profile, but only a couple pages (ones that are not highly relevant to your primary topical focus) are duplicate, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors. If any single factor has a flaw, but it's not a flaw that applies to the whole site, that single factor is going to have minimal impact on the overall site.
-
You can do almost anything you wish on a "noindex" tagged page. You are telling the search engine bot to exclude the page from the search index, so the page should not affect your ranking.
The reason your site's number of pages is a factor is that your site is evaluated as a whole. If you have a basic site with 10 pages and 1 of them has duplicate content, then 10% of your site is affected, and this can impact how the search engine views your site. If your site hosted a forum with 10,000 pages, that 1 page would represent only 0.01% of your site, so the impact would have no real effect.
-
Thanks for the helpful reply Alan! Can you please explain this - "If it's only a few pages, sure, duplicate content there could have an impact". How do duplicate content issues vary between small and large sites? I was under the impression that the number of pages has no influence on duplicate content.
Is it okay to use the same privacy policy and terms of use across different websites as long as I noindex,follow them?
-
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily made up of the keyword phrases either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" aspect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, instead of blocking the pages completely in the robots.txt file).
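If you go the meta tag route, the tag belongs in the head of each page you want excluded. A minimal sketch (the page title and surrounding markup here are placeholders, not from the original thread):

```html
<head>
  <title>Privacy Policy</title>
  <!-- noindex: keep this page out of the search index;
       follow: still let crawlers follow the links on it -->
  <meta name="robots" content="noindex,follow">
</head>
```

This keeps the duplicate page out of the index while still letting link equity flow through any links on it, which is why many prefer it over a robots.txt block.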
-
Thanks for the reply Ryan! But if I don't block it via the robots.txt file or a noindex tag, will it affect my site negatively? I mean the overall site.
-
I would recommend blocking pages such as the privacy policy, terms of use, legal pages, etc. It is unlikely these pages would ever bring traffic to your site, and even if they did, it would not be the quality traffic you desire.
In robots.txt you can add:
User-agent: *
Disallow: /pages/privacy/
Substitute your own path for /pages/privacy/. (Note the Disallow rule must sit under a User-agent line to form a valid group.)