Duplicate Terms of Use and Privacy Policy: is it a problem?
-
Hi,
If I use the same terms of use and privacy policy content across my websites, does that amount to a duplicate content issue? Does it affect my websites in any manner?
Regards
-
Duplicate content is one of many hundreds of ranking factors. If you have a very well-crafted, highly optimized site with a very strong inbound link profile, and only a couple of pages (ones that are not highly relevant to your primary topical focus) are duplicates, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors. If any single factor has a flaw, but it's not a flaw that applies to the whole site, that single factor is going to have minimal impact on the overall site.
-
You can do almost anything you wish on a page tagged "noindex". You are telling the search engine bot to exclude the page from the search index, so the page should not affect your rankings.
The reason your site's number of pages is a factor is that your site is evaluated as a whole. If you have a basic site with 10 pages and one of them has duplicate content, then 10% of your site is affected, and that can influence how the search engine views your site. If your site hosted a forum with 10,000 pages, that one page would represent 0.01% of your site, so the impact would have no real effect.
-
Thanks for the helpful reply, Alan! Can you please explain this: "If it's only a few pages, sure, duplicate content there could have an impact." How do duplicate content issues vary between small and big sites? I was under the impression that the number of pages has no influence on duplicate content.
Is it okay to use the same privacy policy and terms of use across different websites as long as I noindex,follow them?
-
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily made up of the keyword phrases either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" aspect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, instead of blocking completely in the robots.txt file). See the snippet below.
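For reference, a minimal sketch of that tag as it would sit in the head section of your privacy policy and terms pages (only the attribute values matter; adapt it to your own templates):

<meta name="robots" content="noindex,follow">

This tells bots not to add the page to the index but still to follow the links on it, so any internal link value the page carries keeps flowing through your site.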
-
Thanks for the reply, Ryan! But if I don't block it via the robots.txt file or a noindex tag, will it affect my site negatively? I mean the site overall.
-
I would recommend blocking pages such as the privacy policy, terms of use, legal pages, etc. It is unlikely these pages would ever bring traffic to your site, and even if they did, it would not be the quality traffic you desire.
In robots.txt you can add:

User-agent: *
Disallow: /pages/privacy/

substituting your local path for /pages/privacy/. Note that a Disallow line only takes effect under a User-agent line, so don't omit it.
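As a fuller hypothetical example, assuming your legal pages live under /pages/ (adjust the paths to match your actual URL structure):

User-agent: *
Disallow: /pages/privacy/
Disallow: /pages/terms/

Keep in mind that Disallow prevents crawling, not indexing: a URL blocked this way can still appear in search results if other pages link to it, which is another reason the noindex meta tag mentioned above is often the safer choice.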