Duplicate Terms of Use and Privacy Policy, is it a problem?
-
Hi,
If I use the same terms of use and privacy policy content across my websites, does it amount to a duplicate content issue? Does it affect my websites in any way?
Regards
-
Duplicate content is one of many hundreds of factors. If you have a very well crafted site, highly optimized, and with a very strong inbound link profile, but only a couple pages (ones that are not highly relevant to your primary topical focus) are duplicate, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors. If any single factor has a flaw, but the flaw doesn't apply to the whole site, that factor will have minimal impact on the overall site.
-
You can do almost anything you wish on a "noindex" tagged page. You are telling the search engine bot to exclude the page from the search index, so the page should not affect your ranking.
The reason your site's number of pages is a factor is that your site is evaluated as a whole. If you have a basic site with 10 pages and 1 of them has duplicate content, then 10% of your site is affected, and that can change how the search engine views your site. If your site hosted a forum with 10,000 pages, that 1 page would represent just 0.01% of your site, so the impact would have no real effect.
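The proportion argument above can be sketched with a quick calculation (the page counts are illustrative, not from any real site):

```python
def duplicate_share(duplicate_pages: int, total_pages: int) -> float:
    """Return the fraction of a site affected by duplicate content."""
    return duplicate_pages / total_pages

# A 10-page brochure site with 1 duplicate page: 10% of the site affected.
print(f"{duplicate_share(1, 10):.0%}")      # 10%

# A forum with 10,000 pages and that same 1 duplicate page: 0.01%.
print(f"{duplicate_share(1, 10_000):.2%}")  # 0.01%
```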
-
Thanks for the helpful reply, Alan! Can you please explain this: "If it's only a few pages, sure, duplicate content there could have an impact." How do duplicate content issues vary between small and large sites? I was under the impression that the number of pages has no influence on duplicate content.
Is it okay to use the same privacy policy and terms of use across different websites as long as I noindex,follow them?
-
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily made up of the keyword phrases either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" effect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, rather than blocking the pages completely in the robots.txt file).
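As a quick sketch, the robots meta tag described above goes in the `<head>` of each page you want excluded (this is the standard syntax; nothing here is site-specific):

```html
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex,follow">
</head>
```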
-
Thanks for the reply, Ryan! But if I don't block them via the robots.txt file or a noindex tag, will it affect my site negatively? I mean the overall site.
-
I would recommend blocking pages such as privacy policy, terms of use, legal, etc. It is unlikely these pages would ever bring traffic to your site. Even if they did, it is not going to be the quality traffic you desire.
In robots.txt, under the relevant User-agent line, you can add:
Disallow: /pages/privacy/
Substitute your own path for /pages/privacy/.
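Putting that together, a minimal robots.txt along those lines might look like this (the paths are placeholders; use your site's actual paths):

```text
User-agent: *
Disallow: /pages/privacy/
Disallow: /pages/terms/
```

Note that Disallow rules only take effect under a User-agent line, and they block crawling rather than indexing, which is why the noindex meta tag approach is often preferred.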