Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Prevent link juice from flowing to low-value pages
-
Hello there!
Most websites link to low-value pages from their main navigation (header or footer), so those pages are reachable from every other page.
I'm thinking especially of "Conditions of Use" or "Privacy Notice" pages, which have no SEO value.
What I would like is to prevent link juice from flowing into those pages, while still keeping the links for visitors. What is the best way to achieve this?
- Put a rel="nofollow" attribute on those links?
- Put a "robots" meta tag containing "noindex,nofollow" on those pages?
- Put a "Disallow" for those pages in a "robots.txt" file?
- Use JavaScript links that crawlers won't be able to follow?
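For reference, the first three options might look like this (the URLs and paths are hypothetical, just for illustration):

```html
<!-- Option 1: nofollow attribute on the individual link -->
<a href="/privacy-notice" rel="nofollow">Privacy Notice</a>

<!-- Option 2: robots meta tag in the <head> of the low-value page -->
<meta name="robots" content="noindex,nofollow">

<!-- Option 3: robots.txt directive (a plain-text file at the site root,
     shown here inside a comment for convenience)
User-agent: *
Disallow: /privacy-notice
-->
```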
-
Mmh, good point. I'd never heard that a privacy policy page could be a trust signal. Is there an article somewhere that talks about this?
Well, I took those two pages as examples... but my question was about preventing link juice from flowing to non-SEO pages in general.
Thanks a lot for your answers!
-
Exactly, and what I also try to explain to people is that a privacy-policy-type page is an additional signal for Google when it tries to understand what type of site you are and how trustworthy it is. Why in the world would you noindex something like that?
-
As I understand it, nofollow still dilutes your link juice even though it does not pass PageRank (theoretically).
Google made this announcement in 2009 to combat PageRank sculpting. Here is a post from Rand about it.
Unless something has changed that I am not aware of, you could put the link in an iframe and Google will not see it, nor will it dilute the PageRank you pass out.
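If you went the iframe route, it might look something like this (the file name is hypothetical); the parent page then contains no crawlable anchor tags to the low-value pages, since the links live in a separate document:

```html
<!-- Parent page: footer links are loaded from a separate document -->
<iframe src="/footer-links.html" title="Footer links" width="100%" height="60"></iframe>
```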
-
Great suggestions. I've recently combined some pages (login/register, about/contact/ToS/privacy, and a few others) and have been very happy with the results. I removed 8 links from every page.
I am also thinking about removing some more links from my product pages, to try and keep the most juice on those pages. Those pages don't need the same navigation as the homepage.
-
It depends on what your purpose is.
If you want them totally blocked from being indexed, then putting the page in the robots.txt file or using a robots meta tag would work fine.
If you just want to de-emphasize the page to the search engines, you can use nofollow or JavaScript links on footer/header links.
One thing that we have done is to combine some of these pages (terms and privacy) into one page to cut down on the number of total links on each page.
You could also not include the privacy page link on every page (depending on your site), but instead link it only from the pages that collect sensitive data (near the form).
I hope this helps. The main thing to remember is that each site is different, so you will have to adjust your tactics depending on precisely what you are trying to accomplish.
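For the robots.txt option, Python's standard-library `urllib.robotparser` is a quick way to sanity-check how a compliant crawler would interpret a Disallow rule (the rules and URLs below are hypothetical, just for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content blocking two low-value pages
rules = """User-agent: *
Disallow: /privacy-notice
Disallow: /conditions-of-use""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Blocked paths may not be fetched; everything else still may
print(parser.can_fetch("*", "https://example.com/privacy-notice"))   # False
print(parser.can_fetch("*", "https://example.com/products/widget"))  # True
```

Bear in mind that Disallow only blocks crawling; a blocked page can still end up indexed via external links, which is why the robots meta tag is the more reliable choice if keeping the page out of the index is the goal.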