Prevent link juice from flowing to low-value pages
-
Hello there!
Most websites have links to low-value pages in their main navigation (header or footer), which makes those pages reachable from every other page.
I'm thinking especially of "Conditions of Use" or "Privacy Notice" pages, which have no SEO value.
What I would like is to prevent link juice from flowing into those pages while still keeping the links for visitors. What is the best way to achieve this?
- Put a rel="nofollow" attribute on those links?
- Put a "robots" meta tag containing "noindex,nofollow" on those pages?
- Put a "Disallow" for those pages in a "robots.txt" file?
- Use JavaScript-based links that crawlers won't be able to follow?
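For reference, here is roughly what each of those options looks like in markup (the /privacy-notice path is just a placeholder, not a real URL on my site):

```html
<!-- Option 1: nofollow attribute on the navigation link itself -->
<a href="/privacy-notice" rel="nofollow">Privacy Notice</a>

<!-- Option 2: robots meta tag in the <head> of the low-value page -->
<meta name="robots" content="noindex,nofollow">

<!-- Option 3: block the page in robots.txt (a plain text file, shown here as a comment):
     User-agent: *
     Disallow: /privacy-notice
-->

<!-- Option 4: a JavaScript "link" with no crawlable href -->
<span onclick="window.location.href='/privacy-notice'">Privacy Notice</span>
```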
-
Hmm, good point. I had never heard that a privacy policy page could be a trust signal. Is there an article somewhere that talks about this?
Well, I took those two pages as an example... but my question was about preventing link juice from flowing to non-SEO pages in general.
Thanks a lot for your answers!
-
Exactly, and what I also try to explain to people is that a privacy policy type of page is an additional signal for Google when it tries to understand what type of site you are and how trustworthy it is. Why in the world would you noindex something like that?
-
As I understand it, nofollow still dilutes your link juice even though it does not pass PageRank (theoretically).
Google announced this change in 2009 to combat PageRank sculpting. Here is a post from Rand about it.
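To make the dilution point concrete, here is a simplified illustration (my own back-of-the-envelope model, not Google's actual formula, which isn't public): if a page has 10 outbound links and 2 of them are nofollowed, each followed link still only receives a tenth of the page's PageRank; the two nofollowed shares simply evaporate instead of being redistributed to the other eight.

```latex
% Illustrative simplification of the post-2009 behaviour, not Google's published formula.
% A page with PageRank PR and 10 outbound links, 2 of them nofollowed:
\text{PR passed per followed link} = \frac{PR}{10}
\quad\text{rather than}\quad \frac{PR}{8} \text{ (what PageRank sculpting assumed)}
```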
Unless something has changed that I am not aware of, you could put the links in an iframe and Google will not see them, nor will they dilute the PageRank you pass out.
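A minimal sketch of that iframe approach, assuming the footer links live in their own little document (footer-links.html is a made-up filename):

```html
<!-- Main page footer: the low-value links are pulled in via an iframe. -->
<iframe src="/footer-links.html" title="Footer links"></iframe>

<!-- Hypothetical contents of /footer-links.html: -->
<a href="/terms">Conditions of Use</a>
<a href="/privacy-notice">Privacy Notice</a>
```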
-
Great suggestions. I've recently combined some pages (login/register, about/contact/ToS/privacy, and a few others) and have been very happy with the results. I removed 8 links from every page.
I am also thinking about removing some more links from my product pages, to try and keep the most juice on those pages. Those pages don't need the same navigation as the homepage.
-
It depends on what your purpose is.
If you want them totally blocked from being indexed, then putting the pages in the robots.txt file or using a robots meta tag would work fine.
If you just want to de-emphasize the pages to the search engines, you can use nofollow or JavaScript links for the footer/header links.
One thing we have done is combine some of these pages (terms and privacy) into one page to cut down on the total number of links on each page.
You could also choose not to include the privacy page link on every page (depending on your site) and just link to it from the pages that collect sensitive data (near the form).
I hope this helps. The main thing to remember is that each site is different, so you will have to adjust your tactics depending on precisely what you are trying to accomplish.