How does Google treat internal links with rel="nofollow"?
-
Today I was reading about nofollow on Wikipedia. The following statement is over my head, and I'm not able to understand it properly.
"Google states that their engine takes "nofollow" literally and does not "follow" the link at all. However, experiments conducted by SEOs show conflicting results. These studies reveal that Google does follow the link, but does not index the linked-to page, unless it was in Google's index already for other reasons (such as other, non-nofollow links that point to the page)."
That passage is about indexing and ranking for the anchor text of external links, and I understand that part: the linked-to page may not appear in relevant results for a keyword on Google web search.
But what about internal links? I have set the rel="nofollow" attribute on a great many internal links.
I found an archived blog post by Randfish on the same subject, and read the following question there.
Q. Does Google recommend the use of nofollow internally as a positive method for controlling the flow of internal link love? [In 2007]
A: Yes – webmasters can feel free to use nofollow internally to help tell Googlebot which pages they want to receive link juice from other pages
(Matt's precise words were: "The nofollow attribute is just a mechanism that gives webmasters the ability to modify PageRank flow at link-level granularity. Plenty of other mechanisms would also work (e.g. a link through a page that is robots.txt'ed out), but nofollow on individual links is simpler for some folks to use. There's no stigma to using nofollow, even on your own internal links; for Google, nofollow'ed links are dropped out of our link graph; we don't even use such links for discovery. By the way, the nofollow meta tag does that same thing, but at a page level.")

Matt also gave an excellent answer to the following question. [In 2011]
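To make Matt's distinction concrete, here is a minimal sketch of the two levels he describes (the /login path is just an illustrative example):

```html
<!-- Link-level: rel="nofollow" drops only this one link
     from Google's link graph -->
<a href="/login" rel="nofollow">Log in</a>

<!-- Page-level: placed in the <head>, this meta tag applies
     nofollow to every link on the page at once -->
<meta name="robots" content="nofollow">
```

The robots.txt alternative Matt mentions would achieve a similar effect by blocking crawling of the target page itself rather than annotating individual links.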
Q: Should internal links use rel="nofollow"?
A: Matt said:
"I don't know how to make it more concrete than that."
I use nofollow for each internal link that points to an internal page that has the meta name="robots" content="noindex" tag. Why should I waste Googlebot's resources, and those of my server, if in the end the target must not be indexed? As far as I can tell, and I've done this for years, it does not cause any problems at all.
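As a sketch, the combination described above looks like this (the URL is illustrative):

```html
<!-- The internal link is nofollowed... -->
<a href="/internal-search" rel="nofollow">Internal search</a>

<!-- ...because the target page (/internal-search) already
     carries a page-level noindex in its <head> -->
<meta name="robots" content="noindex">
```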
For internal page anchors (links with a hash mark in front, like "#top"), the answer is "no", of course.
I am still using nofollow attributes on my website.
So, what is the current best practice? Is it still necessary to use the nofollow attribute on internal links?
-
"Even if you don’t want a page to rank..."
Is PageRank a ranking factor? I don't think so... I'm not disagreeing with you, but in my category there are many websites performing well with a low PageRank, while a high-PageRank website still sits at the bottom.
Do you have any idea why that is?
-
First, I want to make sure you understand that nofollow still leaks link juice; it just does not pass it to the linked page.
There was a time when you could stop leaking link juice by using nofollow, but not any more.
So using nofollow is generally not a good idea. If you do, you are wasting link juice. Even if you don’t want a page to rank, you are better off letting the juice flow and putting a link on the linked-to page pointing back to your home page or any other page you want to rank.
As for nofollow and the fact that Google still follows: Google doesn’t follow through that particular link, but it may get to the page from another link, or it may already have the URL in its index.
You can also put a nofollow meta tag or a noindex meta tag in a page. A nofollow meta tag will allow search engines to crawl the page, but will not pass link juice to any pages linked from that page; as I stated, you will not keep that link juice, it will just evaporate. A noindex meta tag will stop search engines from indexing that page.
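Side by side, the two meta tags discussed in this answer look like this (both go in the page's head):

```html
<!-- nofollow: the page can be crawled and indexed, but none
     of its outgoing links pass link juice -->
<meta name="robots" content="nofollow">

<!-- noindex: search engines may crawl the page but will
     keep it out of the index -->
<meta name="robots" content="noindex">

<!-- the two directives can also be combined -->
<meta name="robots" content="noindex, nofollow">
```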