Understanding No Follow
-
We manage a couple of sites with 100s of pages...
Most of the sites have content that is not helpful as landing pages but obviously has relevant content related to our desired search terms.
Some of the links go off-site to another domain.
I am trying to understand the issue of "link juice" and whether I gain it or lose it by putting the "nofollow" designation on some of the page links.
Specifically, do I increase the value of my pages if I put nofollow tags on lower-tier links off of these pages?
Here is a page in question - http://www.vahmarketing.com/product/ductless-hoods
Is there a best practice or SEO rule for using "no follow"?
Thanks,
Bob Nance
-
Hi Bob,
I think you are asking about PageRank Sculpting. There was talk a while ago about this no longer being practical and that the juice was lost due to changes in the algorithm. I think SEOmoz ran some tests on this that were inconclusive... so generally I think the opinion is still that PageRank Sculpting is dead.
Maybe look up some stuff on faceted navigation to deal with the issues you may be having with the site.
Hope this helps.
G
-
Hello Bob,
The nofollow attribute keeps your site from passing link juice through the link. We have this option for situations where we either don't want the link juice to flow to the other site, or we don't know who the other site is (e.g., comment links). This does not mean that the link will not be followed by the search engine.
If it is a shady site, I would use a nofollow, but I would also question why I am linking to it. It does not hurt your site to have followed links unless they are questionable. I have seen many sites with decent PR that have link pages with hundreds of links on them.
In your example I don't see where you are linking to other domains. Are they your other domains? Are they companies that don't need anything? I have not had an instance where linking to a good company has hurt anything.
Is there a best practice?
If it's a good link, let the juice flow, if not, question why you are linking to it.
Linking to a site that is considered an authority or brand in the niche will align your site with that site and could help you in the long run. Birds of a feather.
There are other ways to link to sites without using a standard a href tag, such as by incorporating JavaScript, but in a situation like yours I would not even think about it.
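To make the mechanics concrete: whether a link passes juice comes down to whether the anchor tag carries rel="nofollow". Below is a minimal audit sketch using only Python's standard-library html.parser; the page fragment and URLs are made up for illustration.

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect the href of every <a> tag, split into followed vs. nofollowed."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attr_map = dict(attrs)
        href = attr_map.get("href")
        if not href:
            return
        # rel can hold multiple space-separated values, e.g. "nofollow noopener"
        rel_values = (attr_map.get("rel") or "").lower().split()
        if "nofollow" in rel_values:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

# Hypothetical page fragment -- the URLs are invented for this example.
page = (
    '<a href="https://example.com/trusted-resource">A trusted resource</a>'
    '<a href="https://example.com/untrusted" rel="nofollow">A comment link</a>'
)
audit = LinkAudit()
audit.feed(page)
```

Running an audit like this over your templates is a quick way to see exactly which outbound links are currently followed before deciding whether any of them need a nofollow.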
Related Questions
-
Internal no follow links
I have just discovered that the WordPress theme I have been using for some time has nofollow internal links on the blog. Simply put, each post has an image and text link plus a 'read more'. The 'read more' is a nofollow link, which is also on my homepage. The developer is saying duplicate followed links are worse than an internal nofollow. What is your opinion on this? Should I spend time removing the nofollow?
Technical SEO | Libra_Photographic
-
Will adding a mini directory to our blog with lots of outgoing no follow links harm our authority and context
We are an adventure travel tour company who run hiking, kayaking, and biking adventures in several countries. We have a travel tour operator website with a blog in a subfolder of the site. We want to add a section/category in the blog itself with a hiking club mini directory that lists all hiking clubs in 1 or 2 specific countries. The reason we want to do this is that the people searching online for these clubs are our target market and potential clients. We hope to rank for some of these searches, and encourage interest in our blog/website in the process. We also want the potential to build relationships with these clubs. The question I want to ask is: if we add say 100 to 200 listings, and make all outgoing links nofollow, will this harm our page authority or reputation with search engines, or pose any other risk for our site? The other question is: do you think that this will dilute the context of our content, as it's slightly different in context to the rest of our site content? Are we better off setting up a separate site for this purpose?
Technical SEO | activenz
-
Can Anybody Understand This?
Hey guyz,
Technical SEO | atakala
These days I'm reading the paper from Sergey Brin and Larry Page, which is the first paper about Google.
I don't get the ranking part, which is: "Google maintains much more information about web documents than typical search engines. Every hitlist includes position, font, and capitalization information. Additionally, we factor in hits from anchor text and the PageRank of the document. Combining all of this information into a rank is difficult. We designed our ranking function so that no particular factor can have too much influence. First, consider the simplest case -- a single word query. In order to rank a document with a single word query, Google looks at that document's hit list for that word. Google considers each hit to be one of several different types (title, anchor, URL, plain text large font, plain text small font, ...), each of which has its own type-weight. The type-weights make up a vector indexed by type. Google counts the number of hits of each type in the hit list. Then every count is converted into a count-weight. Count-weights increase linearly with counts at first but quickly taper off so that more than a certain count will not help. We take the dot product of the vector of count-weights with the vector of type-weights to compute an IR score for the document. Finally, the IR score is combined with PageRank to give a final rank to the document. For a multi-word search, the situation is more complicated. Now multiple hit lists must be scanned through at once so that hits occurring close together in a document are weighted higher than hits occurring far apart. The hits from the multiple hit lists are matched up so that nearby hits are matched together. For every matched set of hits, a proximity is computed. The proximity is based on how far apart the hits are in the document (or anchor) but is classified into 10 different value "bins" ranging from a phrase match to "not even close". Counts are computed not only for every type of hit but for every type and proximity. Every type and proximity pair has a type-prox-weight.
The counts are converted into count-weights and we take the dot product of the count-weights and the type-prox-weights to compute an IR score. All of these numbers and matrices can all be displayed with the search results using a special debug mode. These displays have been very helpful in developing the ranking system."
-
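The single-word case the paper describes (count-weights that taper off, a dot product with type-weights, then combining with PageRank) can be sketched in a few lines of Python. Note that the weight values, the taper function, and the way IR score and PageRank are combined here are all my own stand-ins; the paper never published any of those details.

```python
import math

# Hypothetical type-weights -- the paper lists the hit types but not the values.
TYPE_WEIGHTS = {
    "title": 10.0,
    "anchor": 8.0,
    "url": 6.0,
    "plain_large": 3.0,
    "plain_small": 1.0,
}

def count_weight(count, cap=8.0):
    # Roughly linear for small counts, tapering off so that piling on
    # more hits of one type stops helping (one way to model the taper).
    return cap * (1.0 - math.exp(-count / cap))

def ir_score(hit_counts):
    # Dot product of the count-weight vector with the type-weight vector.
    return sum(TYPE_WEIGHTS[t] * count_weight(c) for t, c in hit_counts.items())

def final_rank(hit_counts, pagerank, mix=0.5):
    # The paper says the IR score "is combined with PageRank" but not how;
    # a weighted sum is one plausible stand-in.
    return mix * ir_score(hit_counts) + (1.0 - mix) * pagerank

# A document with one title hit, three anchor hits, and 40 small-font hits.
doc = {"title": 1, "anchor": 3, "plain_small": 40}
score = ir_score(doc)
```

The tapering is the key idea: the 40 small-font hits contribute barely more than 8 would, so keyword stuffing one hit type quickly stops helping.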
Noindex, follow duplicate pages
I have a series of websites that all feature a library of the same content. These pages don't make up the majority of the sites content, maybe 10-15% of the total pages. Most of our clients won't take the time to rewrite the content, but it's valuable to their site. So I decided to noindex, follow all of the pages. Outside of convincing them all to write their own versions of the content, is this the best method? I could also block the pages with robots.txt, but then I couldn't pass any link juice through the pages. Any thoughts?
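For readers comparing the two options mentioned here: noindex, follow is a meta robots directive placed in each page's head, while a robots.txt Disallow stops the page from being crawled at all, so none of its links can pass anything. A minimal example of the meta tag:

```html
<!-- "Don't index this page, but do follow its links and pass value through them." -->
<meta name="robots" content="noindex, follow">
```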
Technical SEO | vforvinnie
-
If multiple links on a page point to the same URL, and one of them is no-followed, does that impact the one that isn't?
Page A has two links on it that both point to Page B. Link 1 isn't no-follow, but Link 2 is. Will Page A pass any juice to Page B?
Technical SEO | Jay.Neely
-
Can I reduce link count by no following links?
Hi, A large number of my pages contain over 100 links. This is due to a large drop-down navigation menu which is on every page. To reduce my link count, could I just nofollow these navigation links, or would I have to remove the navigation completely?
Technical SEO | moesian
-
Please help to identify the following bots and spiders
Hello all, I would appreciate any help in identifying the following bots: Vagabondo/4.0 TwengaBot-2.0 FatBot 2.0 Googlebot/2.1 bingbot/2.0 Baiduspider/2.0 Yahoo! Slurp SeznamBot/3.0 ShopWiki/1.0 MJ12bot/v1.4.0 YandexBot/3.0 Sosospider+ Ezooms/1.0 Gigabot/3.0 Thanks Shehzad
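Most of these can be identified from well-known tokens in their user-agent strings. Here is a minimal lookup sketch in Python; the descriptions are my best understanding of each crawler, not an authoritative registry, so verify any bot you plan to block.

```python
# Tokens taken from the list above, mapped to what each crawler appears to be.
KNOWN_BOTS = {
    "googlebot": "Google's web crawler",
    "bingbot": "Microsoft Bing's crawler",
    "yahoo! slurp": "Yahoo's crawler",
    "baiduspider": "Baidu's crawler (Chinese search engine)",
    "yandexbot": "Yandex's crawler (Russian search engine)",
    "seznambot": "Seznam.cz's crawler (Czech search engine)",
    "sosospider": "Soso's crawler (Chinese search engine)",
    "mj12bot": "Majestic-12 distributed link-index crawler",
    "twengabot": "Twenga shopping-comparison crawler",
    "shopwiki": "ShopWiki shopping-comparison crawler",
}

def identify_bot(user_agent):
    """Return a description for the first known token found, else None."""
    ua = user_agent.lower()
    for token, description in KNOWN_BOTS.items():
        if token in ua:
            return description
    return None
```

A substring match like this is how most log analyzers classify bots, since crawlers put their name and version (e.g. Googlebot/2.1) somewhere in the user-agent string.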
Technical SEO | Gareth_Cartman
-
Follow up from http://www.seomoz.org/qa/discuss/52837/google-analytics
Ben, I have a follow-up question from our previous discussion at http://www.seomoz.org/qa/discuss/52837/google-analytics. To summarize, to implement what we need, we need to do three things:
1) Add GA code to the Darden page:
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.darden.virginia.edu']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);
2) Change the links on the Darden page to look like http://www.darden.virginia.edu/web/MBA-for-Executives/, and change
<a href="https://darden-admissions.symplicity.com/applicant">Apply Now</a>
into
<a href="https://darden-admissions.symplicity.com/applicant" onclick="_gaq.push(['_link', 'https://darden-admissions.symplicity.com/applicant']); return false;">Apply Now</a>
3) Have Symplicity add this code:
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.symplicity.com']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);
Our CMS system does not allow the user to add onClick to the link, so we CANNOT implement part 2). What will be the result if we have only 1) and 3) implemented? Will the data still be fed to GA account 'UA-12345-1'? If not, how can we get cross-domain tracking if we cannot change the link code? Nick
Technical SEO | Darden