Link juice and max number of links clarification
-
I understand roughly that "Link Juice" is passed by dividing PR by the number of links on a page. I also understand the juice available is reduced by some portion on each iteration.
- 50 PR page
- 10 links on page
- 50 / 10 = 5; 5 × .9 = 4.5 PR goes to each link.
Correct?
If so and knowing Google stops counting links somewhere around 100, how would it impact the flow to have over 100 links?
i.e.
- 50 PR page
- 150 links on the page
- 50 / 150 = .33; .33 × .9 ≈ .3 PR to each link, BUT only for 100 of them.
After that, the juice is just lost?
Also, I assume Google, to the best of its ability, orders the links by importance, such that content links are counted before footer links, etc.
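The arithmetic above can be sketched as a tiny model (purely illustrative; the function name, the .9 damping factor, and the hard 100-link cutoff are assumptions of this simplified view, not how Google actually computes anything):

```python
# Naive "link juice" model: PR is split evenly across all outbound links,
# with a damping factor applied on the hop. Illustrative numbers only.

DAMPING = 0.9  # assumed per-hop retention; the classic PageRank papers use 0.85

def juice_per_link(page_pr, total_links, counted_links=None):
    """PR passed to each counted link under the naive even-split model."""
    if counted_links is None:
        counted_links = total_links
    share = page_pr / total_links        # PR is split across ALL links on the page
    passed = share * DAMPING             # damping loss on the hop
    counted = min(counted_links, total_links)
    lost = (total_links - counted) * passed  # juice aimed at uncounted links evaporates
    return passed, lost

per_link, lost = juice_per_link(50, 10)        # 4.5 PR per link, nothing lost
per_link, lost = juice_per_link(50, 150, 100)  # ~0.3 PR per link, ~15 PR lost past link #100
```

Under that assumed model, the juice aimed at links past the cutoff would simply be lost rather than reallocated, which is exactly the question being asked.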
-
As always in the SEO industry, there's no right answer for any particular case, but I think you've got a really structured approach to it. It would be great to know the results of your experiment; this could be a really good article in the SEOmoz community. Let me know how it goes!
-
Agreed, the extreme repetition of the brand keywords and anchor text was one of my first arguments for dropping the section.
I think, from everything I've read so far, there appears to be additional juice loss at some point, but it would be highly dependent on the trust of the page and the nature of the links. Certainly not a strong enough correlation to make part of my case, however.
-
I think that link #102 may have the same value as link #35; I don't think that adding many links diminishes the value of each one. What I assume, however, is that:
- Having many links on one page diminishes the control you have over them, so Google may crawl only some of them and give different weight to each one. That's why I'd rather use fewer links.
- You're right that having more links to your pages increases the chance of those pages ranking better against others. However, as I said before, beware that Google may not crawl all your links all the time. You can achieve the same proportion of importance with fewer links (e.g., 10 links vs. 2 is the same as 100 vs. 20): same weight, more control, and less internal spam risk.
- Be wise when you build your links and try not to use too many anchor-rich links. Even onsite, you don't want to let Google think you're trying to over-optimize your page or its backlink profile. Create variations of your anchors and use them all.
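The "same proportion, fewer links" point in the second bullet can be sketched numerically (illustrative only; the `link_share` function is an assumption of this simplified model, not a real metric):

```python
# In this simplified model, what matters for the relative strength of two
# internal pages is the ratio of links each receives, not the raw count.

def link_share(links_to_page, total_internal_links):
    return links_to_page / total_internal_links

# 100 links to page A vs. 20 to page B...
a_big, b_big = link_share(100, 120), link_share(20, 120)
# ...is the same proportion as 10 vs. 2:
a_small, b_small = link_share(10, 12), link_share(2, 12)

assert abs(a_big - a_small) < 1e-9 and abs(b_big - b_small) < 1e-9
```

Same relative weighting, but with far fewer links to crawl and fewer chances of looking spammy.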
-
The question comes from a circumstance where hundreds of links are contained in a supplemental tab on a product detail page. They link to applications of the product, each being a full product page. On some pages there are only 40 links; on others there can be upwards of 1,000, as the product is used as a replacement part for many other products.
I am championing the removal of the links, if not the whole tab. On a few pages it would be useful to humans, but clearly not on pages with hundreds.
But if Google followed them all, then conceivably it would build a stronger "organic" structure for the catalogue, as important products would get thousands of links and others only a few.
Whatever value this might have, it would be negated if juice leaked faster after 100+ links.
From Matt's article above, "Google might choose not to follow or to index all those links." He also mentions them being a spam signal, so I think it's still wise to keep the count low even if the 100 KB limit has been lifted. Clearly there are still ramifications, a concept reinforced by this site's reports and comments.
To my question... from what both of you have said, there doesn't appear to be strong evidence that a very high number of links directly causes an additional penalty as far as link juice is concerned.
For the record, I'm not calculating PR or stuck on exact counts - my focus always starts with the end user. But, I'd hate to have a structural item that causes undue damage.
-
The context is a parts page where potentially hundreds of links could be associated with the other parts the item fits. I'm looking to firm up my argument against the concept, so I want to better understand the true impact of the section.
If it were accelerating the decay of link juice, all the more reason. If not, the links may actually help certain products appear organically stronger (i.e., a part that fits a greater number of products will have more incoming links).
Navigation is actually quite tight (under 20 links) by modern standards.
-
As eyepaq said, the 100-link limit is no longer the case. However, even if Google is able to give value to them all, does it really make sense to have so many links on your page? Are you using fat footers? Don't rely on that structure to give value to your internal pages; if you find that 100 links on one page are needed for users to navigate your site, try to restructure it a little and create different categories.
I don't know how much value is lost after 100 links, but you should try to have smaller, themed lists of links, adding a further step to your navigation. Google won't give the same value to those pages, as users won't either.
-
Hi,
You shouldn't count those at all. If you get stuck counting and calculating PR and how much PR is passed from one page to another, you will lose focus on what does matter. This doesn't.
About the 100 links per page - that was a very old technical limitation on Google's side. It is no longer the case.
See more here: http://www.mattcutts.com/blog/how-many-links-per-page/
and a quick two-or-so-minute video from Matt Cutts here: http://www.youtube.com/watch?v=l6g5hoBYlf0
So the bottom line is that you should not count and focus on PR and how much PR is passed. Only look at things as a normal user would and ask yourself: does this page make sense? Does it make sense to have over 100 links on this page?
Not sure if this was the answer you are looking for but ... hope it helps.
Cheers.
-
I used 'PR' mainly because 'juice points' sounded stupid.
I'm more interested in what happens past the ~100 links.
Does the remaining juice get reallocated or does the page leak at a higher rate?
-
Hi Spry, as you already mentioned, not all links have the same weight: there are navigational links, like those in the footer and the menu, that Google may weight differently; some value may be reduced; and there are other factors Google uses to weight each link on a page that we don't know about but can assume exist.
So, given that we can only calculate an approximate value of the juice passed from one link to another, I wouldn't rely so much on PR; the time you're spending on these calculations could go to other tasks. In general, you may assume that the best pages to obtain links from are those nearest to the homepage of a site and with the fewest outgoing links (both internal and external).
Don't rely so much on PR. I've seen so many low-PR pages ranking well and high-PR pages with no rankings that I think you need to consider other parameters that are more important when it comes to link building: age of the domain, authority, topic relevance, etc.
If your calculations are for onsite optimization, just try to have your main pages higher in your site structure and linked directly from the homepage or from main categories.