Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Link Juice + multiple links pointing to the same page
-
Scenario
The website has a menu consisting of 4 links: Home | Shoes | About Us | Contact Us
Additionally, within the body content we write about various shoe types. We create a link with the anchor text "Shoes" pointing to www.mydomain.co.uk/shoes
In this simple example, we have 2 instances of the same link pointing to the same URL.
We have 4 unique links.
In total we have 5 on-page links.

Question
How many links would Google count as part of the link juice model?
How would the link juice be weighted in terms of percentages?
Would changing the anchor text in the body content to, say, "fashion shoes" have a different impact?

Any other advice or best practice would be appreciated.
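For concreteness, here is the scenario as a minimal sketch, using a simplified (hypothetical) version of the page's markup and counting its links:

```python
from html.parser import HTMLParser

# Hypothetical, simplified markup for the page described above.
PAGE = """
<nav>
  <a href="/">Home</a>
  <a href="/shoes">Shoes</a>
  <a href="/about-us">About Us</a>
  <a href="/contact-us">Contact Us</a>
</nav>
<p>We stock many <a href="/shoes">Shoes</a> for every season.</p>
"""

class LinkCollector(HTMLParser):
    """Collect every href on the page, in document order."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.hrefs.append(value)

parser = LinkCollector()
parser.feed(PAGE)

total_links = len(parser.hrefs)          # 5 on-page links
unique_targets = len(set(parser.hrefs))  # 4 unique destinations
```

So the question is how that one duplicate target (/shoes appearing twice) is treated.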
Thanks Mark
-
Hi Remus & Kurt,
Thank you for your advice.
Mark
-
Remus's answer is good. I would add that Google has a first-link filter. If you have two links pointing from page A to page B, Google only passes link authority (PageRank) and reputation (keywords in the anchor text and relevant surrounding text) through the first link that appears in the code. The second link does not pass anything. So, whatever the anchor text of the first link in the code is, that's the anchor text Google is going to use (Remus is right that anchor text has become less important).
The second link does, however, dilute the amount of PageRank passed. So, as Remus pointed out, each link in your scenario only passes 20% of the PageRank. Since Google ignores the second link to the shoes page, that 20% of PageRank does not get passed. I'm not sure whether it stays on the page or just gets lost.
So, what does this all mean? From an SEO standpoint, if you have more than one link to a page, you want the link with the targeted keyword to be first in the code. Ideally, you wouldn't have two links to the same page at all. That's the SEO perspective; from a user perspective, it may make perfect sense to have that second link, and the page may convert better. So you'd just have to decide which is more important... and it's probably the user perspective.
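As a toy numeric sketch of what I've described (my own model of the filter, not Google's published algorithm):

```python
def passed_pagerank(hrefs, page_rank=1.0):
    """Toy first-link-filter model (an assumption, not Google's actual
    algorithm): the page's PageRank is split evenly across ALL links,
    but only the first link to each target passes its share."""
    share = page_rank / len(hrefs)   # 5 links -> 20% each
    passed = {}
    for href in hrefs:
        if href not in passed:       # first-link filter: later duplicates
            passed[href] = share     # still dilute, but pass nothing
    return passed

# The 5 on-page links from the scenario (duplicate /shoes link last).
links = ["/", "/shoes", "/about-us", "/contact-us", "/shoes"]
result = passed_pagerank(links)
# Each unique target receives a 0.2 share; the duplicate /shoes link's
# 0.2 share is simply never passed on in this model.
```

In this model the page "leaks" the duplicate link's 20%, which is the open question of whether it stays on the page or gets lost.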
Kurt Steinbrueck
OurChurch.Com
-
Hi Mark, really good questions.
- How many links would Google count as part of the link juice model?
There are a lot of opinions on this subject and no clear answer (it's really hard to test). Some time ago Google removed the effect of the "nofollow" attribute for internal links, to cut the advantage SEOs gained through "PageRank sculpting". I think they did this so that search engine optimizers don't have a big advantage over standard websites. My personal opinion is that in terms of link juice lost, Google would count 5, but the benefiting page won't get double the value. I think Google would only count the advantages of one of those links, whichever is best (probably the one in the content). On the other hand, the link juice lost is not so important; the rest of the pages won't necessarily rank for popular terms.
I think that in-content links get far more advantages than just the "juice" and anchor text. The neighboring text is also important, and the fact that the link sits in a block of text is also important. It also brings value to the users, who might want to see all the shoe models when reading about them. I think you should definitely use this approach; just make sure you don't take it to an extreme.
-
- How would the link juice be weighted in terms of percentages?
20% to each link, but the shoes page won't get 20% x 2 from those two duplicates; maybe it will get 25% plus some other advantages (personal opinion!)
-
- Would changing the anchor text in the body content have a different impact?
Changing anchor text had some effect in the past, but recently anchor text has had less and less importance. It probably still has value; it's still an important ranking factor in 2013, and I would use it if I were you. But I would take it to a new level and also think about the words in the context of the link. Try to link from all the relevant sections of the website, and as you point to the shoes page from different contexts, the anchor text will naturally change. For example, you could link through "our shoe collection" from an article that compares your shoes with competitor shoes.
I wrote an article for YouMoz a few years ago; some concepts might be a bit outdated because the ranking factors have changed a lot since then, but it might give you some ideas to explore from a new perspective:
-> An Intelligent Way to Plan Your Internal Linking Structure