Link Juice + multiple links pointing to the same page
-
Scenario
The website has a menu consisting of 4 links: Home | Shoes | About Us | Contact Us
Additionally, within the body content we write about various shoe types. We create a link with the anchor text "Shoes" pointing to www.mydomain.co.uk/shoes.
In this simple example, we have 2 instances of a link pointing to the same URL.
We have 4 unique links.
In total we have 5 on-page links.
Question
How many links would Google count as part of the link juice model?
How would the link juice be weighted in terms of percentages?
Would changing the anchor text in the body content to, say, "fashion shoes" have a different impact?
Any other advice or best practice would be appreciated.
Thanks Mark
-
Hi Remus & Kurt,
Thank you for your advice.
Mark
-
Remus's answer is good. I would add that Google has a "first link counts" filter: if you have two links pointing from page A to page B, Google only passes link authority (pagerank) and reputation (keywords in the anchor text and relevant surrounding text) through the first link that appears in the code. The second link does not pass anything, so whatever the anchor text of the first link in the code is, that's the anchor text Google will use (Remus is right that anchor text has become less important).
The second link does, however, still dilute the amount of pagerank passed. As Remus pointed out, each link in your scenario only passes 20% of the pagerank, and since Google ignores the second link to the shoes page, that 20% of pagerank does not get passed. I'm not sure whether it stays on the page or is simply lost.
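To put rough numbers on that, here's a minimal sketch of the simplified model being described. The flat 1/N split and the assumption that the filtered duplicate's share simply evaporates are assumptions made for illustration, not anything Google has confirmed:

```python
# Minimal sketch of the simplified "link juice" split described above.
# Assumptions (not confirmed by Google): passable pagerank is split evenly
# across all on-page links, and a second link to an already-linked URL is
# ignored by the first-link filter, so its share is simply lost.

PASSABLE_PAGERANK = 1.0  # whatever the page has available to pass

# The five on-page links from the scenario, in source-code order.
links = [
    ("Home", "/"),
    ("Shoes", "/shoes"),          # menu link - appears first in the code
    ("About Us", "/about-us"),
    ("Contact Us", "/contact-us"),
    ("Shoes", "/shoes"),          # in-content link - duplicate target
]

share_per_link = PASSABLE_PAGERANK / len(links)   # 0.2, i.e. 20% each

passed = {}        # url -> pagerank actually passed
seen_urls = set()
lost = 0.0

for anchor, url in links:
    if url in seen_urls:
        # First-link filter: the duplicate passes nothing in this model.
        lost += share_per_link
        continue
    seen_urls.add(url)
    passed[url] = passed.get(url, 0.0) + share_per_link

print(passed)  # {'/': 0.2, '/shoes': 0.2, '/about-us': 0.2, '/contact-us': 0.2}
print(f"lost to the filtered duplicate: {lost:.0%}")  # 20%
```

Under those assumptions the shoes page ends up with the same 20% it would get from a single link, while the extra link still shrinks every other link's share of the split (from 25% with 4 links to 20% with 5).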
So, what does this all mean? From an SEO standpoint, if you have more than one link to a page, you want the link with the targeted keyword to appear first in the code, and ideally you wouldn't have two links to the same page on one page at all. From a user perspective, though, it may make perfect sense to have that second link, and the page may convert better with it. You just have to decide which is more important, and it's probably the user perspective that matters more.
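If you want to check which link appears first for each destination on a given page, a small script like the one below lists a page's anchors in source order and groups them by target URL. This is a rough sketch using only Python's standard library; the sample markup is made up for illustration:

```python
# Rough sketch: collect every <a href> on a page in source order, grouped by
# destination, so you can see which anchor text appears first for each URL.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self._current_href = None
        self._current_text = []
        self.links = []  # (href, anchor_text) in source order

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self._current_href = href
                self._current_text = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._current_text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.links.append((self._current_href, "".join(self._current_text).strip()))
            self._current_href = None

def links_by_destination(html):
    parser = LinkCollector()
    parser.feed(html)
    grouped = {}  # insertion order preserved, so first-seen URLs come first
    for href, text in parser.links:
        grouped.setdefault(href, []).append(text)
    return grouped

# Simplified example page (hypothetical markup, not a real site).
sample = """
<nav><a href="/">Home</a> <a href="/shoes">Shoes</a>
<a href="/about-us">About Us</a> <a href="/contact-us">Contact Us</a></nav>
<p>We stock many <a href="/shoes">fashion shoes</a> for every season.</p>
"""

for href, anchors in links_by_destination(sample).items():
    note = " (duplicate target; the first anchor is the one the filter would count)" if len(anchors) > 1 else ""
    print(f"{href}: {anchors}{note}")
```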
Kurt Steinbrueck
OurChurch.Com -
Hi Mark, really good questions.
- How many links would Google count as part of the link juice model?
There are a lot of opinions about this subject and no clear answer (it's really hard to test). Some time ago Google removed the effect of the "nofollow" attribute for internal links to cut off the advantage SEOs gained from "pagerank sculpting"; I think they did this so that search engine optimizers don't have a big advantage over standard websites. My personal opinion is that, in terms of link juice lost, Google would count all 5 links, but the page benefiting won't get double the value: Google would probably only count the advantages of one of those links, whichever is best (likely the one in the content). On the other hand, the link juice lost is not that important, since the rest of the pages won't necessarily rank for popular terms.
I think that in-content links get far more advantages than just the "juice" and the anchor text. The neighboring text is also important, and so is the fact that the link sits in a block of text. It also brings value to the users, who might want to see all the shoe models while reading about them. I think you should definitely use this approach; just make sure you don't take it to an extreme.
-
20% to each link, but the shoes page won't get 20% x 2 from those 2 duplicates; maybe it will get 25% plus some other advantages (personal opinion!)
-
Changing anchor text had some effect in the past, but recently anchor text has carried less and less weight. It probably still has value; it's still an important ranking factor for 2013, and I would use it if I were you. But I would take it to a new level and also think about the words in the context around the link. Try to link from all the relevant sections of the website, and as you point to the shoes page from different contexts the anchor text will naturally change. For example, you could link through your "shoe collection" from an article that compares your shoes with competitor shoes.
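One rough way to keep an eye on that variation is to tally the anchor texts your internal links use for each target page. The sketch below does that over a made-up list of links; the crawl data and page names are hypothetical:

```python
# Minimal sketch: tally the anchor texts used for internal links to each
# target URL, to check whether the variation looks natural or repetitive.
from collections import Counter, defaultdict

# (source page, anchor text, target URL) - hypothetical crawl output.
internal_links = [
    ("/", "Shoes", "/shoes"),
    ("/blog/summer-trends", "fashion shoes", "/shoes"),
    ("/blog/brand-comparison", "our shoe collection", "/shoes"),
    ("/blog/care-guide", "Shoes", "/shoes"),
    ("/", "Contact Us", "/contact-us"),
]

anchors_per_target = defaultdict(Counter)
for source, anchor, target in internal_links:
    anchors_per_target[target][anchor] += 1

for target, counts in anchors_per_target.items():
    print(target)
    for anchor, n in counts.most_common():
        print(f"  {n} x '{anchor}'")
```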
I wrote an article for YouMoz a few years ago; some concepts might be a bit outdated because the ranking factors have changed a lot since then, but it might give you some ideas to explore from a new perspective -> An Intelligent Way to Plan Your Internal Linking Structure