Link flow for multiple links to same URL
-
Hi there,
My question is as follows: how does Google handle link flow if two links on a given page point to the same URL? (Do they each pass link value individually, or not?)
This may seem like a newbie question, but there actually seems to be little evidence and little consensus in the SEO community about this detail.
- Answers should include a source
- Information about the current state of the art at Google is preferable
- The question is not about anchor text, general best practices for linking, "PageRank is dead", etc.
We do know that the "historical" PageRank was implemented (a long time ago) without special handling for multiple links, as most recently stated by Matt Cutts in this video: http://searchengineland.com/googles-matt-cutts-one-page-two-links-page-counted-first-link-192718
On the other hand, many people in the SEO community say that only the first link counts. So far, however, I have not been able to find any data to back this up, which is quite surprising.
-
I totally agree on the focus point in general: it's not helpful to make layout decisions and the like with PageRank in mind.
But: for large websites (say 100,000 pages and up), crawl rate, indexing and rankings of the deeper parts of the site depend heavily on the internal link graph. Taking a closer look at the internal link graph gives us a lot of useful information in these cases, doesn't it?
Now think of links sitting in a template that is used on 50,000 pages. A small change there is likely to make quite a difference to the internal link graph.
For example, I've run PageRank simulations with both models on a smaller website with only 1,500 pages / 100,000 links (a simplified sketch of that kind of simulation is below). For many pages, this small difference ends up as 20-30% more or less internal PageRank; for those individual pages, that could be crucial for crawling, indexation and rankings. Still not useful?
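To make the comparison concrete, here is a minimal sketch of such a simulation in Python. It is not Moz's or Google's implementation; the graph, damping factor and iteration count are purely illustrative assumptions, and the only difference between the two runs is whether duplicate links from one page to the same URL are counted once or every time.

```python
# Toy iterative PageRank under two assumptions about duplicate links on a page
# (illustrative sketch only, not any search engine's actual implementation):
#   model "all"   - every link is counted, so two links to the same URL pass value twice
#   model "first" - duplicate links to the same URL are collapsed to one

def pagerank(pages, outlinks, model="all", damping=0.85, iterations=50):
    """outlinks: dict mapping page -> list of target pages (may contain duplicates)."""
    # Collapse duplicates if we assume only the first link to a URL counts.
    if model == "first":
        links = {p: list(dict.fromkeys(targets)) for p, targets in outlinks.items()}
    else:
        links = outlinks

    n = len(pages)
    rank = {p: 1.0 / n for p in pages}

    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for p, targets in links.items():
            if not targets:
                continue
            share = rank[p] / len(targets)  # each link, duplicate or not, passes an equal share
            for t in targets:
                new_rank[t] += damping * share
        rank = new_rank
    return rank


# Tiny illustrative graph: page "A" links to "B" twice (e.g. a template link plus a body link).
pages = ["A", "B", "C"]
outlinks = {"A": ["B", "B", "C"], "B": ["A"], "C": ["A"]}

for model in ("all", "first"):
    result = pagerank(pages, outlinks, model=model)
    print(model, {p: round(r, 3) for p, r in result.items()})
```

On this toy graph, counting the duplicate shifts an extra sixth of page A's passable value towards B; repeated across a site-wide template, that kind of per-page shift is what can produce swings like the 20-30% mentioned above.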
Since Moz runs its own iterative PageRank-like algorithms: how do you handle this with mozRank / mozTrust? Which model leads to better correlations with rankings?
-
- The links both get PageRank flow...
- The link value gets divided, though, so it wouldn't exactly double the value.
- The link extraction process might choose to select only one link from the page based on certain factors (perhaps ignoring some links not because they are duplicative, but based on location or other qualifiers).
Here is Matt Cutts talking about this very issue. And here again. It is the closest thing we have to an answer.
I think the "first link counts" idea is really an extension of how PageRank works. Say a page has one outbound link: that link gets 100% of the value the page can pass. Now add a second link to the exact same URL. Each link now gets 50%, and the sum is still 100%, so it is as if the second link were never added. But the calculation changes depending on the other links on the page. Say a page has two links, one to you and one to someone else: the split is 50/50. Add a second link to your URL and you jump to 67/33, which is slightly better. As the number of links on the page grows, an additional link to your URL approaches a doubling of the first link's value. So at one end of the spectrum the duplicate link is worthless; at the other end it roughly doubles your share.
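A quick sketch of that arithmetic, under the simple assumption that a page's passable value is divided equally among all outbound links, duplicates included (the numbers are illustrative, not measured):

```python
# Fraction of a page's passable value that flows to your URL, assuming equal
# division among all outbound links, duplicates included (illustrative only).

def share_to_you(your_links, other_links):
    total_links = your_links + other_links
    return your_links / total_links

# Only your link on the page: one copy or two makes no difference (100% either way).
print(share_to_you(1, 0), share_to_you(2, 0))    # 1.0 1.0

# One other link on the page: a second copy lifts you from 50% to ~67%.
print(share_to_you(1, 1), share_to_you(2, 1))    # 0.5 0.667

# Many other links: a second copy approaches doubling your share (1% -> ~2%).
print(share_to_you(1, 99), share_to_you(2, 99))  # 0.01 0.0198
```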
The other question is whether anchor text is counted for all links. Some experimentation indicates that only the first anchor text matters, which might also point to the selection / extraction process mentioned in #2.
That all being said, I think I agree with Matt Cutts on this one. This is such a small issue that you really should focus on bigger picture stuff. It is interesting, yes, but not particularly useful.
I hope that helps!