Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Breadcrumbs and internal links
-
Hello,
I used to help users move up my site structure with links in the content. I have now installed breadcrumbs. Is it still useful to keep the links in the content, or is there no need to duplicate those links? Are the breadcrumb links enough?
Thank you,
-
Thanks for your comment, Paul.
-
Glad to help
-
Thank you both for your answers. They are very helpful and everything is clear. I now know that it is best to have both.
-
I think Roman's response is thorough and well reasoned. I'm a content strategist (not a designer or developer), so I like the way his answer puts the user front and center. Bottom line: do in-text links and breadcrumb links both help users? Yes, depending on where you are on the page and how deep the page is. My instinct on breadcrumbs is that they're especially helpful once you get a couple of pages deep into a site and a user might start to get a bit disoriented. My in-text links are more often driven by the content itself and by what will provide added value to the user (or potentially SEO value to another page on the site). Hope that's helpful.
-
As I see it, you have a question about duplicated links, and the answer depends on your needs, so let me explain my point of view.
Why Redundant Links on the Same Page Are a Good Idea
There are many reasons why you might want to show duplicate links on the same page. Here are some common motivations:
- Provide safety nets: If people don’t notice the link the first time, maybe they will notice the second occurrence as they scroll the page. The redundancy may minimize individual differences: one person might notice the link at the top, while another person might notice it at the bottom. Showing links in multiple places is thus hypothesized to capture a broader audience.
- Deal with long pages: Having to scroll all the way up to the top of an overly long page is time-consuming. Offering users alternative ways to access links will help alleviate the pain.
- Create visual balance: Empty space is common on top-level (wayfinding) pages, where content might be sparse or nonexistent. Filling in awkward white space with extra copies of links will make the page look more balanced.
- Follow the evidence: Analytics show that traffic to desired destination pages increases when links to them are duplicated.
Why Redundant Links Are a Bad Idea (Most of the Time)
Redundancy can be good or bad depending on when it’s applied. Each of the explanations above may sound reasonable. However, relying on redundancy too frequently or without careful consideration can turn your site into a navigation quagmire. What’s the big deal about having a few duplicate links on the page?
- Each additional link increases the interaction cost required to process the link because it raises the number of choices people must process. The fewer the choices, the faster the processing time.
- Each additional link depletes users’ attention because it competes with all others. Users only have so much attention to give and often don’t see stuff that’s right on the screen. So when you grab more attention for one link, you lose it for the others: there’s substantial opportunity cost to extra linking.
- Each additional link places an extra load on users’ working memory because it forces people to remember whether they have seen the link before or whether it is a new link. Are the two links the same or different? Users often wonder if there is a difference that they missed. In usability studies, we often observe participants pause and ponder which one they should click. The more courageous users click on both links, only to be disappointed when they discover that the links lead to the same page. Repetitive links often set users up to fail.
- Extra links waste users’ time whenever users don’t realize that two links lead to the same place: if they click both links, then the second click is wasteful at best. At worst, users also don’t recognize that they’ve already visited the destination page, causing them to waste even more time on a second visit to that page. (Remember that to you, the distinctions between the different pages on your site are obvious. Not so for users: we often see people visit the same page a second time without realizing that they’ve already been there.)
**CONCLUSION**
Sometimes navigation is improved when you have more room to explain it. If this is the case, duplicating important navigational choices in the content area can give you more flexibility to supplement the links with more detailed descriptions to help users better understand the choices.
Providing redundancy on webpages can sometimes help people find their way. However, redundancy increases the interaction cost. Duplicating links is one of the four major dangerous navigation techniques that cause cognitive strain. Even if you increase traffic to a specific page by adding redundant links to it, you may lose return traffic to the site from users who are confused and can’t find what they want.
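One technical footnote on the breadcrumbs themselves: whichever way you go on in-content links, many breadcrumb implementations also expose the trail to search engines as schema.org BreadcrumbList structured data, so the breadcrumb can carry value beyond the visible links. A minimal, illustrative Python sketch of that JSON-LD (the page names and URLs below are invented):

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList JSON-LD from a list of (name, url) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }

# Hypothetical trail for a page two levels below the homepage.
trail = [
    ("Home", "https://www.example.com/"),
    ("Guides", "https://www.example.com/guides/"),
    ("Breadcrumbs", "https://www.example.com/guides/breadcrumbs/"),
]
print(json.dumps(breadcrumb_jsonld(trail), indent=2))
```

The output is what would normally sit in a script type="application/ld+json" block alongside the visible breadcrumb trail.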
Related Questions
-
Do internal links from non-indexed pages matter?
Hi everybody! Here's my question. After a site migration, a client has seen a big drop in rankings. We're trying to narrow down the issue. It seems that they have lost around 15,000 links following the switch, but these came from pages that were blocked in the robots.txt file. I was wondering if there was any research that has been done on the impact of internal links from no-indexed pages. Would be great to hear your thoughts! Sam
Intermediate & Advanced SEO | | Blink-SEO0 -
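As a quick diagnostic for a situation like Sam's, it can help to confirm programmatically which of the old linking pages really are disallowed by robots.txt. A rough sketch using only the Python standard library (the URLs are placeholders):

```python
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def blocked_by_robots(urls, user_agent="Googlebot"):
    """Return the subset of URLs that the site's robots.txt disallows for the given agent."""
    parsers = {}
    blocked = []
    for url in urls:
        root = "{0.scheme}://{0.netloc}".format(urlparse(url))
        if root not in parsers:
            rp = RobotFileParser(root + "/robots.txt")
            rp.read()  # fetch and parse the live robots.txt
            parsers[root] = rp
        if not parsers[root].can_fetch(user_agent, url):
            blocked.append(url)
    return blocked

# Hypothetical internal pages that used to pass links before the migration.
print(blocked_by_robots([
    "https://www.example.com/old-category/",
    "https://www.example.com/blog/some-post/",
]))
```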
SEO Impact of High Volume Vertical and Horizontal Internal Linking
Hello Everyone - I maintain a site with over a million distinct pages of content. Each piece of content can be thought of as a node in a graph database, or an entity. While there is a bit of natural hierarchy, every single entity can be related to one or more other entities. The conceptual structure of the entities is like so:
- Agency - A top-level business unit (~100 pages/URLs)
- Office - A lower-level business unit, part of an Agency (~5,000 pages/URLs)
- Person - Someone who works in one or more Offices (~80,000 pages/URLs)
- Project - A thing one or more People are managing (~750,000 pages/URLs)
- Vendor - A company that is working on one or more Projects (~250,000 pages/URLs)
- Category - A descriptive entity, defining one or more Projects (~1,000 pages/URLs)
Each of these six entities has a unique URL and its own content. For each page/URL, there are internal links to each of the related entity pages. For example, if a user is looking at a Project page/URL, there will be an internal link to one or more Agencies, Offices, People, Vendors, and Categories. Also, a Project will have links to similar Projects. The same holds true for all other entities as well: People pages link to their related Agencies, Offices, Projects, Vendors, etc. If you start to do the math, there are tons of internal links leading to pages with tons of internal links leading to pages with tons of internal links. While our users enjoy the ability to navigate this world according to these relationships, I am curious whether we should enforce a stricter hierarchy for SEO purposes. Essentially, does it make sense to "nofollow" all of the horizontal internal links for a given entity page/URL? For search engine indexing purposes, we have legit sitemaps that give a simple vertical hierarchy... but I am curious whether all of this internal linking should be hidden via nofollow...? Thanks in advance!
Intermediate & Advanced SEO | | jhariani2 -
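If a stricter hierarchy were enforced, the mechanical side of the decision could be as simple as classifying each link by the entity types it connects. A hypothetical Python sketch (it narrows "horizontal" to same-type links purely for illustration, and it says nothing about whether sculpting with nofollow is advisable, which is the actual question):

```python
def link_rel(source_type, target_type):
    """Treat same-type links (e.g. Project -> similar Project) as horizontal and nofollow them;
    keep cross-type links (e.g. Project -> Agency) followed."""
    return "nofollow" if source_type == target_type else "follow"

print(link_rel("project", "project"))  # nofollow (horizontal)
print(link_rel("project", "agency"))   # follow (vertical/relational)
```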
Internal links and URL shorteners
Hi guys, what are your thoughts on using bit.ly links as internal links in blog posts on a website? Some posts have 4 or 5 bit.ly links going to other pages of our website (noindexed pages). I have nofollowed them so no SEO value is lost; also, the links go to noindexed pages, so there is no need to pass SEO value directly. However, what are your thoughts on how Google will see internal links which have essentially become redirect links? They are bit.ly links going to result pages, basically. Am I also right to assume that tracking for internal links would be better using Google Analytics functionality? Is bit.ly accurate for tracking clicks? Any advice much appreciated, I just wanted to double-check this.
Intermediate & Advanced SEO | | pauledwards0 -
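One practical step here, whatever the SEO verdict, is to resolve each shortened link to its final destination so that it can be swapped for a direct internal href. A rough sketch, assuming the requests package and placeholder short links:

```python
import requests

def resolve(short_url):
    """Follow redirects and return the final landing URL."""
    resp = requests.head(short_url, allow_redirects=True, timeout=10)
    return resp.url

# Hypothetical shortened internal links found in old blog posts.
for short in ["https://bit.ly/xxxxxxx", "https://bit.ly/yyyyyyy"]:
    print(short, "->", resolve(short))
```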
Links from non-indexed pages
Whilst looking for link opportunities, I have noticed that the website has a few profiles from suppliers or accredited organisations. However, a search form is required to access these pages, and when I type cache:"webpage.com" the page shows up as non-indexed. These are good websites, not spammy directory sites, but is it worth trying to get Google to index the pages? If so, what is the best method to use?
Intermediate & Advanced SEO | | maxweb0 -
Link Research Tools - Detox Links
Hi, I was doing a little research on my link profile and came across a tool called "LinkResearchTools.com". I bought a subscription and tried them out. Running the report, they advised low risk overall but identified 78 Very High Risk to Deadly (are they venomous?) links, around 5% of the total, and advised removing them. They also flagged many suspicious and low-risk links, but this seems to be because they have no knowledge of those links and so default to a negative rating. So before I do anything rash and start removing my Deadly links, I was wondering if anyone has a) used them and would recommend them, b) would recommend detoxing by removing the deadly links, and c) whether there are any cases in which removing so-called Deadly links causes more problems than it solves, such as by making it harder to maintain a normal-looking profile, since everyone is likely to have some bad links (although my thinking may be off on that one). What do you think? Adam
Intermediate & Advanced SEO | | NaescentAdam0 -
How to detect bad neighborhood links?
I have the feeling that I am suffering from negative SEO, so is there a way to get a list of the links that I should remove or submit to the Google disavow links tool?
Intermediate & Advanced SEO | | Valarlf0 -
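Identifying which referring domains belong to a bad neighborhood still means reviewing a backlink export (from Search Console or a link tool such as the one discussed in the previous question), but once you have that list, producing the file for the disavow tool is simple. A small sketch with placeholder domains, written in the plain-text format the disavow tool accepts (domain: lines, single URLs, and # comments):

```python
bad_domains = ["spammy-neighborhood.example", "link-farm.example"]   # placeholders
bad_urls = ["http://another-site.example/paid-links.html"]           # placeholder

with open("disavow.txt", "w") as f:
    f.write("# Links reviewed manually and judged to be negative SEO\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")   # disavows every link from that domain
    for url in bad_urls:
        f.write(url + "\n")             # disavows a single linking page
```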
How does Google treat internal links with rel="nofollow"?
Today, I was reading about NoFollow on Wikipedia. The following statement is over my head, and I am not able to understand it properly: "Google states that their engine takes "nofollow" literally and does not "follow" the link at all. However, experiments conducted by SEOs show conflicting results. These studies reveal that Google does follow the link, but does not index the linked-to page, unless it was in Google's index already for other reasons (such as other, non-nofollow links that point to the page)." That part is about indexing and ranking the hyperlink text of external links for specific keywords; I am aware of that, and the linked page may not show up as a relevant result for any keyword in Google web search. But what about internal links? I have set the rel="nofollow" attribute on too many internal links. I have an archived blog post from Randfish on the same subject, and I read the following question there. Q. Does Google recommend the use of nofollow internally as a positive method for controlling the flow of internal link love? [In 2007] A: Yes – webmasters can feel free to use nofollow internally to help tell Googlebot which pages they want to receive link juice from other pages.
(Matt's precise words were: The nofollow attribute is just a mechanism that gives webmasters the ability to modify PageRank flow at link-level granularity. Plenty of other mechanisms would also work (e.g. a link through a page that is robot.txt'ed out), but nofollow on individual links is simpler for some folks to use. There's no stigma to using nofollow, even on your own internal links; for Google, nofollow'ed links are dropped out of our link graph; we don't even use such links for discovery. By the way, the nofollow meta tag does that same thing, but at a page level.) Matt also gave an excellent answer to the following question. [In 2011] Q: Should internal links use rel="nofollow"? A: Matt said: "I don't know how to make it more concrete than that." I use nofollow for each internal link that points to an internal page that has the meta name="robots" content="noindex" tag. Why should I waste Googlebot's resources, and those of my server, if in the end the target must not be indexed? As far as I can tell, and for years now, this has not caused any problems at all. For internal page anchors (links with the hash mark in front, like "#top"), the answer is "no", of course. I am still using nofollow attributes on my website. So, what is the current trend? Is it still necessary to use the nofollow attribute for internal pages?
Intermediate & Advanced SEO | | CommercePundit0 -
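As a practical aside to this question, it is easy to audit which internal links on a page currently carry rel="nofollow" before deciding whether to keep the attribute. A minimal sketch, assuming the requests and beautifulsoup4 packages and a placeholder URL:

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def internal_nofollow_links(page_url):
    """List same-host links on a page that carry rel="nofollow"."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    host = urlparse(page_url).netloc
    flagged = []
    for a in soup.find_all("a", href=True):
        href = urljoin(page_url, a["href"])
        if urlparse(href).netloc == host and "nofollow" in (a.get("rel") or []):
            flagged.append(href)
    return flagged

print(internal_nofollow_links("https://www.example.com/"))
```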
Canonical Tag and Affiliate Links
Hi! I am not very familiar with the canonical tag. The thing is that we are getting traffic and links from affiliates. The affiliate links add something like this to our URLs: www.mydomain.com/category/product-page?afl=XXXXXX. At this moment we have almost 2,000 pages indexed with that code at the end of the URL, so they are all duplicates. My other concern is that I don't know whether those affiliate links are giving us any link juice or not. I mean, if an original product page has 30 links and the affiliate copies have 15 more... are all those links counted together by Google? Or are we losing all the juice from the affiliates? Can I fix all this with the canonical tag? Thanks!
Intermediate & Advanced SEO | | jorgediaz0
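A rel="canonical" on each ?afl= variant pointing back to the clean product URL is the usual fix here, and Google generally consolidates indexing signals onto the canonical version. As a small illustration (the afl parameter name comes from the question; everything else is hypothetical), the canonical URL can be derived by stripping the tracking parameter:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

TRACKING_PARAMS = {"afl"}  # affiliate parameter mentioned in the question

def canonical_url(url):
    """Drop tracking parameters so every variant can point at one canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("https://www.mydomain.com/category/product-page?afl=XXXXXX"))
# -> https://www.mydomain.com/category/product-page
# i.e. the value to emit in <link rel="canonical" href="...">
```

The clean product page should also carry a self-referencing canonical so the preferred version is unambiguous.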