How do I reduce internal links & cannibalisation from primary navigation?
-
The SEOmoz tools are reporting that each page on our site contains in excess of 200 internal links, mostly from our primary navigation menu, which it says is too many.
This also causes cannibalisation on the word 'towels', which I would like to avoid if possible.
Is there a way to reduce the number of internal links whilst maintaining a good structure that allows link juice to filter through the site, and also reduce cannibalisation?
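For anyone wanting to verify these numbers themselves rather than relying on a tool's report, here is a minimal sketch that counts internal versus external links on a page using only Python's standard library. The domain and HTML fragment below are illustrative, not taken from the actual site; a real audit would fetch each page's HTML first.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class InternalLinkCounter(HTMLParser):
    """Counts <a href> links that point to the same domain as the site."""
    def __init__(self, site_domain):
        super().__init__()
        self.site_domain = site_domain
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        domain = urlparse(href).netloc
        # Relative URLs and same-domain absolute URLs count as internal.
        if domain in ("", self.site_domain):
            self.internal += 1
        else:
            self.external += 1

# Illustrative page fragment only.
html = """
<nav>
  <a href="/towels/">Towels</a>
  <a href="/towels/hand-towels/">Hand Towels</a>
  <a href="http://www.example.com/bathrobes/">Bathrobes</a>
  <a href="http://www.other-site.com/">Partner site</a>
</nav>
"""
counter = InternalLinkCounter("www.example.com")
counter.feed(html)
print(counter.internal, counter.external)  # 3 internal, 1 external
```

Running something like this over every page of the site would show exactly which templates (primary nav, footer, sub nav) contribute most of the 200+ links.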
-
Hi Marcus.
That sounds like a good strategy. I am going to look at different structures on our dev site and see what works best overall, both technically and, more importantly, from the customer's point of view.
I'll definitely have a look at the apps you mentioned. They seem really interesting.
Once again, many thanks for your advice. Really helpful indeed.
Regards
Craig
-
Hey Fraser,
Taking a quick look again, you have a three-tier nav, so you could simplify it so it is only a two-tier nav.
So,
Bed & Bath
- Bath Linen
- Bedroom
Embroidery
- Towels
- Bathrobes
- Kids
- Sports & Spa
The others all seem to be a single depth anyhow, so this would make the main sections of the site easy to navigate, and you still have your sub navigation, which you could then highlight a little more in the design.
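To put rough numbers on why dropping a tier helps, here is a back-of-the-envelope sketch. The nav tree mirrors the example structure above, but the third-tier entries and their counts are made up for illustration; every page carries the full drop-down, so the per-page link count falls by the size of the tier you remove.

```python
# Hypothetical nav tree; third-tier leaves are illustrative, not the
# real site's categories.
nav = {
    "Bed & Bath": {
        "Bath Linen": ["Hand Towels", "Bath Towels", "Bath Sheets"],
        "Bedroom": ["Duvet Covers", "Pillowcases"],
    },
    "Embroidery": {
        "Towels": ["Hand Towels", "Bath Towels"],
        "Bathrobes": ["Kids", "Adults"],
        "Kids": [],
        "Sports & Spa": [],
    },
}

def count_links(tree, max_depth):
    """Count nav links rendered on every page when the menu is capped
    at max_depth levels."""
    total = 0
    for section, subsections in tree.items():
        total += 1  # tier 1 link
        if max_depth < 2:
            continue
        for sub, leaves in subsections.items():
            total += 1  # tier 2 link
            if max_depth >= 3:
                total += len(leaves)  # tier 3 links
    return total

print(count_links(nav, 3))  # full three-tier drop-down, on every page
print(count_links(nav, 2))  # flattened: tier-3 links move to the sub nav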
So, if someone clicked on Towels, we would have the highlighted section and sub nav, but we could box it and add a header, something like:
Browse Towels
- Hand Towels
- Bath Towels
- etc.
My advice would always be to try it, see if it works out for you and improves any metrics you are tracking. Maybe install something like Crazy Egg heatmaps or click tracking so you can see how people are using the current nav (if at all) vs the sub navigation.
Certainly, a sub navigation is a convention (Amazon etc. use it), so people are used to this way of browsing. Just make sure your design shines a light on what you want people to do, and I don't see any usability issues.
Hope this helps!
Marcus
-
Hi Marcus,
That is correct for our site.
I had thought of this and agree it's technically the easiest way to deal with this; however, I wonder how problematic it would be for a customer trying to navigate, say, from a product to a completely different category, as there would be no easy way to do this. I am not convinced that customers would use the breadcrumb trail to do so.
Maybe there is a way of utilising the primary navigation but limiting the number of links from it.
Thanks for your help.
Craig
-
Hey Fraser,
Is this your site: www.towelsrus.co.uk?
If so, the simple answer here would be to lose your drop-down menus and give that a go. You already have a sub menu on each page once the user clicks through, and a breadcrumb, so someone can easily anchor themselves within the site should they land on one of these pages.
You then have a sub navigation on the primary category pages that allows people to navigate down to the long-tail sub category pages like this one: Egyptian Cotton Hooded Bathrobes - which I am sure does not need to be linked to from every single page.
That should be technically easy enough and will make for a better overall site structure, which is still easily spiderable via the sub navs. Equally, the cannibalisation is likely not as big an issue as it appears, but you will be removing multiple instances of keywords from the nav, so that's not going to hurt.
I would give this a read:
http://www.seomoz.org/blog/how-many-links-is-too-many
Hope that helps!
Marcus