Drop Down Menu - Link Juice Depletion
-
Hi,
We have a site with seven top-level sections, each of which contains a large number of subsections, which may in turn contain further subsections.
To ensure the best user experience, we have a top navigation showing the seven top-level sections; hovering over each reveals a selection of its key subsections.
Although I like this format for users, as it makes it easier for them to find the most important sections and subsections, it does lead to a lot of links on every page of the site. In general, each top section has a drop-down with approximately 10-15 subsections.
This has therefore led to SEOmoz's tools issuing their "too many internal links" warning. Alongside this, I am left wondering whether I have too many links to my subsections and whether I would be better off being more selective about when I link to them. For instance, I could choose the top five subsections and link to them from our homepage, thereby passing a greater amount of link juice down the line.
So I guess my dilemma is between ensuring the user can traverse the site as easily as possible and keeping a close watch on where, and how, our link juice is distributed.
One solution I am considering is whether nofollow links could be used within the drop-down menus. That way I could keep the desired user navigation while gaining greater control over which pages link to which subsections. Would that even work?
Any advice would be greatly appreciated,
Regards,
Guy
-
I work for a company whose site closely mirrors the dropdowns employed by a very large ecommerce site: http://www.surlatable.com/.
Because of the way their dropdowns are designed, there are well over 100 links on that page (over 400, in fact). However, on closer analysis, only 52 of those internal links are followed, while 370 are nofollowed.
Based on this, I would recommend that if you have well in excess of 150 or so links, you use nofollow on the less important subcategories, as this site does. (I'll have to change the structure of our own site now...)
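To make that concrete, here's a minimal sketch of the idea (the helper function and category names are hypothetical, not taken from Sur La Table's actual markup): render a dropdown's links, following only the first few important subsections and marking the rest rel="nofollow".

```javascript
// Hypothetical sketch: follow only the top N dropdown links,
// nofollow the rest.
function renderDropdownLinks(subsections, maxFollowed) {
  return subsections.map(function (s, i) {
    // Links past the cutoff get rel="nofollow"
    var rel = i < maxFollowed ? "" : ' rel="nofollow"';
    return '<a href="' + s.url + '"' + rel + ">" + s.name + "</a>";
  });
}

var links = renderDropdownLinks(
  [
    { name: "Cookware", url: "/cookware/" },
    { name: "Cutlery", url: "/cutlery/" },
    { name: "Clearance", url: "/clearance/" },
  ],
  2
);
// links[0] → '<a href="/cookware/">Cookware</a>'
// links[2] → '<a href="/clearance/" rel="nofollow">Clearance</a>'
```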
-
Really appreciate the informative reply. Like you, I'm currently the SEO for an ecommerce store, which has so many variations and sections that it can be a real headache. We've made good progress flattening the sections, and based on what you said, plus a Matt Cutts article I just read on the subject, I'll remove the nofollows, leave Google to it, and take things from there.
-
This is an interesting question. It also ties in with what to do with a mega menu and all its links. I've wondered whether search engines can recognize that this is in fact navigation (you would think they would) and treat those internal links accordingly, rather than weighing them the same as links in the body of the page.
Google's John Mueller, discussing HTML5, stated the following:
"In general, our crawlers are used to not being able to parse all HTML markup - be it from broken HTML, embedded XML content or from the new HTML5 tags. Our general strategy is to wait to see how content is marked up on the web in practice and to adapt to that ..." http://goo.gl/0YehV
From that statement, you would believe that search engines can differentiate navigation menus and drop-down lists from links in the body. After all, they must have crawled zillions of pages, so that would be a natural conclusion.
That being said, I've used two strategies. The first embeds the select options in JavaScript, something like this:

```javascript
<script type="text/javascript">
document.write("<select>");
document.write("<option value=\"\">Select a Property...<\/option>");
document.write("<optgroup label=\"Beach Estates\">");
document.write("<option value=\"\/alii-estate\/\">Alii Estate<\/option>");
// ...more options...
document.write("<\/optgroup>");
document.write("<\/select>");
</script>
```

This seems to work well, but I'm not sure whether the options are actually crawled. The other strategy, which I'm more in favor of, is to position the drop-down list or navigation with position:absolute; and place the markup physically at the bottom of the page. This seems a better way, but it can affect your sitelinks. I've not done any real testing on this. Burt Gordon
-
Hi Guy. In terms of nofollow for PageRank-sculpting purposes, I've read the pros and cons, and for me I've concluded I'd rather direct the juice where I want it to go than try to block or prevent it from flowing where I don't want it to flow. Nofollow can have unintended results, so I prefer the alternative.
Volume of categories, and how to structure them, is a challenge for a lot of ecommerce folks (me included). I've recently started flattening my site. While developing useful and intuitive sub-categories helps people find what they want by the 3rd or 4th click, crawl penetration suffers due to the depth. By flattening my site, I mean reducing the number of sub-categories that can only be reached through other sub-categories, which basically means moving 3rd- or 4th-level categories up to the second level or the top level (left nav).
A large, top-ranking toy store I visit often to see how they structure their links has a top nav with categories, a left nav with categories, and a sitemap in the footer. Each navigation entry has either different links in it or different anchor text linking to the same pages. After much reading, and apparent consensus among veteran users in this forum, I nixed the sitemap as unnecessary, provided I use good linking practice throughout the site. One guru even suggested a sitemap can hurt your rankings if every page links to every other page, diluting the juice each link passes.
In my case, I created a left-nav link to additional categories and put categories or sub-categories in it that:
1. Were removed from the left nav because they were not important enough to be there.
2. Were removed from the left nav because on-page analytics suggested they didn't warrant being on the homepage.
3. Were 3rd- or 4th-level categories that on-page analytics showed had enough demand to move their links up to the second or top level.
I hope this works for me and could be of some help to you. Good luck.
-
Thank you for the link; I had a read, and I've also been making the nofollow adjustments I suggested above.
We have tried to break the menus down into simple, manageable chunks, so we are really only linking to important categories. That said, we can obviously deem some to be more important to us than others, so I've applied nofollow within the menu to the links that won't generate as much ROI.
Is there any problem with having a nofollow link to a certain page within our menu, and then a followed link to that same page within the main page content?
-
10-15 dropdown links per tab is a lot to fit on the screen, but in my opinion the "too many links on the page" warning is a bit overdone. How many total links appear on your pages on average? Unless you're blowing way past the general rule of thumb of 100 links, you're OK.
E.g., if most of your pages have 100-120 links or fewer, don't worry about it. If most of your pages have 150+ links, definitely reassess how useful each link is to the user and consider cutting down.
Here's an in-depth answer by Dr. Pete: http://www.seomoz.org/blog/how-many-links-is-too-many
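If you want a quick way to check those totals, here's a rough sketch (a naive tag count over page source, not a real HTML parser) for approximating how many links a page carries:

```javascript
// Rough approximation: count <a> tags in a page's HTML source.
// Misses edge cases (comments, scripts), but good enough for a
// quick sanity check against the ~100-link rule of thumb.
function countLinks(html) {
  var matches = html.match(/<a[\s>]/gi);
  return matches ? matches.length : 0;
}

var sample =
  '<nav><a href="/a">A</a><a href="/b">B</a></nav>' +
  '<p>See <a href="/c">C</a> for details.</p>';
// countLinks(sample) → 3
```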