Too many links on page -- how to fix
-
We are getting reports that there are too many links on most of the pages in one of the sites we manage.
Not just a few too many: 275 links, versus the target of fewer than 100.
The entire site is built with a very heavy global navigation, which contains a lot of links -- so while the users don't see all of that, Google does.
Short of re-architecting the site, can you suggest ways to provide site navigation that don't violate this rule?
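As a quick sanity check on that 275 figure, a rough script like the one below can count the anchor tags in a page's raw HTML, which is roughly what the crawler sees. This is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages installed; the URL is just a placeholder.

```python
# Count the <a href> links in a page's raw HTML -- roughly what a crawler sees.
# Assumes: pip install requests beautifulsoup4. The URL below is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "http://www.example.com/some-page/"  # replace with the page you want to audit
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

links = [a["href"] for a in soup.find_all("a", href=True)]

print(f"Total <a href> links: {len(links)}")
print(f"Unique link targets:  {len(set(links))}")
```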
-
Dr. Pete has a good post about this warning at http://www.seomoz.org/blog/how-many-links-is-too-many that may help you out.
-
How much damage does this issue really do?
I see many sites in our industry (enterprise software) whose pages are equally link-heavy, and they still seem to rank well. Look at Microsoft, IBM, Oracle, Red Hat...
Just wondering how this compares to other issues that would be easier to address.
-
"I don't know what you mean that users don't see the links and Google does."
I mean that most of these links are in drop-down menus, so they aren't visible unless someone hovers over the tab. My point was that the user experience isn't affected by the presence of all of these links on the page, even though it is a problem from an SEO perspective.
The page in question is: http://www.novell.com/products/groupwise/ -- but since we use the same global navigation across the entire site, every page on the site is getting the same SEOMoz warning about too many links.
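To see how much of that count the global navigation is responsible for, a breakdown along these lines can help. Again, this is only a sketch: the "nav, #globalnav" selectors are assumptions and would need to be adjusted to match the site's actual markup.

```python
# Split a page's links into "inside the global nav" vs. "everywhere else".
# Assumes: pip install requests beautifulsoup4. The CSS selectors used for the
# navigation ("nav, #globalnav") are guesses and must match the real markup.
import requests
from bs4 import BeautifulSoup

url = "http://www.novell.com/products/groupwise/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

nav_hrefs = set()
for nav in soup.select("nav, #globalnav"):  # hypothetical selectors
    nav_hrefs.update(a["href"] for a in nav.find_all("a", href=True))

all_hrefs = [a["href"] for a in soup.find_all("a", href=True)]
other_hrefs = [href for href in all_hrefs if href not in nav_hrefs]

print(f"Link targets inside the navigation: {len(nav_hrefs)}")
print(f"Links elsewhere on the page:        {len(other_hrefs)}")
```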
-
I do think a navigation restructure is the way to go, and a way to better focus your link juice. Nofollowing certain links no longer works for internal site links -- Google sees through that now.
Instead of having your category navigation include one or more sub-category links on every page (I'm guessing that's how it currently works), put the sub-category links on the main category pages rather than in the main category nav. For example, don't include red widget, blue widget, and green widget under the main "widget" navigation; instead, add the links to red, blue, and green widget on the main widget page. Hope that makes sense.
-
You should bucket the categories into 10-20 primary categories. If you have lower-level categories that are potential traffic drivers, you can feature a few select ones alongside the primary categories. Every link passes some PageRank away from the host page, and too many links on every page will devalue those pages. It's hard to give clear directions without seeing the site.
I don't know what you mean that users don't see the links and Google does. Are you hiding links? That may penalize the site.