"Too many on-page links" phantom penalty? What about big sites?
-
So I am consistently over the recommended "100 links" rule on our site's pages because of our extensive navigation and plentiful footer links (somewhere around 300 links per page). I know there is no official penalty for this, but rather that it dilutes the "link juice" passed by each link. I guess my question is more about how places like Zappos and Amazon get away with this. They have WAY over 100 links per page... in fact, I think the Zappos footer alone is 100+ links. This overage doesn't seem to affect their domain rankings and authority, so why does SEOmoz place so much emphasis on this error?
-
I can totally agree with that statement. Perhaps I misspoke. I'm not asking them to set my guidelines, but rather just noting that their error reporting draws a firm line at 100, as opposed to keeping that line fuzzy like it really is.
-
I cannot speak for SEOmoz, but I personally think one of the reasons they don't tell you what a good range is comes down to your site structure as well as your authority. Telling people how many links they can have based on domain authority alone, knowing nothing of their site structure, could lead to bad practices.
Like Marcus says above:
"Now, only you can determine what a reasonable amount is for your site. If all pages have tons of authority, then great, but if not, you may want to rethink your navigation. "
-
Thanks for the reply. I guess then my question is... why doesn't SEOmoz give you a range of links appropriate for your domain authority? For example, if my domain authority is 35, what range of links would be appropriate? If it's 65, likewise, what number of links would be considered permissible? The reason I say range is because I know it's not a hard-and-fast rule. It's just hard to see thousands of errors glaring at you every day when in fact they may not be affecting your domain authority like they say.
-
This was a rule from back in the day, when Google's advice was not to have more than 100 links per page. The thing is, while this exact rule no longer applies, there are still good reasons not to link to every page from every page.
If you have this kind of huge navigation:
- Is it good for users? Will they really navigate through 300-odd links?
- Do you really want to evenly distribute PageRank across all pages on the site and indicate that each page is equally important?
Also, if not all of those links are getting crawled, PageRank distribution can get messed up, with certain pages receiving nothing due to the behemoth navigation.
It's important to note, though, that this has never been a penalty issue; it is really just a bad internal SEO (and possibly usability) practice.
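If you want a quick sanity check on how many links a page actually carries before worrying about the guideline, here is a minimal sketch using only Python's standard library (counting `<a>` tags that have an `href`; a real audit tool like Screaming Frog will be more thorough):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

counter = LinkCounter()
counter.feed('<nav><a href="/a">A</a><a href="/b">B</a></nav><a name="x">no href</a>')
print(counter.count)  # 2 (the anchor without href is not counted)
```

Run this over your rendered page HTML to see whether you are really at 300, or whether some of those are anchors, buttons, or nofollowed items.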
This is an interesting read:
http://www.seomoz.org/blog/how-many-links-is-too-many
Also, you can see Google still recommends that you 'keep the links on a given page to a reasonable amount':
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769#1
Now, only you can determine what a reasonable amount is for your site. If all pages have tons of authority, then great, but if not, you may want to rethink your navigation.
Hope this helps
Marcus -
The 100 links thing is more of a general recommendation than a "rule", and the answer can differ from site to site. Back in the day, this was a limit of roughly 100 links crawled per page, because Google capped how much of a page they would crawl in order to save bandwidth.
In current times, this is more about how much domain authority your site has. If you have a lot of links, they all draw on your PageRank, and each one passes less "juice" as the amount that flows out keeps getting divided.
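As a toy illustration of that dilution, here is a sketch of the simplified original PageRank model (not how Google actually computes things today): whatever a page has to pass on is split across its outbound links, so each of N links carries roughly a 1/N share.

```python
def juice_per_link(page_rank: float, num_links: int, damping: float = 0.85) -> float:
    """Simplified PageRank share passed through each link on a page.

    The damping factor (classically 0.85) is the fraction of a page's
    rank that flows out through its links at all.
    """
    return damping * page_rank / num_links

# The same page authority spread over 100 vs. 300 links:
print(juice_per_link(1.0, 100))  # 0.0085
print(juice_per_link(1.0, 300))  # ~0.00283, a third of the per-link value
```

The total passed on is the same either way; tripling the link count just cuts each individual link's share to a third, which is why a 300-link navigation on a low-authority site spreads things very thin.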
Sites like Amazon and Zappos have a lot of authority and PageRank, so this is not as big an issue for them.