Too many on-page links
-
Hi
I know it was previously recommended to stick to under 100 links per page, but I've run a crawl and mine are now over this, at 130+.
How important is this now? I've read a few articles that say it's not as crucial as it used to be.
Thanks!
-
Hi Becky!
First, I would like to say it is great that you are being proactive in making sure your webpage doesn't have too many links on it! But, luckily for you, this is not something you need to worry about. 100 is a suggested number, not a threshold that will penalize you if you go over.
Google's Matt Cutts posted a video explaining why Google no longer has the 100-links-per-page Webmaster guideline, so be sure to check that out! It's commonly thought that having too many links will negatively impact your SEO results, but that hasn't been the case since 2008. However, Google has said that if a site looks spammy and has far too many links on a single page, it reserves the right to take action on the site. So, don't include links that could be seen as spammy and you should be fine.
Check out this Moz blog that discusses how many links is too many for more information!
-
Thank you for the advice, I'll take a look at the articles

Brilliant, the round table sounds great - I'll sign up for this
-
I honestly wouldn't worry, Becky. The page looks fine, the links look fine, and it is certainly not what you would call spammy.
Link crafting was a 'thing' a number of years ago, but today Google pretty much ignores this, as has been shown many times in testing.
You can, however, benefit from internal links, but that is a different discussion. Read this if you are interested.
If you are interested, there is a round-table discussion on eCommerce SEO hosted by SEMrush on Thursday that could be useful to you. Two others and I will be talking about a number of issues.
-Andy
-
Thanks for the advice, I've looked into this before.
We have menu links and product links as it's an ecommerce site, so I wouldn't be able to remove any of these.
I've struggled to find a way to reduce these links further on primary pages. For example, http://www.key.co.uk/en/key/aluminium-sack-truck has 130 links.
Any advice would be appreciated

-
There is confirmation from Google here that you should limit the links on a page to 3,000:
https://www.deepcrawl.com/knowledge/news/google-webmaster-hangout-notes-friday-8th-july-2016/
I would consider that to be a lot, though.
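If you want to sanity-check a page's raw link count yourself, a short script along these lines will do it (a minimal sketch: it assumes the requests and beautifulsoup4 packages are installed and simply counts anchor tags that carry an href attribute):

```python
# Rough count of the on-page links a crawler would see.
# Minimal sketch: counts every <a> tag with an href attribute.
import requests
from bs4 import BeautifulSoup

def count_links(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return len(soup.find_all("a", href=True))

print(count_links("http://www.key.co.uk/en/key/aluminium-sack-truck"))
```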

-Andy
-
Brilliant, thank you!
-
In the "old days" (yup, I go back that far), Google's search index crawler wasn't all that powerful. So it would ration itself on each page and simply quit trying to process all the content on the page after a certain number of links and certain character count. (That's also why it used to be VERY important that your content was close to the top of your page code, not buried at the bottom of the code).
The crawler has been beefed up to the point where this hasn't been a limiting factor per page for a long time, so it will traverse pretty much any links you feed it. But I +1 both Andy's and Mike's advice about considering the usability and link-power dilution of having extensive numbers of links on a page. (This is especially important for your site's primary pages, since one of their main jobs is to help flow their ranking authority down to important/valuable second-level pages.)
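To put a rough number on the dilution idea, here's a toy calculation using the classic PageRank model, where a page splits its rank evenly across its outbound links (a deliberate simplification; modern ranking is far more involved):

```python
# Toy illustration of link equity dilution: in the original PageRank
# model, a page passes damping * rank / N through each of its N links.
page_rank = 1.0
damping = 0.85  # damping factor from the original PageRank paper

for n_links in (10, 100, 130, 3000):
    per_link = damping * page_rank / n_links
    print(f"{n_links:>5} links -> {per_link:.5f} equity per link")
```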
Paul
-
Hi Becky,
Beyond the hypothetical limit, there is the consideration of dividing the page's link authority among a really large number of links, and therefore decreasing the relative value of each of those links to the pages they point to.
Depending on the page holding all these links, the user experience, the purpose of the linked-to pages, etc., this may or may not be a consideration, but it's worth thinking about.
Good luck!
- Mike
-
Hi Becky,
If the links are justified, don't worry. I have clients with 300-400 links and no problems with their positions in Google.
That doesn't mean it will be the same for everyone, though; each site is different, and sometimes you can have too many. Just think it through, and if you come to the conclusion that most of the links aren't needed and are just stuffing in keywords, then look to make changes.
But on the whole, it doesn't sound like an issue to me - there are no hard and fast rules around this.
-Andy
Related Questions
-
"Avoid Too Many Internal Links" when you have a mega menu
Using the on-page grader and further investigating internal linking, I'm concerned that, as the ecommerce website has a very link-heavy mega menu, the rule of 100 may be impeding the contextual links we're creating. Clearly we don't want to nofollow our entire menu. Should we consider noindexing the third level? For example, short sleeve shirts here: Clothing > Shirts > Short Sleeve Shirts. What about other pages we don't care to index anyway, such as the login page, the cart, and the search button? Any thoughts appreciated.
Intermediate & Advanced SEO | Ant-Scarborough
-
Substantial difference between Number of Indexed Pages and Sitemap Pages
Hey there, I am doing a website audit at the moment. I've noticed substantial differences between the number of pages indexed (Search Console), the number of pages in the sitemap, and the number I get when I crawl the site with Screaming Frog (see below). Would those discrepancies concern you? The website and its rankings seem fine otherwise.
Total indexed: 2,360 (Search Console)
About 2,920 results (Google search "site:example.com")
Sitemap: 1,229 URLs
Screaming Frog Spider: 1,352 URLs
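A quick way to recount the sitemap side of a comparison like this (a minimal sketch; it assumes one standard XML sitemap rather than a sitemap index file, and example.com stands in for the real domain):

```python
# Count <loc> entries in a standard XML sitemap.
# Sketch only: sitemap index files and gzipped sitemaps need extra handling.
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_url_count(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    return len(root.findall("sm:url/sm:loc", NS))

print(sitemap_url_count("https://example.com/sitemap.xml"))
```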
Cheers,
Jochen
Intermediate & Advanced SEO | Online-Marketing-Guy
-
Does Disavowing Links Negate Anchor Text, or Just Negate Link Juice?
I'm not so sure that disavowing links also discounts the anchor text from those links, because nofollow links absolutely still pass anchor text values, and disavowing links is supposed to be akin to nofollowing them. I wonder because there's a potential client I'm working on an RFP for: they have tons of spammy directory links, all using keyword-rich anchor text, and they lost 98% of their traffic in Penguin 1.0 and haven't recovered. I want to know what I'm getting into. And if I just disavow those links, I'm thinking it won't help the anchor text ratio issues. Can anyone confirm?
Intermediate & Advanced SEO | MiguelSalcido
-
Is it a problem to use a 301 redirect to a 404 error page, instead of serving a 404 page directly?
We are building URLs dynamically with Apache rewrite. When we detect that a URL matches some valid pattern, we serve a script, which may then detect that the combination of parameters in the URL does not exist. If this happens, we produce a 301 redirect to another URL which serves a 404 error page. So my doubt is the following: do I have to worry about not serving the 404 directly, but redirecting (301) to a 404 page instead? Will this lead to the erroneous original URL staying in the Google index longer than if I served the 404 directly? Some context: it is a site with about 200,000 web pages, and we currently have 90,000 404 errors reported in Webmaster Tools (even though only 600 were detected last month).
Intermediate & Advanced SEO | lcourse
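A minimal sketch of serving the 404 directly, as asked above, instead of 301-redirecting to an error page (Flask used here as a hypothetical stand-in for the Apache-rewrite-plus-script setup, with made-up parameter names):

```python
# Serve the 404 on the requested URL itself -- no 301 hop -- so the
# invalid URL reports its true status to crawlers immediately.
from flask import Flask, abort

app = Flask(__name__)

VALID_COMBOS = {"red-large", "blue-small"}  # hypothetical parameter lookup

@app.route("/items/<combo>")
def item(combo):
    if combo not in VALID_COMBOS:
        abort(404)  # direct 404, rather than a redirect to an error page
    return f"Item page for {combo}"
```
-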
Do search engines crawl links on 404 pages?
I'm currently in the process of redesigning my site's 404 page. I know there are all sorts of best practices from a UX standpoint, but what about search engines? Since these pages are roadblocks in the crawl process, I was wondering if there's a way to help the search engine continue its crawl. Does putting links to "recent posts" or something along those lines allow the bot to continue on its way, or does the crawl stop at that point because the 404 HTTP status code is returned in the header response?
Intermediate & Advanced SEO | brad-causes
-
Link Research Tools - Detox Links
Hi, I was doing a little research on my link profile and came across a tool called LinkResearchTools.com. I bought a subscription and tried it out. Running the report, it advised a low overall risk but identified 78 "Very High Risk" to "Deadly" (are they venomous?) links, around 5% of the total, and advised removing them. It also flagged many suspicious and low-risk links, but these seem to be flagged because it has no knowledge of them, so it defaults to a negative. So before I do anything rash and start removing my Deadly links, I was wondering if anyone a) has used the tool and would recommend it, b) would recommend detoxing/removing the deadly links, or c) knows of any cases in which removing so-called Deadly links causes more problems than it solves, such as maintaining a normal-looking profile, since everyone is likely to have some bad links (although my thinking may be out on that one...). What do you think? Adam
Intermediate & Advanced SEO | NaescentAdam
-
Links from new sites with no link juice
Hi Guys, Do backlinks from a bunch of new sites pass any value to our site? I've heard a lot of "SEO experts" say that building a bunch of new sites and linking them to your main site is an effective link building strategy. I highly doubt that... To me, a new site is a new site, which means it won't have any backlinks in the beginning (most likely), so a backlink from it won't pass much link juice. Right? In my humble opinion this is no longer a good strategy, if you build new sites just for the sake of getting links. This is just wrong. But if you do have some unique content and want to share it with others on that particular topic, then you can definitely create a blog, write content, and start getting links. And over time, as the domain authority increases, a backlink from that site will become more valuable? I am not an SEO expert myself, so I am eager to hear your thoughts. Thanks.
Intermediate & Advanced SEO | witmartmarketing
-
301 - should I redirect entire domain or page for page?
Hi, We recently enabled a 301 on our domain from our old website to our new website. On the advice of fellow Mozzers, we copied the old site exactly to the new domain, then did the 301 so that the sites are identical. The question is: should we be doing the 301 as a whole-domain redirect, i.e. www.oldsite.com is now > www.newsite.com, or individually setting each page, i.e. www.oldsite.com/page1 is now www.newsite.com/page1, etc., for each page on our site? Remember that both old and new sites (for now) are identical copies. Also, we set the 301 about 5 days ago and have verified it's working, but haven't seen a single change in rank for either the old site or the new one. Is this because Google likely hasn't re-indexed yet? Thanks, Anthony
Intermediate & Advanced SEO | Grenadi