Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Too many on-page links
-
Hi
I know it was previously recommended to stick to under 100 links on a page, but I've run a crawl and some of my pages are now over this, at 130+.
How important is this now? I've read a few articles to say it's not as crucial as before.
Thanks!
-
Hi Becky!
First, I would like to say it's great that you're being proactive in making sure your webpage doesn't have too many links on it! But, luckily for you, this is not something you need to worry about. 100 is a suggested number, not a threshold that will get you penalized if you go over.
Google's Matt Cutts posted a video explaining why Google no longer has that 100-links-per-page Webmaster guideline, so be sure to check that out! It's commonly thought that having too many links will negatively impact your SEO results, but that hasn't been the case since 2008. However, Google has said that if a site looks spammy and has far too many links on a single page, it reserves the right to take action on the site. So, as long as you don't include links that could be seen as spammy, you should be fine.
Check out this Moz blog that discusses how many links is too many for more information!
-
Thank you for the advice, I'll take a look at the articles
Brilliant, the round table sounds great - I'll sign up for this
-
I honestly wouldn't worry, Becky. The page looks fine, the links look fine, and it is certainly not what you would call spammy.
Link crafting was a 'thing' a number of years ago, but today Google pretty much ignores this, as has been shown many times in testing.
That said, you can benefit from internal links, but that is a different discussion. Read this if you are interested.
If you are interested, there is a round-table discussion on eCommerce SEO hosted by SEMrush on Thursday that could be useful to you. Two others and I will be speaking on a number of issues.
-Andy
-
Thanks for the advice, I've looked into this before.
We have menu links and product links as it's an ecommerce site, so I wouldn't be able to remove any of these.
I've struggled to find a way to reduce these links further on primary pages. For example, http://www.key.co.uk/en/key/aluminium-sack-truck has 130 links.
Any advice would be appreciated
-
Confirmation from Google here on limiting the links on a page to 3,000:
https://www.deepcrawl.com/knowledge/news/google-webmaster-hangout-notes-friday-8th-july-2016/
I would consider that to be a lot though
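If you ever want to sanity-check the number a crawler reports, a quick Python sketch (the sample HTML below is made up) counts the same anchor tags a crawler would see:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collect the href of every <a> tag, i.e. the links a crawler follows."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:  # skip named anchors with no destination
                self.links.append(href)

counter = LinkCounter()
counter.feed(
    '<nav><a href="/home">Home</a><a href="/trucks">Sack Trucks</a></nav>'
    '<a name="top">named anchor, not a link</a>'
    '<a href="/product/aluminium-sack-truck">Product</a>'
)
print(len(counter.links))  # 3
```

Feed it the saved HTML of a page and you get the raw link count without waiting for a full crawl.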
-Andy
-
Brilliant thank you!
-
In the "old days" (yup, I go back that far), Google's search index crawler wasn't all that powerful. So it would ration itself on each page and simply quit processing the page's content after a certain number of links and a certain character count. (That's also why it used to be VERY important that your content was close to the top of your page code, not buried at the bottom.)
The crawler has been beefed up to the point where this hasn't been a limiting factor per page for a long time, so it will traverse pretty much any links you feed it. But I +1 both Andy's and Mike's advice about considering the usability and link-power dilution of having extensive numbers of links on a page. (This is especially important for your site's primary pages, since one of their main jobs is to help flow their ranking authority down to important/valuable second-level pages.)
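To make that "authority flowing down" idea concrete, here is a toy sketch using the classic simplified PageRank recurrence. The three-page graph and the 0.85 damping factor are just illustrative; real ranking uses far more signals than this:

```python
# A home page linking to two second-level category pages, which link back.
damping = 0.85
links = {"home": ["cat-a", "cat-b"], "cat-a": ["home"], "cat-b": ["home"]}
rank = {page: 1.0 / len(links) for page in links}  # start with equal shares

for _ in range(50):  # iterate until the scores settle
    new_rank = {page: (1 - damping) / len(links) for page in links}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)  # each outlink gets an equal slice
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

print({page: round(score, 3) for page, score in rank.items()})
# the home page accumulates the largest share; the two category
# pages split what it passes down
```

Add more outlinks to "home" and each second-level page's slice shrinks, which is the dilution argument in miniature.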
Paul
-
Hi Becky,
Beyond the hypothetical limit, there's the consideration of dividing the page's link authority across a really large number of links, thereby decreasing the relative value of each of those links to the pages they point to.
Depending on the page holding all these links, the user experience, the purpose of the linked-to pages, etc., this may or may not be a concern, but it's worth thinking about.
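To put rough numbers on that dilution, assume the simplified model where each link passes an equal slice of the page's authority (the figures below are purely illustrative, not real Google math):

```python
def equity_per_link(page_authority: float, link_count: int, damping: float = 0.85) -> float:
    """Per-link share under the simplified 'equal split' PageRank model."""
    return page_authority * damping / link_count

hundred = equity_per_link(1.0, 100)
three_hundred = equity_per_link(1.0, 300)
print(round(hundred, 4))        # 0.0085
print(round(three_hundred, 4))  # 0.0028, a third of the 100-link share
```

Tripling the link count cuts each link's share to a third, which is why primary pages with huge menus pass less to any single destination.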
Good luck!
- Mike
-
Hi Becky,
If the links are justified, don't worry. I have clients with 300-400 and no problems with their positions in Google.
That doesn't mean it will be the same for everyone though: each site is different, and sometimes you can have too many. Think it through, and if you conclude that most of the links aren't needed and are just stuffing in keywords, then look to make changes.
But on the whole, it doesn't sound like an issue to me - there are no hard and fast rules around this.
-Andy