Is having too many outgoing external links bad?
-
I'm currently writing articles for my site Eugene Computer Geeks and remember reading somewhere that having more than 100 outgoing links is a bad idea.
I plan on writing lots of guides, and most of them will have several relevant links.
Some examples: in a virus removal/prevention guide, I want to link to the different antivirus programs I'm recommending; in the "Free WiFi in Eugene, OR" guide, I plan on linking to the websites of all the businesses that offer free WiFi.
**Would having too many outgoing links hurt my rankings in any way?**
If so, should I use the "nofollow" tag to prevent any harm? I always thought that having lots of relevant outgoing links was a good thing, but lately I have been reading otherwise. What are your opinions here at SEOmoz?
-
This post from Dr. Pete also explains the 100 link guideline, its origin, and how to think about it.
-
Thanks so much. You couldn't have answered my question better.
-
As long as you're linking to quality sites, those links will not directly hurt your rankings. If they are good for users and don't make your site look too spammy, then use them.
The general best practice is to have no more than 100 links on a page (followed or nofollowed). The main reason is that each additional link dilutes the amount of PageRank that flows through every link on the page. This matters for controlling your site's internal linking structure, so you can concentrate PageRank where it counts and help Google understand which pages are most important. You can get an idea of how PageRank flows through links with this tool: http://www.ecreativeim.com/pagerank-link-juice-calculator.php
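To see why each additional link dilutes the PageRank passed along, here is a minimal sketch of the classic PageRank model, where a page divides its damped PageRank evenly among its outbound links. The page's PageRank value and the damping factor of 0.85 are illustrative assumptions, not numbers from any real site.

```python
def pagerank_passed_per_link(page_pr: float, num_links: int, d: float = 0.85) -> float:
    """PageRank each outbound link receives from a page, under the
    simple model: damped PageRank split evenly across all links."""
    if num_links == 0:
        return 0.0
    return d * page_pr / num_links

# The more links on a page, the less each individual link passes along:
for n in (10, 50, 100, 200):
    print(n, round(pagerank_passed_per_link(1.0, n), 5))
```

With a hypothetical page PageRank of 1.0, each of 10 links passes 0.085, while each of 200 links passes only 0.00425 -- the total passed stays the same, but every individual link (including your internal ones) gets a smaller share.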
But if you really need more than 100 links, then use more than 100. Ultimately, whatever gives the best experience for your users is going to help your site the most -- and while PageRank is still a ranking factor, it's not a top one.