Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
If I nofollow outbound external links to minimize link juice loss, is it a good or bad thing?
-
OK, imagine you have a blog, and you want to make each blog post authoritative, so you link out to relevant authority websites for reference. In this case there are two external links per blog post: one to an authority website for reference and one to Flickr for photo credit. There is also one internal link to another part of the website, like the buy-now page or a related internal blog post.
Now tell me if this is a good or bad idea: what if you nofollow the external links and leave the internal link untouched, so all internal links are dofollow? The thinking is that this minimizes the loss of link juice through external links and keeps it flowing through internal links to pages within the website.
Would it be better to lay off the nofollow tag and leave everything as dofollow? Or would this be a good way to link out to authority sites while keeping the link juice internal?
Your thoughts are welcome. Thanks.
-
Just a little more info from Google here as well on how PageRank sculpting no longer works...
http://www.thesempost.com/google-pagerank-sculpting-still-doesnt-work/
-
I'm with inbound.org, and second what ThompsonPaul says. This email was about not indexing profiles that are incomplete and have thin content, and doesn't have anything to do with outbound links.
My take on links I make out from my own website:
- Nofollow affiliate links
- Nofollow links I don't trust -- though I generally won't link to things I don't trust at all, or I'll break the URL (add a space or otherwise make it not a link)
- Leave almost every link followed. It's my site, and I'm going to link out to sites I trust. If I have comments, those will be nofollowed, since I'm not the author and I'm not endorsing where the comments link.
Good info from Matt Cutts here about how nofollow hasn't been used to 'conserve' link equity in some time. https://www.mattcutts.com/blog/pagerank-sculpting/
-
thank you good sir.
-
As I mention in my other comment, Sandi, no-following links doesn't preserve "SEO juice" at all. That hasn't been the case in many years.
And what Inbound is doing is completely different. They are No-Indexing entire pages that had so little content on them that they had no value, were wasting the site's crawl budget and looked like thin/duplicate content to the search engines. Nothing to do with the links on them at all. (This is actually a best practice for any site, but especially directory-type sites.)
P.
-
No-following links has ABSOLUTELY ZERO EFFECT on preserving the "link juice" of a page, Rich. That did work for a while after nofollow was first introduced, but the tactic was abused so badly that the search engines changed this behaviour years ago. (It used to be referred to as PageRank sculpting.)
Further to Andy's and Dmytro's comments - Google is clear there are only three circumstances when no-follow should be used:
- you have a commercial relationship with the page you're linking to (paid links, but also many guest post scenarios, for example)
- you didn't create the link and therefore can't trust it (e.g. user comments or other user-generated content)
- you are linking to an unreliable site (to point out a bad example, for instance)
- (and a bonus fourth) links to administrative-type pages that wouldn't be of any use to a search visitor, like a privacy policy, terms of service, or login page
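Illustrative markup for those cases (the URLs here are hypothetical placeholders):

```html
<!-- Commercial relationship: paid or sponsored link -->
<a href="https://sponsor.example.com/" rel="nofollow">Our sponsor</a>

<!-- User-generated content you didn't create (e.g. a comment) -->
<a href="https://commenter.example.net/" rel="nofollow">commenter's site</a>

<!-- A link you explicitly don't vouch for -->
<a href="https://spammy.example.org/" rel="nofollow">a bad example</a>
```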
There's also been considerable discussion that Google in particular considers no-following of all external links a sign of unnatural manipulation that could damage page authority.
So... conceptually a good idea at one time, but no longer valid and potentially harmful.
Hope that helps?
Paul
-
Thanks Andy!
-
Hi Rich,
Don't nofollow for the sake of it. If a link is paid for, then yes, you should nofollow it, but that is probably one of the very few occasions where I would suggest you do it.
Perhaps if you had written a blog post and were then asked to inject a link into it, I would be tempted to nofollow that too, but I wouldn't do it to try to retain link juice - that isn't really a tactic these days.
Google wants to see you link to sites externally, as long as it is called for - this will help show your authority as well.
-Andy
-
Hi,
I don't necessarily agree that too many outbound links can harm your own SEO. In fact, Matt Cutts has tons of outbound links on his blog, so as long as links are relevant from a user perspective, there shouldn't be any issues.
Back to follow/nofollow: if you are linking out to trusted and relevant sources, I don't see any reason to nofollow the links. On the other hand, if you have user-generated content, I would nofollow external links, because you won't always know where they are linking.
Hope this helps!
Dmytro
Related Questions
-
Disallow: /jobs/? - is this stopping search engines from indexing job posts?
Hi,
I was wondering what this would be used for, as it's in the robots.txt of a recruitment agency website that posts jobs. Should it be removed? Disallow: /jobs/?
Disallow: /jobs/page/*/ Thanks in advance.
James
Intermediate & Advanced SEO | Sep 14, 2018, 4:57 AM | JamesHancocks1
-
What does Disallow: /french-wines/?* actually do - robots.txt
Hello Mozzers - Just wondering what this robots.txt instruction means: Disallow: /french-wines/?* Does it stop Googlebot crawling and indexing URLs in that "French Wines" folder - specifically the URLs that include a question mark? Would it stop the crawling of deeper folders - e.g. /french-wines/rhone-region/ that include a question mark in their URL? I think this has been done to block URLs containing query strings. Thanks, Luke
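One way to sanity-check patterns like `Disallow: /french-wines/?*` and `Disallow: /jobs/?` is to replicate Google-style robots.txt matching: `*` matches any run of characters, a trailing `$` anchors the end, and everything else (including `?`) is matched literally as a prefix. A minimal sketch of that rule (note that Python's stdlib `urllib.robotparser` follows the original 1994 spec and ignores wildcards, so it isn't a reliable check here):

```python
import re

def blocked(path: str, pattern: str) -> bool:
    """Google-style robots.txt matching: '*' is a wildcard, a trailing
    '$' anchors the end, everything else (including '?') is a literal
    prefix match against the URL path + query."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None
```

Under this rule, `Disallow: /french-wines/?*` only blocks URLs where `?` comes immediately after `/french-wines/` - it does not block deeper folders like `/french-wines/rhone-region/?page=2`, because the literal `?` must appear right at that position. So yes, it looks like an (imperfect) attempt to block query-string URLs.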
Intermediate & Advanced SEO | Mar 21, 2017, 10:39 AM | McTaggart
-
Link juice through URL parameters
Hi guys, hope you had a fantastic bank holiday weekend. Quick question re URL parameters, I understand that links which pass through an affiliate URL parameter aren't taken into consideration when passing link juice through one site to another. However, when a link contains a tracking URL parameter (let's say gclid=), does link juice get passed through? We have a number of external links pointing to our main site, however, they are linking directly to a unique tracking parameter. I'm just curious to know about this. Thanks, Brett
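A common mitigation when inbound links carry tracking parameters is to canonicalize the tracked URL back to the clean one (via a `rel="canonical"` tag or a redirect) so any equity consolidates on a single address. A sketch of the URL normalization step, assuming a hypothetical list of tracking parameter names:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical set of tracking parameters to strip; adjust per site.
TRACKING_PARAMS = {"gclid", "utm_source", "utm_medium", "utm_campaign"}

def canonical(url: str) -> str:
    """Return the URL with tracking parameters removed, so all tracked
    variants of a page resolve to one canonical address."""
    p = urlparse(url)
    q = [(k, v) for k, v in parse_qsl(p.query) if k not in TRACKING_PARAMS]
    return urlunparse(p._replace(query=urlencode(q)))
```

For example, `canonical("https://example.com/page?gclid=abc123&x=1")` yields `https://example.com/page?x=1`.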
Intermediate & Advanced SEO | Aug 30, 2016, 7:25 AM | Brett-S
-
Does Disavowing Links Negate Anchor Text, or Just Negates Link Juice
I'm not so sure that disavowing links also discounts the anchor text from those links, because nofollowed links absolutely still pass anchor text value, and disavowing links is supposed to be akin to nofollowing them. I wonder because there's a potential client I'm working on an RFP for: they have tons of spammy directory links all using keyword-rich anchor text, they lost 98% of their traffic in Penguin 1.0, and they haven't recovered. I want to know what I'm getting into. And if I just disavow those links, I'm thinking that it won't help the anchor text ratio issues. Can anyone confirm?
Intermediate & Advanced SEO | Jan 26, 2015, 6:01 PM | MiguelSalcido
-
Can a large fluctuation of links cause traffic loss?
I've been asked to look at a site that has lost 70-80% of their search traffic. This happened suddenly around the 17th April. Traffic dropped off over a couple of days and then flat-lined over the next couple of weeks. The screenshot attached shows the impressions/clicks reported in GWT.
When I investigated I found: there had been no changes/updates to the site in question; there were no messages in GWT indicating a manual penalty; the number of pages indexed shows no significant change; there are no particular trends in keywords/queries affected (they all were).
I did discover that ahrefs.com showed that a large number of links were reported lost on the 17th April (17k links from 1 domain). These links reappeared around the 26th/27th April, but traffic shows no sign of any recovery. The links in question were from a single development server (that shouldn't have been indexed in the first place, but that's another matter).
Is it possible that these links were, maybe artificially, boosting the authority of the affected site? Has the sudden fluctuation in such a large number of links caused the site to trip an algorithmic penalty (Penguin)?
Without going into too much detail, as I'm bound by client confidentiality: the affected site is really a large database, and the links pointing to it are generated by a half dozen or so article-based sister sites, based on how the articles are tagged. The links point to dynamically generated content based on the URL. The site does provide a useful/valuable service/purpose - it's not trying to "game the system" in order to rank. That doesn't mean to say that it hasn't been performing better in search than it should have been. This means that the affected site has ~900,000 links pointing to it that are the names of different "entities".
Any thoughts/insights would be appreciated. I've expressed a pessimistic outlook to the client, but as you can imagine they are confused and concerned.
Intermediate & Advanced SEO | Jun 4, 2014, 3:44 AM | DougRoberts
-
Do 404 pages pass link juice? And best practices...
Last year Google said bad links to 404 pages wouldn't hurt your site. Could that still be the case in light of recent Google updates to try and combat spammy links and negative SEO? Can links to 404 pages benefit a website and pass link juice? I'd assume at the very least that any link juice will pass through links FROM the 404 page. Many websites have great 404 pages that get linked to: http://www.opensiteexplorer.org/links?site=http%3A%2F%2Fretardzone.com%2F404 - that was the first of four I checked from the "60 Really Cool...404 Pages" that actually returned the 404 HTTP status! So apologies if you find the word 'retard' offensive. According to Open Site Explorer it has a decent Page Authority and number of backlinks - but it doesn't show in Google's SERPs. I'd never do it, but if you have a particularly well-linked-to 404 page, is there an argument for giving it a 200 OK status?
Finally, what are the best practices regarding 404s and address bar URLs? For example, if www.examplesite.com/3rwdfs returns a 404 error, should I make that redirect to www.examplesite.com/404 or leave it as is? Redirecting to www.examplesite.com/404 might not be user-friendly, as people won't be able to correct the URL in the address bar. But if I have a great 404 page that people link to, I don't want links going to loads of random pages, do I? Is either way considered best practice? If I did a 301 redirect I guess it would send the wrong signal to the crawlers? Should I use a 302 redirect, or even a 304 Not Modified response?
Intermediate & Advanced SEO | Jan 25, 2013, 10:15 AM | Alex-Harford
-
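On the best-practice point raised here: the usual advice is to serve the custom 404 content *at the requested URL* with a true 404 status, rather than redirecting to a /404 page (a 301/302 returns a 3xx status, not a 404, and hides the mistyped URL from the visitor). A toy sketch of the distinction, where `KNOWN_PATHS` is a hypothetical site map:

```python
KNOWN_PATHS = {"/", "/blog", "/buy-now"}  # hypothetical site map

def respond(path: str):
    """Return (status, body) for a request. Unknown paths get the
    custom 404 page served in place: a true 404 status at the
    requested URL, with no redirect to /404."""
    if path in KNOWN_PATHS:
        return 200, f"content for {path}"
    return 404, "custom 404 page"
```

So `respond("/3rwdfs")` yields the custom 404 body with a 404 status, and the visitor can still correct the URL in the address bar.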
Max # of Products / Links per Page on E-Commerce Site
We are getting ready to re-launch our e-commerce site and are trying to decide how many products to list per category page. Some of our category pages have upwards of 100 products. While I'd love to list ALL the products on the root category page (to reduce hassle for customers, and to index more products on a higher-PR page), I'm a little worried about the page being too long and containing too many on-page links. Would love some guidance on: the maximum number of internal links on a page; whether Google frowns on really long category pages; anything else I should be considering when making this decision. Thanks for your input!
Intermediate & Advanced SEO | Feb 25, 2012, 3:28 AM | AndrewY
-
Finding broken links / resources by topic
Hi fellow mozzers! In an effort to ensure we're exploring every avenue when launching our new website, I was hoping to find some useful broken links / resources that we could incorporate into our link building. We have used the standard tools for this (W3C, Xenu etc), but they all seem to have the same issue: they reveal all the missing links on a site (although some don't actually tell you the page they are on), and you still have to sort through them to see if the link/resource is related to your theme. When you're on a niche site this obviously isn't an issue, but on a site like Mashable (to use the example given in a recent SEOmoz blog) it could result in wading through hundreds of links to find one relevant one right at the end. Is there a tool that allows you to specify what theme of links you are looking for from a site, or better yet, one that allows you to check multiple sites for multiple missing themed links in one go? Or is the best way to export the list and just search the document for certain keywords?
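Absent a tool that filters by theme, the export-and-search approach mentioned at the end can be scripted: export the crawler's results and keep only broken links whose anchor text or URL matches your theme keywords. A sketch over hypothetical exported `(url, anchor_text, status)` tuples:

```python
def themed_broken(links, keywords):
    """links: (url, anchor_text, http_status) tuples exported from a
    crawler (Xenu, etc.). Keep broken links (4xx/5xx) whose anchor
    text or URL mentions any theme keyword."""
    kws = [k.lower() for k in keywords]
    return [(url, text, status) for url, text, status in links
            if status >= 400
            and any(k in text.lower() or k in url.lower() for k in kws)]
```

Running the same exported list against several keyword sets also covers the "multiple themes in one go" case.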
Intermediate & Advanced SEO | Feb 6, 2012, 3:10 PM | themegroup