Do 404s really 'lose' link juice?
-
It doesn't make sense to me that a 404 causes a loss in link juice, although that is what I've read. What if you have a page that is legitimate -- think of a merchant-oriented page where you sell an item for a given merchant -- and then the merchant closes its doors? It makes little sense five years later to still have their merchant page, so why would removing it from your site in any way hurt your site? I could redirect forever, but that makes little sense.
What makes sense to me is keeping the page for a while with an explanation and options for 'similar' products, and then eventually returning a 404. I would think the page eventually dropping out of the index actually REDUCES the overall link juice (i.e., fewer pages), so there is no harm in using a 404 in this way. It is also a way to keep the site from just getting bigger and bigger and delivering more and more 'bad' user experiences over time.
Am I looking at it wrong?
PS: I've included this in 'link building' because it is related in a sense -- link 'paring'.
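To make the lifecycle I'm describing concrete, here is a minimal sketch of how a merchant page handler could behave over time. It assumes a Flask app and a made-up merchant record with a `closed_on` date -- purely an illustration of the idea, not how my site is actually built.

```python
from datetime import date, timedelta
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical merchant data; in reality this would come from a database.
MERCHANTS = {
    "acme-widgets": {"name": "Acme Widgets", "closed_on": None},
    "joes-gadgets": {"name": "Joe's Gadgets", "closed_on": date(2020, 3, 1)},
}

GRACE_PERIOD = timedelta(days=365)  # keep a "this merchant has closed" page for a year

@app.route("/merchants/<slug>")
def merchant_page(slug):
    merchant = MERCHANTS.get(slug)
    if merchant is None:
        abort(404)  # never existed: plain not-found

    if merchant["closed_on"] is None:
        return f"Normal product page for {merchant['name']}"

    if date.today() - merchant["closed_on"] < GRACE_PERIOD:
        # Recently closed: explain, and point visitors at similar products.
        return f"{merchant['name']} has closed. Here are some similar products..."

    # Long gone: tell crawlers the page is permanently gone.
    abort(410)
```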
-
Thanks Amelia!
-
I may be being pedantic here, but I think the correct status code should be 410, not 404, if the page is gone for good and you don't have a relevant place to redirect traffic to, as in the scenario described.
I believe that if Google finds a 410 page it will be removed from the index, but because 404 just means 'not found', the page may stay in the index for a while, potentially giving the bad user experience Matt Williamson outlined.
However, I would always redirect if you can - even if you just send traffic to the homepage, it's got to be a better user experience than sending them to a 404 page. I think anyway!
More info here: http://moz.com/learn/seo/http-status-codes
You mention a concern over too many redirects - I think this page may help eliminate your fears: http://www.stateofdigital.com/matt-cutts-there-is-no-limit-to-direct-301-redirects-there-is-on-chains/
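On the chains point, here is a rough sketch (using Python's `requests` library) of how you could spot-check whether your old URLs 301 directly to their destination or hop through a chain of redirects. The URLs below are placeholders, not real ones:

```python
import requests

# Placeholder URLs for retired pages; swap in your own.
OLD_URLS = [
    "https://www.example.com/merchants/joes-gadgets",
    "https://www.example.com/merchants/acme-widgets",
]

for url in OLD_URLS:
    response = requests.get(url, allow_redirects=True, timeout=10)
    chain = [r.status_code for r in response.history]  # one entry per redirect hop
    if not chain:
        print(f"{url} -> {response.status_code} (no redirect)")
    elif len(chain) == 1:
        print(f"{url} -> single {chain[0]} redirect to {response.url}")
    else:
        print(f"{url} -> redirect chain {chain} ending at {response.url} (consider flattening)")
```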
Thanks,
Amelia
-
Matt, thanks - good points for sure. My concern is that something like 50% of new businesses close their doors within 5 years, so the list of redirected URLs will just keep getting bigger over time. Is that a concern? I guess over time fewer people will link to the defunct businesses, but I will still have to track them. Maybe at some point, when the number of links to them is small, it would make sense to then 404 them? Of course, I'd still need to track which ones to 404, so I'm now wondering whether a 404 ever makes sense on previously legitimate pages.
Just to be clear -- redirecting does remove the old URL from the index, right?
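For what it's worth, the tracking I have in mind is just a lookup table. A sketch along these lines (the CSV columns and file names are hypothetical) is roughly how I picture maintaining the redirect list and eventually retiring entries to 410 once nothing links to them any more:

```python
import csv

# Hypothetical export: old_path, redirect_target, inbound_link_count
# (e.g. from the CMS plus a link-research tool).
REDIRECTS_FILE = "merchant_redirects.csv"

redirects = {}   # old path -> target path (serve a 301)
gone = set()     # old paths to serve as 410 instead

with open(REDIRECTS_FILE, newline="") as f:
    for row in csv.DictReader(f):
        if int(row["inbound_link_count"]) == 0:
            # Nothing links here any more: stop redirecting, mark it gone.
            gone.add(row["old_path"])
        else:
            redirects[row["old_path"]] = row["redirect_target"]

print(f"{len(redirects)} paths still redirected, {len(gone)} retired to 410")
```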
-
404s can lose link juice, and they cause the most issues when a page that had lots of links pointing to it, and passing authority, becomes a 404. Because the page no longer exists, the authority being passed to it from those links will be lost when Google eventually de-indexes the page. You also have to remember that the page is likely still in Google's index, and if people click through to it and it isn't found, they are more likely to bounce from your site. You will also lose whatever terms the page was ranking for once it is de-indexed. Redirecting the page to its new location, or to a similar/relevant page, will help keep most of the authority it has earned, helping your rankings and keeping human visitors happy.
You also need to think about this from a crawl point of view - lots of 404s don't make your site very crawl-friendly, because Googlebot wastes time trying to crawl pages that don't exist. Ultimately, making sure you don't have 404 pages, and keeping on top of redirecting them, is important, particularly if the page had authority. A big hint at how much this matters is that Google reports these crawl issues in Google Webmaster Tools so you can monitor and fix them.
On a side note, I have seen cases where sites had a lot of 404s after a significant change of URL structure and didn't put any redirects in place - they lost the majority of their organic rankings and traffic!
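If you want a quick way to keep on top of this, here is a rough sketch: take the URLs you know have inbound links (from Webmaster Tools or a link-tool export; the file name below is made up) and flag any that now return a 404 so they can be redirected.

```python
import csv
import requests

# Hypothetical export of pages with inbound links (one URL per row, column "url").
LINKED_PAGES_FILE = "linked_pages.csv"

with open(LINKED_PAGES_FILE, newline="") as f:
    urls = [row["url"] for row in csv.DictReader(f)]

needs_redirect = []
for url in urls:
    # HEAD keeps it light; some servers handle HEAD oddly, so fall back to GET if needed.
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status == 404:
        needs_redirect.append(url)  # had links, now 404: authority at risk

print(f"{len(needs_redirect)} linked pages currently return 404 and should be redirected:")
for url in needs_redirect:
    print(" ", url)
```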
Related Questions
-
Site Migration Question - Do I Need to Preserve Links in the Main Menu to Preserve Traffic, or Can I Simply Link to Them on Each Page?
Hi there, we are currently redesigning the following site: https://tinyurl.com/y37ndjpn. The local page links in the main menu do provide organic search traffic. In order to preserve this traffic, would it be wise to keep these links in the main menu? Or could we have a secondary menu list (perhaps in the header or footer), featured on every page, which links to these pages? Many thanks in advance for responses.
Intermediate & Advanced SEO | ruislip180
We used to speak of too many links from the same C block as bad; have CDNs like CloudFlare made that concept irrelevant?
Over lunch with our head of development, we were discussing the way CloudFlare and other CDNs help prevent DDoS attacks, etc., and I began to wonder about the origin IP address vs. the reverse proxy IP address. Previously we would look for commonalities in the IP as a way that search engines would modify the value given to links, and most link software showed this. For Ahrefs, I know they still show common IPs using the C block as the reference point. I began to get curious about what the real IP was when our head of dev said that the visible IP is the one from CloudFlare... So, I ran a site in Ahrefs and we got an older site we had developed years ago that showed up as follows: Actos-lawsuit.org 104.28.13.57 and again as 104.28.12.57 (a duplicate C block means the first three sets of numbers are the same; this one has a .12 and a .13, so not a duplicate). Then we looked at our host to see what IP was shown there: 104.239.226.120. So, this really begs the question of whether C block data, or even IP address data, is still relevant with regard to links. What do the search engines see when they look at IP addresses now? Yes, I have an opinion, but would love to hear yours first!
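As a rough illustration of what a link tool is doing with "C blocks", here is a sketch that resolves a few hostnames and groups them by the first three octets of whatever IP the public DNS answer gives -- which, behind CloudFlare, will be the proxy's address rather than the origin server's. The hostnames are placeholders:

```python
import socket
from collections import defaultdict

# Placeholder hostnames; behind a CDN these resolve to the proxy, not the origin.
HOSTS = ["example.com", "example.org", "example.net"]

by_c_block = defaultdict(list)
for host in HOSTS:
    ip = socket.gethostbyname(host)        # public DNS answer (proxy IP if proxied)
    c_block = ".".join(ip.split(".")[:3])  # "C block" = first three octets (/24)
    by_c_block[c_block].append((host, ip))

for block, hosts in by_c_block.items():
    shared = " (shared)" if len(hosts) > 1 else ""
    print(f"{block}.x{shared}: {hosts}")
```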
Intermediate & Advanced SEO | RobertFisher0
Which links to disavow?
I've got a new client that just fired their former SEO company, which was building spammy links like crazy! Using GSC and Majestic, I've identified 341 linking domains. I'm only a quarter of the way through the list, but it is clear that the overwhelming majority are from directories, article directories and comment spam. So far less than 20% are definitely links I want to keep. At what point do I keep directory links? I see one with a DA of 61 and a Moz spam score of 0. I realize this is a judgement call that will vary, but I'd love to hear some folks give DA and spam numbers. FWIW, the client's DA is 37.
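Once the review is done, the mechanical part is just writing the rejected domains into Google's disavow file format (comment lines starting with `#`, plus one `domain:` line per domain). A minimal sketch, assuming the domains you decided to drop are exported to a plain text list (the file names are made up):

```python
# Hypothetical input: one linking domain per line that you've reviewed and rejected.
with open("domains_to_disavow.txt") as f:
    bad_domains = {line.strip() for line in f if line.strip()}

# Google's disavow format: '#' comment lines, then "domain:example.com" lines
# (or individual URLs for page-level disavows).
with open("disavow.txt", "w") as out:
    out.write("# Spammy directory/article/comment links built by previous SEO firm\n")
    for domain in sorted(bad_domains):
        out.write(f"domain:{domain}\n")

print(f"Wrote {len(bad_domains)} domains to disavow.txt")
```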
Intermediate & Advanced SEO | rich.owings0
Website Isn't Ranking & I'm Not Sure Why Based On The Data
Hi Moz Community,
I am having an issue that has been killing me for some time and I could really use another opinion. One of my client's websites hasn't been ranking for some time and I can't put my finger on it. There are no issues showing up in the webmaster tools. If you compare the site with the top-ranking sites for the website's number one keyword, the website is just as good as everyone else. My client's website is the first one on the left in the attachment. We have better quality content, but instead of showing up on page 1, 2, or 3, the site is on page 21. I am just at a loss. Anyone have any thoughts from the outside looking in? Thanks,
Errick
Intermediate & Advanced SEO | ErrickG
Should I remove all vendor links (link farm concerns)?
I have a web site that has been around for a long time. The industry we serve includes many, many small vendors and - back in the day - we decided to allow those vendors to submit their details, including a link to their own web site, for inclusion on our pages. These vendor listings were presented on location (state) pages as well as more granular pages within our industry (we called them "topics"). I don't think it's important any more, but 100% of the vendors listed were submitted by the vendors themselves, rather than us "hunting down" links for inclusion or automating this in any way. Some of the vendors (I'd guess maybe 10-15%) link back to us, but many of these sites are mom-and-pop sites and would have extremely low authority. Today the list of vendors is in the thousands (US only). But the database is old and not maintained in any meaningful way. We have many broken links and I believe, rightly or wrongly, we are considered a link farm by the search engines. The pages on which these vendors are listed use dynamic URLs of the form \vendors<state>-<topic>. The combination of states and topics means we have hundreds of these pages, and they thus form a significant percentage of our pages. And they are garbage 🙂 So, not good. We understand that this model is broken. Our plan is to simply remove these pages (with the list of vendors) from our site. That's a simple fix, but I want to be sure we're not doing anything wrong here from an SEO perspective. Is it as simple as that - just removing these pages? How much effort should I put into redirecting (301) these removed URLs? For example, I could spend effort making sure that \vendors\California-<topic> (and likewise for all states) goes to a general "topic" page (which still has relevance, but won't have any vendors listed). I know there is no definite answer to this, but what expectation should I have about the impact of removing these pages? Would the removal of a large percentage of garbage pages (leaving much better content) be expected to be a major factor in SEO? Anyway, before I go down this path I thought I'd check here in case I'm missing something. Thoughts?
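If you do go the 301 route, collapsing every state variant onto its topic page is one pattern rule rather than hundreds of hand-written redirects. A hedged sketch (the path shapes are adapted from the question; your real URLs and framework will differ):

```python
import re

# Assumed shape: /vendors/<State>-<topic>, e.g. /vendors/California-plumbing
VENDOR_PATTERN = re.compile(r"^/vendors/(?P<state>[^-/]+)-(?P<topic>[^/]+)$")

def redirect_target(path: str):
    """Return the topic page an old vendor-listing URL should 301 to, or None."""
    match = VENDOR_PATTERN.match(path)
    if not match:
        return None
    return f"/topics/{match.group('topic').lower()}"

# Every state's page for a topic collapses onto one topic page.
print(redirect_target("/vendors/California-plumbing"))  # -> /topics/plumbing
print(redirect_target("/vendors/Texas-plumbing"))       # -> /topics/plumbing
print(redirect_target("/about"))                        # -> None (no redirect)
```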
Intermediate & Advanced SEO | MarkWill0
Links with Parameters
The links from the home page to some internal pages on my site have been coded by my tech guys in the following format: www.abc.com/tools/page.html?hpint_id=xyz. If I specify within my Google Webmaster Tools that the parameter ?hpint_id should be ignored, and the content for the user does not change, will Google credit me for a link from the home page, or am I losing something here? Many thanks in advance.
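Alongside the Webmaster Tools parameter setting, a common belt-and-braces approach is to point a rel=canonical at the parameter-free URL. A small sketch of the normalisation involved (the parameter name comes from the question; everything else is illustrative):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

TRACKING_PARAMS = {"hpint_id"}  # internal tracking parameter from the question

def canonical_url(url: str) -> str:
    """Strip tracking parameters so all variants share one canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("http://www.abc.com/tools/page.html?hpint_id=xyz"))
# -> http://www.abc.com/tools/page.html
# That stripped URL is what you would reference in the page's rel=canonical tag.
```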
Intermediate & Advanced SEO | harmit360
What's next?
What's next with the tool? For SEOmoz users that have gotten their Crawl Diagnostics and On-Page issues under control, what's next? In other words, what do long-time SEOmoz users do with the tool? What ongoing weekly value do they get? Ranking reports? Link analysis? It took me four weeks to resolve all my simple issues, which you can see in the Crawl Diagnostics and On-Page reports. (It would have only taken one week if the tool crawled all my pages on demand instead of only once a week.) But now that all my simple issues are resolved, I'm not sure what else to do with the tool. I don't want to hastily cancel the service, but I also don't know what else to do... I'd even pay more for an actual human to look in on me from time to time and tell me what to do next. But I'm self-motivating, so I'll try to figure it out.
Intermediate & Advanced SEO | raywhite0
Link Request Email on Sites' Link Pages
Hello, I have assembled a list of websites that have a "Links" section listing a person's favorite tools. Those pages have a link to my competitor. I know my tool is just as good, if not better, and I want to request a link. I'm thinking of sending an email asking for a link and offering a small amount of money for it. Questions: A) How much should I offer? Should I offer anything at all? B) Is there an email style someone can suggest that has been tested and proven to work for this type of situation?
Intermediate & Advanced SEO | hellopotap0