404 Status Code Errors: How to Fix Them
-
Hi,
I have a question about the "4xx Status Code" errors appearing in the Analysis Tool provided by SEOmoz. They are flagged as the most serious errors for your site and must be fixed. I get this message from the good people at SEOmoz:
"4xx status codes are shown when the client requests a page that cannot be accessed. This is usually the result of a bad or broken link."
OK, my question is the following: how do I fix them? Those pages are shown as "404" pages on my site... isn't that enough? How can I fix the "4xx status code" errors indicated by SEOmoz?
Thank you very much for your help.
- Sal
-
Why not 301 the 404s to similar pages? Fix your problem AND transfer some of the link juice.
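In practice a 301 like this is usually set in the server config. A minimal sketch for an Apache .htaccess file (the paths here are hypothetical, and this assumes mod_alias and mod_rewrite are enabled):

```apache
# Hypothetical paths: map each dead URL to its closest live equivalent.
Redirect 301 /old-widgets /widgets
Redirect 301 /blog/2010/sale-page /blog/current-offers

# Or, with mod_rewrite, send a whole retired section to one category page:
RewriteEngine On
RewriteRule ^discontinued/.*$ /products [R=301,L]
```

On nginx or IIS the directives differ, but the idea is the same: each URL that currently 404s gets a permanent redirect to the most relevant live page.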
-
As Ben Fox stated, you can use the report to find the linking errors.
I'd also run a scan of your site using Xenu Link Sleuth (it's 100% free) if you're a PC user. Some people prefer Screaming Frog; both work well, and Screaming Frog has both a free and a paid version, to my knowledge.
I use Xenu personally, and have been using it for years with much success. You'd be surprised what kind of stuff it digs up.
-
Hi Sal,
If you look at the referrer column in the report, you can see which pages are linking to the broken URLs.
Fix these broken links and you won't be generating so many 4xx pages.
That's the theory anyway. It can be a pretty arduous task but if you stick to it you should be able to get that number down.
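The referrer report gives you (source page, broken link) pairs; before chasing each one down, it can help to re-verify which targets still return a 4xx. A minimal sketch using only the Python standard library (the report paths below are made up for illustration):

```python
import urllib.request
import urllib.error

def check_url(url, timeout=10):
    """Return the HTTP status code for a URL, following redirects."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # urlopen raises on 4xx/5xx; the code is still what we want.
        return e.code

def is_client_error(status):
    """True for any 4xx status, the range flagged in the report."""
    return 400 <= status < 500

# Hypothetical statuses pulled from a crawl report (no network needed here):
report = {"/old-page": 404, "/about": 200, "/moved": 301}
broken = [path for path, status in report.items() if is_client_error(status)]
print(broken)  # ['/old-page']
```

To check live URLs you would call `check_url()` on each target and keep the ones where `is_client_error()` is true; anything that has since been fixed or redirected drops off your to-do list.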
Related Questions
-
Does link position matter in the content/HTML code?
My question is: if I have several links going to different landing pages, will the one at the top of the content pass more value than the ones at the bottom? Assume there is no more than one instance of the same link in the content. The ultimate question is whether link position in the content/HTML code makes a difference in how much value it passes. This question comes in response to this Whiteboard Friday: https://www.youtube.com/watch?v=xAH762AqUTU Rand talks about how, if there are two links going to the same URL from the same content page, Google will only inherit the value of the anchor text from the first link on the page and not both of them, meaning that Google will treat that second link as if it doesn't exist. There are lots of resources showing this was true, but there isn't much content newer than 2010 saying it is still true, and we all know that things have changed a lot since then. Does that make sense?
Intermediate & Advanced SEO | 97th_Floor
-
Fix Duplicate Content Before Migration?
My client has 2 WordPress sites (A and B). Each site is 20 pages, with similar site structures, and 12 of the pages on A have nearly 100% duplicate content with their counterparts on B. I am not sure to what extent A and/or B is being penalized for this. In 2 weeks (July 1) the client will execute a rebrand, renaming the business, launching C, and taking down A and B. Individual pages on A and B will be 301 redirected to their counterparts on C. C will have a similar site structure to A and B. I expect the content will be freshened a bit, but it may initially be very similar to the content on A and B. I have 3 questions: (1) Given that only 2 weeks remain before the switchover, is there any purpose in resolving the duplicate content between A and B prior to taking them down? (2) Will 301 redirects from penalized pages on A or B actually hurt the ranking of the destination page on C? (3) If a page on C has the same content as its predecessor on A or B, could it be penalized for that, even though the page on A or B has since been taken down and replaced with a 301 redirect?
Intermediate & Advanced SEO | futumara
-
301 page into a 404
Hi, I have a job board site, and the way the site is built means that I can't 404 job pages once they have expired. To work around this, I'm looking to 301 the pages to a 404 page. Do any of you have any experience with this? Are there any potential pitfalls to serving a 404 this way? Thanks
Intermediate & Advanced SEO | AndrewAkesson
-
www vs. non-www differences in crawl errors in Webmaster Tools
Hey All, I have been working on an eCommerce site for a while that, to no avail, continues to make me want to hang myself. To make things worse, the developers just do not understand SEO, and it seems every change they make just messes up work we've already done. Job security, I guess. Anyhow, most recently we realized they had some major sitemap issues, as almost 3,000 pages were submitted but only 20 or so were indexed. Well, they updated the sitemap, and although all the pages are now properly indexing, I now have 5,000+ "not found" crawl errors in the non-www version of WMT and almost none in the www version of the WMT account. Anyone have insight as to why this would be?
Intermediate & Advanced SEO | RossFruin
-
How concerning is a message from Google about an increase in server errors?
In the past few weeks I have started getting messages from Google Webmaster Tools about an increase in server errors. According to our R&D team, these messages come at times our site has been down, and Google is not an accurate measure of the site's health. 1 - Are they correct, and is there a better tool to be using? 2 - Could we be harmed by Google occasionally running into this problem... which is then fixed within a few hours? Thanks!
Intermediate & Advanced SEO | theLotter
-
202 error page blocked in robots.txt versus using a crawlable 404 error
We currently have our error page set up as a 202 page that is unreachable by the search engines, as it is currently blocked in our robots.txt file. Should the error page instead be a 404 error page that is reachable by the search engines? Is there more value, or is it better practice, to use a 404 over a 202? We noticed in our Google Webmaster account that we have a number of broken links pointing to the site, but the 404 error page was not accessible. If you have any insight that would be great; if you have any questions please let me know. Thanks, VPSEO
Intermediate & Advanced SEO | VPSEO
-
Hundreds of thousands of 404s on expired listings
Hey guys, We have a conundrum with a large e-commerce site we operate. Classified listings older than 45 days are throwing up 404s - hundreds of thousands, maybe millions. Note that Webmaster Tools caps out at 100,000. Many of these listings receive links. Classified listings that are less than 45 days old show other possible products to buy based on an algorithm. It is not possible for Google to crawl expired listing pages from within our site; they are indexed because they were crawled before they expired, which means that many of them show in search results. -> My thought at this stage, for usability reasons, is to replace the 404s with content - other product suggestions - and add a meta noindex in order to help our crawl equity and get the pages we really want to be indexed prioritised. -> Another consideration is to 301 from each expired listing to the category hierarchy to pass possible link juice. But as many of these listings are findable in Google, we feel that is not a great user experience. -> Or shall we just leave them as 404s? Google sort of says it's OK. Very curious about your opinions and how you would handle this. Cheers, Croozie. P.S. I have read other Q&As regarding this, but given our large volumes and situation, thought it was worth asking, as I'm not satisfied that the solutions offered would match our needs.
Intermediate & Advanced SEO | sichristie
-
How long does a Google penalty last if you have fixed the problem?
Hi, I stupidly thought it would be a good idea to set up a reciprocal links page on my website named 'links'. I did this because my competitors were linking to these pages, so I thought it would be a good idea, and I genuinely didn't know that you could be punished for this. Within about 3 weeks my rank dropped about 3 pages. I have since removed the links, and the page was cached last Friday, but the site still appears to have a penalty. I assumed that when Google cached the page and saw the links were not there anymore, the penalty would be lifted. Anyone got any ideas? P.S. The competitor websites had broken their links pages into various categories relating to the website, i.e. related directories etc., so this might be why they weren't penalized.
Intermediate & Advanced SEO | BelfastSEO