Status Code 404 Errors: How to Fix Them
-
Hi,
I have a question about the "4xx Staus Code" errors appearing in the Analysis Tool provided by SEOmoz. They are indicated as the worst errors for your site and must be fixed. I get this message from the good people at SEOmoz:
"4xx status codes are shown when the client requests a page that cannot be accessed. This is usually the result of a bad or broken link."
Ok, my question is the following: how do I fix them? Those pages are shown as "404" pages on my site... isn't that enough? How can I fix the "4xx status code" errors indicated by SEOmoz?
Thank you very much for your help.
- Sal
-
Why not 301 the 404s to similar pages? Fix your problem AND transfer some of the link juice.
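For instance, on an Apache server a one-line rule in `.htaccess` will 301 a dead URL to its closest live match. The paths below are hypothetical, just to show the shape:

```apache
# Hypothetical paths -- substitute your own dead URL and its closest live replacement.
Redirect 301 /old-services.html /services/

# Or, to redirect a whole removed directory while keeping the trailing path:
RedirectMatch 301 ^/old-blog/(.*)$ /blog/$1
```

`Redirect` comes from Apache's mod_alias; on other servers (nginx, IIS) the equivalent is a `return 301` or rewrite rule.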
-
As Ben Fox stated, you can use the report to find the linking errors.
I'd also run a scan of your site using Xenu Link Sleuth (it's 100% free) if you're a PC user. Some people prefer Screaming Frog; both work well, and Screaming Frog has a free and a paid version, to my knowledge.
I use Xenu personally and have been using it for years with much success. You'd be surprised what kind of stuff it digs up.
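The core of what these crawlers do can be sketched in a few lines of Python: pull every `href` off a page and flag any target that isn't a live URL. This toy version checks against a known set of live pages instead of issuing real HTTP requests (a real checker would send a HEAD/GET per link); the URLs are made up for illustration:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect absolute URLs from every <a href> on a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

def broken_links(page_html, base_url, live_urls):
    """Return links on the page that point at URLs not in live_urls."""
    parser = LinkExtractor(base_url)
    parser.feed(page_html)
    return [url for url in parser.links if url not in live_urls]

# Hypothetical page and site inventory:
html = '<a href="/about/">About</a> <a href="/old-page/">Old</a>'
live = {"http://example.com/about/"}
print(broken_links(html, "http://example.com/", live))
# → ['http://example.com/old-page/']
```

Every URL this reports is a link you'd either fix at the source or 301 to a live page.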
-
Hi Sal,
If you look at the referrer column in the report, you can see which pages are linking to the broken URLs.
Fix those broken links and you won't be generating so many 4xx pages.
That's the theory anyway. It can be a pretty arduous task but if you stick to it you should be able to get that number down.
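If the report exports to CSV, grouping the broken URLs by referrer turns it into a page-by-page cleanup list, which makes the slog more manageable. A minimal sketch, assuming hypothetical "URL" and "Referrer" column headers (the real export's column names may differ):

```python
import csv
from collections import defaultdict
from io import StringIO

def broken_by_referrer(csv_text):
    """Group broken URLs by the page that links to them.

    Assumes the export has 'URL' and 'Referrer' columns -- adjust
    the keys to match your actual report.
    """
    by_ref = defaultdict(list)
    for row in csv.DictReader(StringIO(csv_text)):
        by_ref[row["Referrer"]].append(row["URL"])
    return dict(by_ref)

# Hypothetical export: two broken links on one post, one on another page.
sample = (
    "URL,Referrer\n"
    "/old-a/,/blog/post-1/\n"
    "/old-b/,/blog/post-1/\n"
    "/old-a/,/contact/\n"
)
print(broken_by_referrer(sample))
# → {'/blog/post-1/': ['/old-a/', '/old-b/'], '/contact/': ['/old-a/']}
```

Fixing all the links on one referring page at a time is usually faster than chasing broken URLs one by one.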