Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Are 404 Errors a bad thing?
-
Good morning,
I'm trying to clean up my e-commerce site, and I've created a lot of new categories for my parts.
I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way to get to those "hidden" pages is either by knowing the URLs I used to use or by finding one that Google still has indexed.
Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages, which have little or no content, to the robots.txt file, or should I just deactivate them so that even typing the old URL returns a 404 page?
In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled.
Please let me know whether you think I should 404 them or add them to robots.txt.
Thanks
-
Hello Prime85,
Both these answers are correct in their own way, but let me clarify and add my 2 cents.
1. 404s don't hurt your rankings directly, but they can provide a poor user experience.
2. If you keep URLs "live," Google can keep them in its index indefinitely. This means search engines may waste time and crawling resources visiting pages you don't want in the index while ignoring your other pages. This CAN hurt your SEO.
Long story short, (like Brian says) if the page is no longer relevant, you should remove it from the index or redirect it to another URL.
3. Returning a 404 kills any link juice that may have flowed to the page, and it can also send confusing signals to search engines about the structure of your site if a bunch of pages return 404s at the same time that a bunch of new, but similar, pages pop into existence.
The best policy is to set up a 301 redirect from each outdated page to the most relevant new page. Don't redirect everything to a single page like the homepage; instead, redirect each old URL to the page that would be most relevant and useful for the user.
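On an Apache server, redirects like this are often declared in the .htaccess file. A minimal sketch, using entirely hypothetical old and new category paths:

```apache
# Hypothetical paths for illustration: map each retired category URL
# to its closest replacement with a permanent (301) redirect.
Redirect 301 /categories/old-brake-parts /parts/brakes
Redirect 301 /categories/old-engine-parts /parts/engine

# URLs with no good replacement simply fall through and return a 404.
```

On nginx the equivalent would be a `return 301 /new-path;` inside a matching `location` block.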
On the other hand, if it's a low-value page and there's really no need to redirect it, you should remove it from the index. There are a couple of ways to do this:
- Put a meta robots "NOINDEX" tag in the head and wait for Google to crawl the page and process the noindex. It helps if the URL is listed in your sitemap so that Google can more easily "find" the URL.
- Block the URL through robots.txt, then use Google's Remove URL tool in Webmaster Tools.
- Return a 404, and use the Remove URL tool.
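For reference, the first two options look like this (the paths are hypothetical). One caveat: don't combine a robots.txt block with a noindex tag on the same URL, because Google can't see a tag on a page it isn't allowed to crawl.

```html
<!-- Option 1: meta robots noindex in the page's <head>
     (the page must remain crawlable for Google to see this tag) -->
<meta name="robots" content="noindex">
```

```
# Option 2: robots.txt entry blocking the retired section,
# paired with the Remove URL tool in Webmaster Tools
User-agent: *
Disallow: /categories/old-
```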
Hope this helps! Best of luck with your SEO.
-
Hi,
Here is what Google says about 404s:
"Q: Do the 404 errors reported in Webmaster Tools affect my site’s ranking?
A: 404s are a perfectly normal part of the web; the Internet is always changing, new content is born, old content dies, and when it dies it (ideally) returns a 404 HTTP response code. Search engines are aware of this; we have 404 errors on our own sites, as you can see above, and we find them all over the web. In fact, we actually prefer that, when you get rid of a page on your site, you make sure that it returns a proper 404 or 410 response code (rather than a "soft 404"). If some URLs on your site 404, this fact alone does not hurt you or count against you in Google's search results. However, there may be other reasons that you'd want to address certain types of 404s. For example, if some of the pages that 404 are pages you actually care about, you should look into why we're seeing 404s when we crawl them! If you see a misspelling of a legitimate URL (www.example.com/awsome instead of www.example.com/awesome), it's likely that someone intended to link to you and simply made a typo. Instead of returning a 404, you could 301 redirect the misspelled URL to the correct URL and capture the intended traffic from that link. You can also make sure that, when users do land on a 404 page on your site, you help them find what they were looking for rather than just saying "404 Not found."
http://googlewebmastercentral.blogspot.com/2011/05/do-404s-hurt-my-site.html
I would recommend 301 redirecting them to the most appropriate page, and returning a 404 for any URL where you can't find a good page to redirect to.
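That triage rule can be sketched as a small script; the URL mapping below is entirely hypothetical placeholder data, standing in for whatever list of old and new category pages your site actually has:

```python
# Decide, per retired URL, whether to 301 it to a replacement or let it 404.
# REPLACEMENTS is placeholder data; in practice it comes from your catalog.
REPLACEMENTS = {
    "/categories/old-brake-parts": "/parts/brakes",
    "/categories/old-engine-parts": "/parts/engine",
    "/categories/old-misc": None,  # no relevant new page exists
}

def action_for(old_path):
    """Return ('301', target) when a relevant page exists, else ('404', None)."""
    target = REPLACEMENTS.get(old_path)
    if target is not None:
        return ("301", target)
    return ("404", None)

for path in REPLACEMENTS:
    print(path, "->", action_for(path))
```

Unknown URLs fall through to a 404 as well, which matches Google's advice above that a plain 404 is the correct response for content that is genuinely gone.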