What is the best approach to handling 404 errors?
-
Hello All - I'm new here and working on the SEO for my site www.shoottokyo.com. When I find 4xx (client error) pages, what is the best way to deal with them? For example, I am finding an error like this: http://shoottokyo.com/2010/11/28/technology-and-karma/ This may have been caused when I updated my permalinks from shoottokyo.com/2011/09/postname to shoottokyo.com/postname. I was using the plugin Permalinks Moved Permanently to fix them.
Sometimes I find a URL like http://shoottokyo.com/a-very-long-week/www.newscafe.jp and can tell that I simply have a bad link to News Cafe, so I can go to the post and correct it, but in the case of the first one I can't find where the crawler even found the problem. I'm using WordPress. Is it best to just use a plugin like 'Redirection' to redirect the rest of the errors where I cannot find the source of the issue?
Thanks
Dave
-
Hi Dave
404 errors will happen on any website, and you usually don't have to worry about them (unless they appear in alarmingly high numbers). You only need to worry about 301ing a 404 page when it is losing you link equity.
I would use these three methods to find 404s on the site:
-
Use Screaming Frog, as Chris mentioned
-
Use your analytics package and look for traffic landing on your 404 page
-
Use Google and Bing Webmaster Tools and check the 404 warnings (in the Crawl Stats area)
From here you would want to 301 each valid 404 error page to the most closely matching page (one that visitors will find useful).
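If you already have a list of candidate URLs (say, exported from Webmaster Tools), a small script can confirm which ones actually return a 404 before you bother redirecting them. This is just a sketch using Python's standard library; the URLs below are placeholders, not real pages from the thread:

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def find_404s(urls, fetch=None):
    """Return the subset of urls that respond with HTTP 404.

    `fetch` maps a URL to its status code; by default it performs a
    real HTTP request, but a stub can be passed in for testing.
    """
    if fetch is None:
        def fetch(url):
            try:
                return urlopen(url).getcode()
            except HTTPError as e:
                return e.code  # 4xx/5xx responses raise HTTPError
            except URLError:
                return None    # DNS failure, unreachable host, etc.
    return [url for url in urls if fetch(url) == 404]

# Example with a stubbed fetcher so the sketch runs without a network:
statuses = {
    "http://example.com/": 200,
    "http://example.com/old-post/": 404,
}
print(find_404s(statuses, fetch=statuses.get))
# prints ['http://example.com/old-post/']
```

Anything that comes back 404 and has traffic or links pointing at it is a candidate for a 301.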
-
I haven't used that one but I just read up on it. It looks good.
-
Thanks for the fast response, Chris. Is the best approach to 301 them using a plugin like Redirection? Is there a better approach, or are there downsides to using a plugin to handle this?
-
Dave, you can use a tool like Screaming Frog or Xenu's Link Sleuth to find the links pointing to the 404 pages. You can leave the pages to 404 unless your stats show that search was sending traffic to those pages, or you have external links going to them; in that case you'll want to 301 them.
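If you'd rather handle the permalink change at the server level instead of through a plugin, a single rewrite rule can 301 all the old date-based URLs to the new structure at once. This is a sketch assuming an Apache server and the `/YYYY/MM/DD/post-name/` permalink format from the question; test it on a staging copy before relying on it:

```apache
# In the site's .htaccess, above the standard WordPress rules.
# Redirects /2010/11/28/post-name/ (and /2011/09/post-name/) to /post-name/
RewriteEngine On
RewriteRule ^[0-9]{4}/[0-9]{2}(?:/[0-9]{2})?/(.+)$ /$1 [R=301,L]
```

One rule like this also covers 404s you can't trace back to a source, since any lingering link to the old format gets redirected.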