404 Errors in WMT
-
Currently my website has about 10,000 404 errors because WordPress is adding /feed/ to the end of every URL on my site. Should I block /feed/ in robots.txt?
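For context, blocking the feed paths in robots.txt would look like the sketch below. Note that Disallow only stops crawling; it does not remove URLs from the index or clear existing 404s from the report, so it may not be the right fix here:

```
User-agent: *
# Block root feed and per-post feed paths.
# The * wildcard is honored by Googlebot but not by every crawler.
Disallow: /feed/
Disallow: /*/feed/
```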
-
Yes, I'll have my developer look at the issue for me, thank you.
-
Thanks for the info via direct message. As far as I know, those /feed/ URLs should not return 404s. I checked my own site, for example:
http://www.evolvingseo.com/2014/08/15/hiring-evolver-number-one/feed/ - and that returns a 200 OK.
To be honest, I'm not sure why WordPress would be doing this. Do you have a developer working with you? Or, if it's a theme issue, you could contact the theme vendor about it.
-
Hi there,
As mentioned above, it would be ideal to see an example URL, or a generic one if you can't share the site. It may be that WordPress is adding feed URLs where they don't belong, so we'd need to take a closer look.
-
Good morning!
Before you go cutting off 10,000 404s, I would personally try to address why you're getting 404 errors for your RSS feed.
10,000 errors is a broad number: plenty of those could be duplicates, and some are probably not related to the RSS feed at all. If there is anything I have learned in SEO, it's that I can almost never use broad strokes when painting, and if I do, I have to be absolutely SURE what my brush is covering. A little while back Matt Cutts made a video about RSS feeds and the benefit they can have for websites. They are not as important as the blog itself, but it's still a nice feature you could take advantage of if you already have it.
The reason I bring this up: if you broadly restrict /feed/, how do you know for certain that you aren't cutting off other pages that have helped you?
I don't know enough about your website to truly advise, but I would take all of those errors, put them into a spreadsheet, first remove all duplicates, and then pull out the /feed/ 404s to get as specific a number as possible.
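The dedupe-and-filter step above is easy to script. A minimal sketch, assuming you've exported the WMT crawl errors to a plain list of URLs (the sample URLs are hypothetical):

```python
def summarize_404s(urls):
    """Return (unique_urls, feed_urls) from a raw list of 404 URLs."""
    # Deduplicate and drop blank lines from the export.
    unique = sorted(set(u.strip() for u in urls if u.strip()))
    # Pull out the /feed/ URLs so you can count them separately.
    feed = [u for u in unique if u.rstrip("/").endswith("/feed") or "/feed/" in u]
    return unique, feed

if __name__ == "__main__":
    raw = [
        "http://example.com/post-1/feed/",
        "http://example.com/post-1/feed/",  # duplicate row in the export
        "http://example.com/old-page/",
    ]
    unique, feed = summarize_404s(raw)
    print(len(unique), len(feed))  # → 2 1
```

That gives you the real number of distinct errors and how many are actually feed-related before you decide on any blanket action.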
Look at your referrals in GWMT and GA and see whether your RSS feed is bringing you any traffic or referrals at all. If it isn't helping, I think you can return a 410 code for /feed/, although, as was pointed out to me, there really isn't much benefit to using a 410 over just letting the 404s die on their own.
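If you do decide to serve 410s for the feed URLs, a minimal sketch for an Apache .htaccess (assuming your server is Apache with mod_rewrite enabled; test on a staging copy first) might be:

```apache
RewriteEngine On
# Answer 410 Gone for any URL ending in /feed or /feed/
RewriteRule ^(.*/)?feed/?$ - [G,L]
```

The `[G]` flag is mod_rewrite's shorthand for a 410 Gone response.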
Hope that helps a little!
Related Questions
-
What should I do with all these 404 pages?
I have a website that I'm currently working on that has been fairly dormant for a while and has just been given a facelift and brought back to life. I have some questions below about dealing with 404 pages.
In Google WMT/Search Console there are reports of thousands of 404 pages going back some years. It says there are over 5k in total, but I am only able to download 1k or so from WMT, it seems. I ran a crawl test with Moz and the report it sent back only had a few hundred 404s in it; why is that?
I'm also not sure what to do with all the 404 pages. I know that both Google and Moz recommend a mixture of leaving some as 404s and redirecting others, and I'd like to know what the community here suggests. The 404s are a mix of the following:
- Blog posts and articles that have disappeared (some of these have good backlinks too)
- URLs that look like they used to belong to users (the site used to have a forum), which were deleted when the forum was removed; some of them look like they were removed for spam reasons too, e.g. /user/buy-cheap-meds-online and others like that
- Other URLs like /node/4455 (or some other random number)
I'm thinking I should permanently redirect the blog posts to the homepage or the blog, but I'm not sure what to do about all the others? Surely having so many 404s like this is hurting my crawl rate?
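One way to work through a list like this is to bucket the URLs by pattern before deciding redirect vs. leave-as-404. A hypothetical triage sketch, using only the patterns mentioned in the question (the buckets and rules are illustrative, not a recommendation for any specific site):

```python
import re

def triage(url):
    """Bucket a 404 URL into a suggested action based on its path pattern."""
    # Strip scheme and host so we only look at the path.
    path = url.split("://", 1)[-1].split("/", 1)[-1]
    if path.startswith("user/") or path.startswith("node/"):
        return "404"       # old forum users / node IDs: leave as 404 (or 410)
    if re.match(r"\d{4}/\d{2}/", path) or path.startswith("blog/"):
        return "redirect"  # old posts/articles: 301 to the closest live page
    return "review"        # everything else: check backlinks by hand

if __name__ == "__main__":
    for u in ["http://example.com/node/4455",
              "http://example.com/user/buy-cheap-meds-online",
              "http://example.com/blog/old-post"]:
        print(u, "->", triage(u))
```

Running the triage first keeps the backlinked posts (the ones worth redirecting) separate from the spam/forum leftovers that can safely stay 404.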
Technical SEO | linklander
-
Xml sitemaps giving 404 errors
We have recently made updates to our XML sitemap and have split it into child sitemaps. Once these were submitted to Search Console, we received notification that all of the child sitemaps except one produced 404 errors. However, when we view the XML sitemaps in a browser, there are no errors. I have also attempted crawling the child sitemaps with Screaming Frog and received 404 responses there as well. My developer cannot figure out what is causing the errors, and I'm hoping someone here can assist. Here is one of the child sitemaps: http://www.sermonspice.com/sitemap-countdowns_paged_1.xml
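One quick local sanity check is to parse the child sitemap yourself and confirm it is well-formed XML with valid `<loc>` entries; if that passes, the 404 is more likely a server-side issue (e.g. user-agent or referrer-based blocking of bots) than a broken file. A sketch using only the Python standard library, shown here against an inline sample rather than the live URL:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_locs(sitemap_xml):
    """Parse a urlset sitemap string and return its <loc> URLs."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

if __name__ == "__main__":
    sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.sermonspice.com/example-page</loc></url>
</urlset>"""
    print(extract_locs(sample))  # → ['http://www.sermonspice.com/example-page']
```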
Technical SEO | ang
-
Www vs non www - Crawl Error 902
I have just taken over as admin of my company website, and I have been confronted with crawl error 902 on the existing campaign that has been running for years in Moz. This seems like an intermittent problem. I have searched and tried many of the other solutions, and none of them seem to help.
The campaign is currently set up with the URL http://companywebsite.co.uk. When I tried to do a manual Moz crawl using this URL, I got an error message. I changed the URL to crawl to http://www.companywebsite.co.uk and the crawl went off without a hitch; I'm currently waiting on the results.
From testing, I now know that if I go to the non-www version of my company's website, nothing happens; it never loads. But if I go to the www version, it loads right away. I know for SEO you only want one of these URLs so you don't have duplicate content, but I thought the non-www version should redirect to the www version, not just be completely missing.
I tried to set up a new campaign with the default URL being the www version, but Moz automatically changed it to the non-www version. It seems I cannot set up a new campaign with it automatically crawling the www version.
Does it sound like I'm on the right path to finding the cause? Or can somebody else offer up a solution? Many thanks,
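For reference, the usual fix for this is a 301 from the bare domain to www; a minimal .htaccess sketch (assuming Apache, and assuming DNS for the bare domain already points at the server; if the bare domain never loads at all, check DNS first, because no redirect can run if the request never reaches the server):

```apache
RewriteEngine On
# 301-redirect any request for the bare domain to the www hostname
RewriteCond %{HTTP_HOST} ^companywebsite\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.companywebsite.co.uk/$1 [R=301,L]
```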
Ben
Technical SEO | ATP
-
How to inform Google to remove 404 Pages of my website?
Hi, I want to remove more than 6,000 pages of my website because of bad keywords. I am going to drop all these pages and make them 404s. I want to know how I can inform Google that these pages no longer exist, so it doesn't send me traffic from those bad keywords. Also, can I use Google's disavow tool to exclude these 6,000 pages of my own website?
Technical SEO | renukishor
-
404 Best Practices
Hello all, So about two months ago there was a massive spike in the number of crawl errors on my site, according to Google Webmaster Tools. I handled this by sending my webmaster a list of the broken pages along with working pages they should 301 redirect to. Admittedly, when I looked back a couple of weeks later, the number had gone down only slightly, so I sent him another list (I didn't realize that you could 'Mark as fixed' in Webmaster Tools). So when I sent him more, he 301 redirected them again (with many duplicates), as he was told, without really digging any deeper. Today, when I talked about more redirects, he suggested that 404s do have a place: if they are pages that genuinely don't exist anymore, then a ton of 301 redirects may not be the answer. So my two questions are:
1. Should I continue to relentlessly try to get rid of all 404s on my site, and if so, do I have to be careful not to be lazy and just send most of them to the homepage?
2. Are there any tools or really effective ways to remove duplicate 301 redirect records from my .htaccess (because at this point its size could very well be slowing down my site)?
Any help would be appreciated, thanks
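On the second question above: there isn't a standard tool for this, but deduplicating literal `Redirect` lines is simple enough to script. A hedged sketch that assumes plain `Redirect 301 /old /new` lines (run it on a copy of the .htaccess first; it does not understand RewriteRule syntax):

```python
def dedupe_redirects(lines):
    """Keep the first Redirect rule for each source path; drop repeats."""
    seen = set()
    out = []
    for line in lines:
        parts = line.split()
        if len(parts) >= 3 and parts[0].lower() == "redirect":
            src = parts[-2]   # source path is the second-to-last token
            if src in seen:
                continue      # duplicate rule for the same source path
            seen.add(src)
        out.append(line)      # non-Redirect lines pass through untouched
    return out

if __name__ == "__main__":
    htaccess = [
        "Redirect 301 /old-page /new-page",
        "Redirect 301 /old-page /new-page",  # duplicate added later
        "Redirect 301 /other /elsewhere",
    ]
    print("\n".join(dedupe_redirects(htaccess)))
```

Apache applies the first matching Redirect anyway, so dropping later duplicates for the same source path should not change behavior.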
Technical SEO | CleanEdisonInc
-
174 Duplicate Content Errors
How do I go about fixing these errors? They are all related to my tags. Thank you in advance for any help! Lisa
Technical SEO | lisarein
-
When Should I Ignore the Error Crawl Report
I have a handful of pages listed in the Error Crawl Report, but the report isn't actually showing anything wrong with these pages. I have double-checked the code on the site and can't find anything either. Should I just move on and ignore the Error Crawl Report for these few pages?
Technical SEO | ChristinaRadisic
-
If you add a nofollow to a time-sensitive link, will it get picked up as a broken link (404) in the WMT report?
We have a client who publishes deals that are time-sensitive. Links to the deals expire, so Google's crawlers are picking them up and finding 404s. If I nofollow them, will the 404s still get picked up and reported in WMT? The same question applies to SEOMoz Pro.
Technical SEO | Red_Mud_Rookie