How to remove 404 pages in WordPress
-
I used the crawl tool and it returned a 404 error for several pages that I no longer have published in WordPress. They must still be on the server somewhere?
Do you know how to remove them? I don't think they exist as files on the server, like an HTML file would, since WordPress uses a database.
I figure that getting rid of the 404 errors will improve SEO. Is this correct?
Thanks,
David
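A quick way to double-check what the crawl tool reported is to request each URL yourself and look at the status code. Here is a minimal Python sketch, assuming the requests library is available; the URL list is only a placeholder for whatever your crawl export contains:
```python
# Minimal sketch: confirm which URLs from the crawl report actually return a 404.
# The URL list is a placeholder; swap in the URLs your crawl tool exported.
import requests

urls = [
    "https://example.com/old-post/",
    "https://example.com/deleted-page/",
]

for url in urls:
    # HEAD is enough to read the status code without downloading the page body.
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    print(status, url)
```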
-
Yeah... as others have noted, there's often a live link somewhere else that points to a page that is now gone...
So the real culprit behind a 404 is the LINKING page... as long as that link is out there, it'll keep pointing to the non-existent page... so a 301 can help, or (this was fun) you can 301 the incoming 404 link BACK to the linking page itself...
Tee hee... yeah, not such a good idea, but a tactic we did have to use about 4 years ago to get a spam directory to "buzz off!!!"
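If you do put a 301 in place, it's worth confirming that the old URL now redirects where you expect. A minimal Python sketch, assuming the requests library; both URLs are placeholders:
```python
# Minimal sketch: confirm an old URL now 301-redirects to the page you intended.
# Both URLs are placeholders.
import requests

old_url = "https://example.com/removed-page/"
resp = requests.get(old_url, allow_redirects=True, timeout=10)

# resp.history holds each redirect hop, in order.
for hop in resp.history:
    print(hop.status_code, hop.url)   # expect 301 on the old URL
print(resp.status_code, resp.url)     # expect 200 at the final destination
```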

-
Hey David
Once you publish a page/post in WordPress and submit a sitemap, you're stuck with those URLs. I've run into this problem a lot, as I use WordPress often. Once you trash a page and delete it permanently, it isn't stored anywhere in the WordPress CMS; the URLs just show up as 404s because the pages existed and now no longer exist.
As stated above, just make sure you are not linking to your trashed pages anywhere on your site.
I've done a couple of things with 404 pages on my WordPress sites:
1. Make an awesome 404 page so that people will stay on the site if they land on it by accident. Google will eventually stop crawling 404s, so this is a good temporary way to keep users engaged.
2. 301 redirect the 404s to relevant pages. This helps preserve your link juice and also helps the user experience (since visitors reach a relevant page).
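For point 2, one rough way to decide where each dead URL should point is to match it against your live URLs and pick the closest slug. A sketch using Python's standard-library difflib; both lists are placeholders and every suggestion still needs a human sanity check:
```python
# Rough sketch: suggest a redirect target for each 404 URL by slug similarity.
# Both lists are placeholders; review every suggestion before creating a redirect.
import difflib

dead_urls = ["https://example.com/blue-widgets-2011/"]
live_urls = ["https://example.com/blue-widgets/", "https://example.com/red-widgets/"]

for dead in dead_urls:
    matches = difflib.get_close_matches(dead, live_urls, n=1, cutoff=0.5)
    target = matches[0] if matches else "(no close match - send to a category or the home page)"
    print(dead, "->", target)
```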
Hope that helps!
-
404s are a natural part of websites, and Google understands that. As long as you don't have links on your own site pointing to pages that are 404ing, you're fine. So basically, just make sure your website is not the source of your 404s.
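One way to make sure your own site isn't the source is to pull the links out of a page and test their status codes. A minimal Python sketch, assuming the requests library; the page URL is a placeholder and a full check would walk every page on the site:
```python
# Minimal sketch: list internal links on one page that return a 404.
# The page URL is a placeholder; a real check would walk every page on the site.
from html.parser import HTMLParser
from urllib.parse import urljoin
import requests

page = "https://example.com/"

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(page, value))

collector = LinkCollector()
collector.feed(requests.get(page, timeout=10).text)

for link in collector.links:
    if link.startswith(page):  # only test links that point back into the same site
        if requests.head(link, allow_redirects=True, timeout=10).status_code == 404:
            print("broken internal link:", link)
```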
-
Anything you type after your domain that isn't an actual page will return a not-found error; it doesn't mean the page still exists somewhere. [Try entering yourdomain.com/anythingyouwant and you will get a 404.] Or am I misunderstanding the question? In any case, 404 errors are not necessarily bad for SEO, as long as they are not harming the user experience.