Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not removing the content entirely - many posts will remain viewable - we have locked both new posts and new replies.
How to remove 404 pages in WordPress
-
I used the crawl tool and it returned a 404 error for several pages that I no longer have published in WordPress. They must still be on the server somewhere?
Do you know how to remove them? I don't think they exist as files on the server, the way an HTML file would, since WordPress uses a database.
I figure that getting rid of the 404 errors will improve SEO. Is this correct?
Thanks,
David
-
Yeah... as others have noted, there is often a live link somewhere else that points to a page that is now gone...
So the real culprit is the LINKING page... as long as it's out there, it'll point to that non-existent page... so a 301 can help, or (this was fun) you can 301 the incoming 404 traffic BACK to the linking page itself...
Tee-hee... yeah, not such a good idea in general, but a tactic we did have to use about four years ago to get a spam directory to "buzz off!!!"
-
Hey David,
Once you publish a page or post in WordPress and submit a sitemap, search engines know about those URLs. I've run into this a lot since I use WordPress often. When you trash a page and permanently delete it, it's no longer stored anywhere in the WordPress CMS - those URLs are just returning 404s because they used to exist and now don't.
As stated above, just make sure you are not linking to your trashed pages anywhere on your site.
I've done a couple of things with 404 pages on my WordPress sites:
1. Make an awesome 404 page so that people stay on the site if they land on it by accident. Google will eventually stop crawling 404s, so this is a good way to keep users engaged in the meantime (a template sketch follows this list).
2. 301 redirect the 404s to relevant pages. This preserves your link juice and also helps with the user experience, since visitors reach a relevant page (see the redirect sketch below).
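Here's a rough sketch of each idea. Both are illustrative only: 404.php is the standard WordPress template name for not-found pages, but the markup, copy, and URL paths below are hypothetical and would need adapting to your own theme and site.

For the custom 404 page, a minimal theme template:

```php
<?php
// Minimal sketch of a friendlier 404.php theme template.
// The copy and markup are placeholders - adapt to your theme.
get_header(); ?>

<main>
    <h1>Sorry, that page has moved or no longer exists.</h1>
    <p>Try a search, or head back to the
        <a href="<?php echo esc_url( home_url( '/' ) ); ?>">homepage</a>.</p>
    <?php get_search_form(); // help visitors find what they came for ?>
</main>

<?php get_footer();
```

For the 301 redirects, one approach among several (redirect plugins and .htaccess rules work just as well) is hooking template_redirect from the active theme's functions.php. The paths in the map below are made-up examples:

```php
<?php
// Sketch: 301-redirect a few removed URLs to live replacements.
// Goes in the active theme's functions.php; all paths are hypothetical.
add_action( 'template_redirect', function () {
    // Removed path => replacement path (adjust to your own site).
    $redirects = array(
        '/old-post/'        => '/new-post/',
        '/retired-product/' => '/products/',
    );

    $path = trailingslashit( parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH ) );

    if ( isset( $redirects[ $path ] ) ) {
        wp_redirect( home_url( $redirects[ $path ] ), 301 ); // permanent redirect
        exit; // always exit after wp_redirect()
    }
} );
```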
Hope that helps!
-
404s are a natural part of websites, and Google understands that. As long as you don't have links on your own site pointing to pages that are 404ing, you're fine. So basically, just make sure your website is not the source of your 404s. A quick sketch for spot-checking broken internal links is below.
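If you want to spot-check a single page for links back to your own site that 404, here's a rough PHP sketch (example.com is a hypothetical URL - swap in your own - and it assumes allow_url_fopen is enabled):

```php
<?php
// Sketch: list the links on one page that point back to the same
// site and return a 404. $home is a hypothetical URL - replace it.
$home = 'https://example.com/';

$doc = new DOMDocument();
@$doc->loadHTML( file_get_contents( $home ) ); // @ silences markup warnings

foreach ( $doc->getElementsByTagName( 'a' ) as $a ) {
    $href = $a->getAttribute( 'href' );
    if ( strpos( $href, $home ) !== 0 ) {
        continue; // only check absolute links back to this site
    }
    $headers = @get_headers( $href ); // first element is the HTTP status line
    if ( $headers && strpos( $headers[0], ' 404' ) !== false ) {
        echo "Broken internal link: $href\n";
    }
}
```

A full crawler (like the crawl tool mentioned in the question) does this across the whole site; a quick script like this is just handy for checking a page or two.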
-
Anything you type after your domain that isn't an actual page will return a "not found" error; it doesn't mean the page exists somewhere. (Try entering yourdomain.com/anythingyouwant and you will get a 404.) Or am I misunderstanding the question? In any case, 404 errors are not necessarily bad for SEO, as long as they are not harming the user experience.