Increase in 404 errors for pages that never existed
-
Hi,
First of all, I should mention that the errors appear in the old Webmaster Tools, not the new one.
I have two WordPress blogs: one in the root and one in a subfolder.
Today I checked Webmaster Tools and noticed about 100 404 errors that were first detected a few days ago.
The root WordPress install is fine, but the subfolder one has errors. Let me show you an example. For this page:
http://example.com/subfolder/article15245
I get a 404 error reported for:
http://example.com/article15245
It looks as if the subfolder has been stripped from the URL.
I checked my links, and all of them were correct and pointed to the right URLs.
Unfortunately, these errors don't have a "Linked from" section.
-
Thanks for your reply.
Today I redirected most of these links to the right posts, but it was such a tedious task.
-
Page-by-page redirection is necessary to preserve any link equity from incoming links to the pages in question. You can throw the URLs into Ahrefs, or look at them in Moz, to see whether the pages have any links worth saving. Also, SEOPress has .htaccess editing built into the plugin, and you can do WordPress-level redirects as well. It's pretty awesome.
Hope that helps.
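As a rough illustration of the page-by-page approach (a sketch only, using the example URLs from this thread, not anyone's real config), the .htaccess entries might look like this:

```apache
# Sketch: one 301 per broken URL, pointing each root-level 404
# back to the real post in the subfolder install
Redirect 301 /article15245 /subfolder/article15245
Redirect 301 /article15246 /subfolder/article15246
```

Each `Redirect 301` line maps one old path to its destination, so this scales linearly with the number of broken URLs — fine for a handful, tedious for hundreds.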
-
You can put the domain here; I'm sure lots of people would like to weigh in, as it's an interesting problem.
I have replied to your email.
-
Redirect them one by one? That's so tedious!
Does the "Linked from" section get updated for these links?
I use the Redirection plugin instead of .htaccess; it's safer.
What happens if I don't redirect them while I figure out why these errors occur?
Thanks for your response.
-
Thanks for your thorough answer.
I checked these links in Moz Link Explorer, but no links were found. I think this is an internal problem, because most of my subfolder URLs (over 70%) have become 404s.
I have the Redirection plugin. It has a 404 section that shows the last visitors who hit 404 pages, but no errors like these were reported there.
As you said, it seems I should redirect them with .htaccess.
Thanks, I have emailed my domain to you.
Can I put my domain here for others to check?
-
I would just 301 all the pages to their final URLs in production, verify that they work individually, then use Fetch & Render. Many plugins, such as SEOPress or Yoast, let you upload redirects in bulk to save time, or you can always add the redirects to your .htaccess file. If you are working in Excel or Sheets, using Find/Replace to bulk-edit the list can be a lifesaver. It's usually pretty tedious, but not the worst job in the world. Cheers!
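If a spreadsheet feels clunky, a few lines of script can do the same bulk Find/Replace. This is a hedged sketch, assuming (as in this thread) that each broken root-level URL is simply the correct subfolder URL with the subfolder stripped; the paths and the `subfolder` name are illustrative placeholders:

```python
# Sketch: generate .htaccess "Redirect 301" lines from a list of broken
# root-level paths, assuming each real post lives under /subfolder/.
def make_redirects(broken_paths, subfolder="subfolder"):
    """Return one 'Redirect 301 /slug /subfolder/slug' line per broken path."""
    lines = []
    for path in broken_paths:
        slug = path.lstrip("/")  # normalize "/article15245" -> "article15245"
        lines.append(f"Redirect 301 /{slug} /{subfolder}/{slug}")
    return lines

if __name__ == "__main__":
    # Hypothetical export of 404 paths from WMT / the Redirection plugin
    broken = ["/article15245", "/article15246"]
    for line in make_redirects(broken):
        print(line)
```

Paste the output into .htaccess (or into a bulk-import CSV for a redirect plugin) and you've covered the whole list in one pass.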
-
It's so annoying when things like that happen! When Google refuses to give the "Linked from" data, it's a real headache working out where the links are coming from. Bear in mind that the links could even be coming from other websites, not just your own: when a user follows a link to your site (regardless of where that link lives), Google considers it your error if a valid page isn't returned.
Since this error only appears in the old area of WMT, it probably doesn't matter much. That said, one simple fix would be to 301 redirect all the broken URLs to the working article pages. After that, you can bulk-mark them all as fixed.
Usually I tell people to fix the actual link, but if it's an external link you have no control over (or if Google can't even be bothered to tell you what the linking page is), then "301 and mark as fixed" is probably your best bet, especially since these are only individual article pages (it's not a malformed version of your homepage or something).
If you email me the domain (check my profile page), I might be able to crawl your site for you and work out whether there are any obviously broken internal links. Regardless, you'd want the 301s as a back-stop anyway.
Hope that helps.
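Since the errors all seem to follow one pattern (the subfolder stripped from otherwise valid URLs), that back-stop could be a single pattern rule rather than hundreds of individual entries. A sketch only, assuming Apache with mod_rewrite enabled and that every `/articleNNNN` page really does live under `/subfolder/`:

```apache
# Sketch: blanket back-stop in the root .htaccess, assuming all
# /articleNNNN URLs belong to the subfolder install
RewriteEngine On
RewriteRule ^article(\d+)$ /subfolder/article$1 [R=301,L]
```

Worth testing on a couple of URLs first, since a rule this broad will also catch any future root-level URL that happens to match the pattern.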