Why the sudden increase in soft 404s?
-
I haven't made any changes to my site, but within about a week Webmaster Tools has shown 30-40 soft 404s. This only started happening in the last two weeks. When I click through to the pages they load fine, and even Fetch and Render works fine on them.
-
They have stopped, with no changes to the site. I have no idea why. Thank you for the offer though.
-
Hi EcommerceSite!
Would love to help you figure this out - please PM the URL. Thanks!
-
It seems to have stopped with no changes to the site. I have no idea why.
-
Hi EcommerceSite,
I can't see an answer to your question so far. Are you still having issues?
If you want to send me a PM with your URL, I'll have a look at this for you.
Tom
-
I can send it in a message.
-
Hi EcommerceSite!
It really sounds like folks will need to check out your site, or at least have a lot more information, in order to give much more advice. Is that something you can share?
-
Loading times have stayed really stable.
There are no 404 errors in either tool.
Using the Fetch as Googlebot tool, the pages all work fine.
It doesn't make any sense.
-
Hi,
This can occur if Google's crawlers are for some reason unable to reach some of your pages. It could be related to network issues or temporary server issues. A few things to look at (a quick status-check sketch follows the list):
- Do you see any increased loading times for your pages?
- When looking at the page with Firebug or Chrome's inspector tools, do you see any 404 errors returned?
- What result do you get when using the Fetch as Googlebot tool?
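If it helps, here is a minimal PHP sketch along those lines (the URLs are hypothetical placeholders; swap in the pages flagged as soft 404s). It requests each page with a Googlebot user-agent string and prints the HTTP status code actually returned:

```php
<?php
// Hypothetical list of pages flagged as soft 404s; replace with your own URLs.
$urls = [
    "https://www.example.com/",
    "https://www.example.com/some-product-page",
];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // capture the body instead of echoing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false); // report redirects rather than following them
    curl_setopt($ch, CURLOPT_USERAGENT,
        "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)");
    curl_exec($ch);
    echo $url . " => HTTP " . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
    curl_close($ch);
}
```

A page that renders fine in a browser but comes back with something other than 200 here, or returns a 200 with almost no content, is a likely soft 404 candidate.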
Hope this helps
Best regards, Anders
-
It'd be great if you could share what your site is so people can check it out and see if they can figure out what's going on. Thanks!
-
Related Questions
-
How Best to Handle Inherited 404s on Purchased Domain
We purchased a domain from another company and migrated our site over to it very successfully. However, we have one artifact of the original domain: a page that was exploited by other sites on the web. This page allowed you to pass any URL to it and be redirected to that URL (e.g. http://example.com/go/to/offsite_link.asp?GoURL=http://badactor.com/explicit_content). The page does not exist on our site, so these requests always hit a 404. However, we find that crawlers are still attempting to access these invalid pages. We have disavowed as many of the explicit sites as we can, but some crawlers still come looking for those links. We are considering blocking the redirect page in our robots.txt, but we are concerned that the links will remain indexed but uncrawlable. What's the best way to pull these pages from search engines and never have them crawled again?
UPDATE: Clarifying that what we're trying to do is get search engines to simply never try to reach these pages. Even having crawlers waste time getting a 404 is what we're trying to avoid. Is there any reason we shouldn't just block these in our robots.txt?
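For reference, the robots.txt block being considered could be as small as this sketch (the path is taken from the example URL in the question; adjust it to the real one):

```
User-agent: *
Disallow: /go/to/offsite_link.asp
```

As the question itself anticipates, Disallow only stops crawling: URLs that are already indexed can stay indexed, while continuing to serve a 404 (or 410) to crawlers is what eventually gets them dropped.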
Intermediate & Advanced SEO | russell_ms1
-
Realistic expectations to increase domain authority
A) What is a realistic timeline to increase a website's domain authority by 20 points? B) What are the most important factors to increase a website's domain authority?
Intermediate & Advanced SEO | WebMarkets0
-
Will Reducing the Number of Low Page Authority Pages Increase Domain Authority?
Our commercial real estate site (www.nyc-officespace-leader.com) contains about 800 URLs. Since 2012 the domain authority has dropped from 35 to about 20, and ranking and traffic have dropped significantly since then. Of the site's roughly 791 URLs, many are set to noindex, and a large percentage have a Moz page authority of only 1. It is puzzling that some pages with content similar to the PA 1 pages rank much better, in some cases with a page authority of 15. If we remove or consolidate the poorly ranked pages, will the overall page authority and ranking of the site improve? Would taking the following steps help?
1. Remove or consolidate poorly ranking unnecessary URLs?
2. Update content on poorly ranking URLs that are important?
3. Create internal text links (as opposed to links from menus) to critical pages?
A Moz crawl of our site's URLs is visible at the link below. I am wondering if the structure of the site is just not optimized for ranking and what can be done to improve it.
https://www.dropbox.com/s/oqchfqveelm1q11/CRAWL www.nyc-officespace-leader.com (1).csv?dl=0
Thanks,
Alan
Intermediate & Advanced SEO | Kingalan1
-
Joomla to WordPress site migration - thousands of 404s
I recently migrated a site from Joomla to WordPress. In advance, I exported the HTML pages from Joomla using Screaming Frog and set up 301 redirects for all those pages. However, Webmaster Tools is now telling me (a week after putting the redirects in place) that there are >7k 404s. Many of them aren't HTML pages, just index.php URLs, but I didn't think I would have to export these in my Screaming Frog crawl. We have since done a blanket 301 redirect for anything with index.php in it, but Webmaster Tools is still picking them up as 404s. So my question is: what should I have done with the Screaming Frog export to ensure I captured all pages to redirect, and what should I now do to fix the 404s that Webmaster Tools is picking up?
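For illustration only, the "blanket 301 redirect for anything with index.php in it" described above could be expressed as a hypothetical Apache .htaccess sketch along these lines (the target homepage URL is a placeholder):

```
RewriteEngine On
# Match any request whose path starts with /index.php, regardless of the
# old Joomla query string, and 301 it to the homepage; the trailing "?"
# drops the query string from the redirect target.
RewriteCond %{REQUEST_URI} ^/index\.php
RewriteRule .* https://www.example.com/? [R=301,L]
```

Even with such a rule in place, Webmaster Tools often keeps listing old 404s for a while, since its reports lag Google's recrawling of the affected URLs.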
Intermediate & Advanced SEO | Bua0
-
Redirecting thin content city pages to the state page, 404s or 301s?
I have a large number of thin content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages is accessed, but I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page. Something like:

    if (this city page should be removed) {
        header("HTTP/1.0 404 Not Found");
        header("Location: http://example.com/state-level-page");
        exit();
    }

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page? Also, these removed city-level pages collectively have little to no inbound links from other sites; I suspect that any inbound links to these removed pages are from low-quality scraper-type sites anyway. Thanks in advance!
Intermediate & Advanced SEO | rriot
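For comparison, a minimal sketch of the plain 301 alternative the question asks about (the URL is the question's own placeholder). Worth noting: a Location header is generally only acted on together with a 3xx status code, so pairing it with a 404 means most clients and crawlers will see the 404 and ignore the redirect.

```php
<?php
// Minimal sketch: permanently redirect a removed city-level page to its state-level page.
// The target URL is a placeholder taken from the question's example.
header("HTTP/1.1 301 Moved Permanently");
header("Location: http://example.com/state-level-page");
exit();
```
-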
Increasing Search Queries
On October 5th I saw a drop (about 50%) in the overall number of search queries my website was ranking for. I did not lose rankings for my target keywords. How can I regain these lost opportunities?
Intermediate & Advanced SEO | raph39880
-
Why is Google Webmaster Tools reporting a massive increase in 404s?
Several weeks back, we launched a new website, replacing a legacy system and moving it to a new server. With the site transition, we broke some of the old URLs, but it didn't seem to be too much of a concern. We blocked the ones I knew should be blocked in robots.txt, 301 redirected as much duplicate data as possible and used canonical tags as far as I could (which is still an ongoing process), and simply returned 404 for any others that should never really have been there. For the last few months, I've been monitoring the 404s Google reports in Webmaster Tools (WMT), and while we had a few hundred due to the gradual removal of duplicate data, I wasn't too concerned. I've been generating updated sitemaps for Google multiple times a week with only updated URLs. Then WMT started to report a massive increase in 404s, somewhere around 25,000 per day (making it impossible for me to keep up). The sitemap.xml has new URLs only, but it seems that Google still uses the old sitemap from before the launch. The reported sources of 404s (in WMT) don't exist any longer; they all come from the old site. I attached a screenshot showing the drastic increase in 404s. What could possibly cause this problem? wmt-massive-404s.png
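For reference, the canonical tags mentioned above are a single link element in the <head> of each duplicate page, pointing at the preferred URL (placeholder shown):

```html
<link rel="canonical" href="https://www.example.com/preferred-version-of-page/">
```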
Intermediate & Advanced SEO | sonetseo0
-
Old pages still crawled by search engines returning 404s. Better to 301 or block with robots.txt?
Hello guys, A client of ours has thousands of pages returning 404s, visible in Google Webmaster Tools. These are all old pages which don't exist anymore, but Google keeps detecting them. These pages belong to sections of the site which don't exist anymore; they are not linked externally and didn't provide much value even when they existed. What do you suggest we do: (a) do nothing, (b) redirect all these URLs/folders to the homepage through a 301, or (c) block these pages through robots.txt? Are we inappropriately using part of the crawl budget set by search engines by not doing anything? Thanks
Intermediate & Advanced SEO | H-FARM0