Why the sudden increase in soft 404s?
-
I haven't made any changes to my site, but within the space of a week Webmaster Tools has reported 30-40 soft 404s. This only started in the last two weeks. When I click through to the pages they load fine, and even Fetch and Render works fine on them.
-
They have stopped, with no changes to the site. I have no idea why. Thank you for the offer though.
-
Hi EcommerceSite!
Would love to help you figure this out - please PM the URL. Thanks!
-
It seems to have stopped with no changes to the site. I have no idea why.
-
Hi EcommerceSite,
I can't see an answer to your question so far; are you still having issues?
If you want to send me a PM with your URL, I'll have a look at this for you.
Tom
-
I can send it in a message.
-
Hi EcommerceSite!
It really sounds like folks will need to check out your site, or at least have a lot more information, in order to give much more advice. Is that something you can share?
-
Loading times have stayed really stable.
There are no 404 errors in either tool.
Using the Fetch as Googlebot tool, the pages all work fine.
It doesn't make any sense.
-
Hi,
This can occur if Google's crawlers are for some reason unable to reach some of your pages. It could be related to network issues or temporary server issues. A few things to look at:
- Do you see any increased loading times for your pages?
- When looking at the page with Firebug or Chrome's inspector tools, do you see any 404 errors returned?
- What result do you get when using the Fetch as Googlebot tool?
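A minimal sketch of these checks in Python, assuming the requests library; the URL and timeout are illustrative placeholders, and this only approximates how Googlebot actually fetches a page:

```python
import requests

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def check(url):
    """Fetch a page with a Googlebot user-agent and report status and latency."""
    resp = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)
    print(f"{resp.status_code}  {resp.elapsed.total_seconds():.2f}s  {url}")

check("https://example.com/some-page")  # hypothetical URL; use your own pages
```

Note that a single successful fetch doesn't rule out intermittent server trouble; running a check like this on a schedule is more likely to catch the failures Googlebot may be hitting.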
Hope this helps
Best regards, Anders
-
It'd be great if you could share your site so people can check it out and see if they can figure out what's going on. Thanks!
Related Questions
-
Old URLs that have 301s to 404s not being de-indexed.
We have a scenario on a domain that recently moved to enforcing SSL. If a page is requested over non-SSL (http), the server automatically redirects to the SSL (https) URL using a good old-fashioned 301. This is great except for any page that no longer exists, in which case you get a 301 going to a 404. Here's what I mean:
Case 1 - Good page: http://domain.com/goodpage -> 301 -> https://domain.com/goodpage -> 200
Case 2 - Bad page that no longer exists: http://domain.com/badpage -> 301 -> https://domain.com/badpage -> 404
Google is correctly re-indexing all the "good" pages and displaying search results that go directly to the https version. Google is stubbornly hanging on to all the "bad" pages and serving up the original URL (http://domain.com/badpage) unless we submit a removal request. But there are hundreds of these pages and this is starting to suck. Note: the load balancer does the SSL enforcement, not the CMS, so we can't detect a 404 and serve it up first. The CMS does the 404'ing. Any ideas on the best way to approach this problem? Or any idea why Google is holding on to all the old "bad" pages that no longer exist, given that we've clearly indicated with 301s that no one is home at the old address?
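A minimal Python sketch for verifying these chains, using the hypothetical domain.com URLs from the question (requests follows redirects by default and records each hop in resp.history):

```python
import requests

def show_chain(url):
    """Print every hop in the redirect chain, then the final status."""
    resp = requests.get(url, timeout=10)  # redirects are followed by default
    for hop in resp.history:
        print(f"{hop.status_code}  {hop.url}")
    print(f"{resp.status_code}  {resp.url}")

show_chain("http://domain.com/goodpage")  # expect: 301 -> 200
show_chain("http://domain.com/badpage")   # expect: 301 -> 404
```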
Intermediate & Advanced SEO | boxclever -
Sessions increased but pageviews decreased
I have compared my traffic from Jan-15 to Dec-15 and found that traffic increased but pageviews decreased on a few pages. Is this an issue?
Intermediate & Advanced SEO | vivekrathore -
Lots of duplicate content and still traffic is increasing... how does it work?
Hello Mozzers, I have a dilemma with a client's site I am working on that is making me question my SEO knowledge, or the way Google treats duplicate content. I'll explain now. The situation is the following: organic traffic has been increasing constantly since last September, in every section of the site (home page, categories, and product pages), even though:
- they have tons of duplicate content from the same content appearing on old and new URLs (which are in two different languages, even if the actual content on the page is in the same language in both URL versions)
- indexation is left completely to Google's discretion (no robots file, no sitemap, no meta robots in the code, no use of canonical, no redirects applied to any of the old URLs, etc.)
- a lot (really, a lot) of URLs with query parameters (which leads to more duplicated content) are linked from the inner pages of the site (and indexed in some cases)
- they have Analytics but don't use Webmaster Tools
Now... they expect me to help them increase their traffic even more. I'll go first for "regular" on-page optimization, as their titles, meta descriptions, and headers are not optimized at all for the page content, but after that I was thinking of fixing the issues with indexation and content duplication. However, I am worried I could "break the toy", as things are going well for them. Should I be confident that fixing these issues will bring even better results, or do you think it's better for me to focus on other kinds of improvements? Thanks for your help!
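To get a quick read on the indexation signals this question says are missing, a rough Python sketch (the regexes are simplistic, the URL is a placeholder, and a real HTML parser would be more robust):

```python
import re
import requests

def audit(url):
    """Report whether a page declares a canonical URL or a meta robots directive."""
    html = requests.get(url, timeout=10).text
    canonical = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*>', html, re.I)
    robots = re.search(r'<meta[^>]*name=["\']robots["\'][^>]*>', html, re.I)
    print("canonical:  ", canonical.group(0) if canonical else "MISSING")
    print("meta robots:", robots.group(0) if robots else "MISSING")

audit("https://example.com/category/product")  # hypothetical URL
```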
Intermediate & Advanced SEO | Guybrush_Threepw00d -
How to fix Invalid Product Page registering as Soft 404
Somehow with our site architecture, Google is crawling URLs for products we no longer carry (there are no links to those pages, so I am still trying to figure out how Google is finding them). Those URLs are being redirected to our invalid-product page. That invalid-product page is returning a 200 OK code, but according to Google it should be a 404, so we get a soft 404 error. Google is seeing all of the URLs that redirect to that page as soft 404s as well. The first solution I can think of is to create a custom 404 page that looks just like our site, says we don't have the page/product they are looking for, has a search bar, sends a 404 code, etc. Is this the right way to go? And since it will probably take some time to implement, is there a quick fix we could do first?
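The custom 404 page idea hinges on one detail: the friendly page must be served with an actual 404 status, not a 200. A minimal sketch of that behavior, assuming a Flask app (the framework and markup are placeholders for whatever the CMS actually uses):

```python
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def product_not_found(e):
    # A branded "we don't carry that product" page that still returns 404,
    # so crawlers stop reporting it as a soft 404.
    html = """
    <h1>Sorry, we no longer carry that product</h1>
    <form action="/search"><input name="q" placeholder="Search our catalog"></form>
    """
    return html, 404  # the status code is what matters to Googlebot
```

Fetching one of the redirected URLs with a status-checking script afterwards will confirm whether the page has actually started returning 404.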
Intermediate & Advanced SEO | ntsupply -
Why the sudden link drop?
At the end of November I am showing that our total links were 118k. Current links are 22k. We changed sites in early November, so that was about three weeks before the drop. What would cause a drop of about 100k links? Or where should I start investigating?
Intermediate & Advanced SEO | EcommerceSite -
Huge spike in 404s and 500 errors
I'm curious what might cause an inordinate number of 404s in the reporting from SEOMoz's dashboard. I'm exploring links that are marked as 404s, and they are (for the most part) working. I talked with the sysadmin and there were no outages this weekend. We also had a number of 500 errors reported in Webmaster Tools, but everything seems to be up. Any ideas?
Intermediate & Advanced SEO | SystemIDBarcodes -
Will Linking To "Official Sites" Increase My SEO?
I own a movie trailer website (where you can watch movie trailers). Will having a link on each page to the "official website" of each movie increase my SEO?
Intermediate & Advanced SEO | rhysmaster -
Does Google penalize for having a bunch of Error 404s?
If a site removes thousands of pages in one day, without any redirects, is there reason to think Google will penalize the site for this? I have thousands of subcategory index pages. I've figured out a way to reduce the number, but it won't be easy to put in redirects for the ones I'm deleting. They will just disappear. There's no link juice issue. These pages are only linked internally, and indexed in Google. Nobody else links to them. Does anyone think it would be better to remove the pages gradually over time instead of all at once? Thanks!
Intermediate & Advanced SEO | Interesting.com