GWT Soft 404 count is climbing. Important to fix?
-
In GWT I am seeing my mobile site's soft 404 count slowly rise from 5 two weeks ago to over 100 as of today. If I do nothing I expect it will continue to rise into the thousands. This is due to there being followed links on external sites to thousands of discontinued products we used to offer. The landing page for these links simply says the product is no longer available and gives links to related areas of our site.
I know I can address this by returning a 404 for these pages, but doing so will cause these pages to be de-indexed. Since these pages still have utility in redirecting people to related, available products, I want these pages to stay in the index and so I don't want to return a 404.
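For reference, the 404 option I'm describing is really just a status-code change: the helpful "product discontinued" page with related links stays exactly the same, only the HTTP status differs. A rough framework-agnostic Python sketch (product IDs and wording are hypothetical, not from my actual site):

```python
# Hypothetical set of discontinued product SKUs.
DISCONTINUED = {"sku-123", "sku-456"}

def product_response(sku: str) -> tuple[int, str]:
    """Return (HTTP status, page body) for a product URL.

    The helpful body is served either way; returning 404 instead of 200
    is what stops Google reporting the page as a soft 404 -- at the cost
    of the page eventually dropping out of the index.
    """
    if sku in DISCONTINUED:
        body = ("This product is no longer available. "
                "Here are some related products you might like.")
        return 404, body
    return 200, "Product page for " + sku
```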
Another way of addressing this is to add more useful content to these pages so that Google no longer classifies them as soft 404. I have images and written content for these pages that I'm not showing right now, but I could show if necessary.
But before investing any time in addressing these soft 404s, does anyone know the real consequences of not addressing them? Right now I'm getting 275k pages indexed and historically crawl budget has not been an issue on my site, nor have I seen any anomalous crawl activity since the climb in soft 404s began. Unchecked, the soft 404s could climb to 20,000ish. I'm wondering if I should start expecting effects on the crawl, and also if domain authority takes a hit when there are that many soft 404s being reported.
Any information is appreciated.
-
Thank you for your responses!
-
I think I have to agree with Patrick. Whenever you have soft 404s on your website the best way to deal with them is to think about users and then about search engines.
Again, as Patrick said, if the products are gone forever, redirect the pages to the most relevant products and pass the link juice along so that the right replacement products start to rank. If the products are only temporarily unavailable, my advice is to keep showing the product details and images with an "out of stock" tag, and perhaps a popup that guides users to other parts of the website.
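The "gone forever" case above is essentially a lookup table from retired URLs to their closest replacements, served as 301s. A minimal Python sketch (the paths are made up for illustration):

```python
# Hypothetical map of permanently retired product URLs to their
# closest live replacement (a similar product, or the parent category).
REPLACEMENTS = {
    "/products/old-widget": "/products/new-widget",   # closest current product
    "/products/retired-line": "/category/widgets",    # fall back to the category
}

def handle_request(path: str) -> tuple[int, str]:
    """301 permanently-gone product URLs; serve everything else normally.

    A 301 passes most of the inbound link equity to the replacement page,
    which is the point of redirecting rather than returning a 404.
    """
    target = REPLACEMENTS.get(path)
    if target is not None:
        return 301, target
    return 200, path
```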
Whatever you choose, think about your users first; if they are comfortable, go with it, and I'm sure search engines will follow accordingly, because at the end of the day what search engines want is a happy user.
Will soft 404s hurt? In my opinion a few soft 404s on a website shouldn't hurt, but as you said they might grow into the thousands; at that scale it can become a problem and affect your SERP rankings.
Hope this helps!
-
Hi,
Agree with Patrick. Some additional info & resources:
It's better to avoid soft 404s - in fact Google prefers "real" 404 pages (which are an acceptable strategy when dealing with a large number of out-of-stock products).
You could check this article with the different options according to Matt Cutts (http://searchenginewatch.com/sew/news/2334932/ecommerce-seo-tips-for-unavailable-products-from-googles-matt-cutts):
- Small site: show alternative products instead of a 404
- Medium site: a 404 is ok - best to have a custom version
- Large site: consider using the 'unavailable_after' tag
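For the large-site option, the `unavailable_after` directive can be delivered as a robots meta tag or as an `X-Robots-Tag` response header. A small Python sketch of building the header (the function name is mine; the date format follows Google's documented examples):

```python
from datetime import datetime

def unavailable_after_header(remove_on: datetime) -> tuple[str, str]:
    """Build an X-Robots-Tag header asking Google to drop the page
    from results after the given date -- useful for products that are
    known in advance to be going away."""
    value = "unavailable_after: " + remove_on.strftime("%d %b %Y %H:%M:%S GMT")
    return ("X-Robots-Tag", value)
```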
Are soft 404s going to hurt your site? Check this answer from Cyrus Shepard on a similar question (http://moz.com/community/q/soft-404s-for-unpublished-301-d-content#reply_291233):
"How much will it hurt you? Probably not much, but it's hard to say.
Let's ask these questions:
- How much traffic goes to these pages? If not much, is it okay to 404 them?
- Are there more relevant pages you could redirect these to? (ideally, something with a similar title as the original page?)
- Have you seen much traffic loss overall? If not, it's likely this isn't hurting you."
Hope this helps
Dirk
-
Hi there
When you have these issues, you have options. For instance, if the product is never coming back, you could redirect the page to a relevant product or category, remove the page from your sitemap, and change internal links, or simply create a custom 404 page.
But if the product is temporarily unavailable and coming back, you could create an out-of-stock message to let users know the product isn't available at the moment (and will be back), and that there's a whole site to explore with other products to check out. Either way, keep the user in mind when handling errors on your site.
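The two cases above boil down to a per-product decision. A tiny sketch of that decision logic, with hypothetical status values standing in for however your catalog actually flags products:

```python
def page_strategy(product_status: str) -> str:
    """Pick a handling strategy per product (status values are illustrative)."""
    if product_status == "gone":           # permanently discontinued
        return "301 redirect to relevant product or category"
    if product_status == "out_of_stock":   # temporarily unavailable
        return "200 with out-of-stock notice"
    return "200 normal page"
```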
You can read more about Domain Authority here - although it's not directly stated, I'm sure having errors across your site (especially on almost 10% of it) could hurt it - though you're not at that point right now.
I would start there - what products are legitimately gone? What ones are just out of stock? Can you redirect these to their categories? Are users landing on these pages a lot?
I would also take a look at your backlink profile in Majestic and see which links are worth correcting or removing.
It's up to you to prioritize these issues and remedy them accordingly. Good luck!