Soft 404s for unpublished & 301'd content
-
Hi,
One site I work with unpublished a lot of thin content. Great idea, right?
These unpublished pages were then 301'd up to the main category page that they previously existed in.
Now Google Webmaster Tools calls them out as soft 404 errors. This seems unexpected since the pages
were 301'd. Here is my question: is this a serious problem that may affect the site's overall organic results,
and if so, what should I do about it?
Thanks... Darcy
-
Short answer: create a custom 404 page, not just for these pages, but one that can show for every page on your site.
A few resources:
https://support.google.com/webmasters/answer/93641?hl=en
Example: http://moz.com/sadfklfadsadfjs
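One gotcha with custom 404 pages: the helpful page has to come back with a real 404 status code. If the server sends it with a 200, Google flags it as a soft 404 no matter how nice the page looks. Here's a minimal self-contained sketch in Python's standard library (the paths and copy are just placeholders) showing a server that serves a friendly "not found" page while still returning a true 404:

```python
import threading
import urllib.request
import urllib.error
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

HELPFUL_404 = b"<h1>Page not found</h1><p>Try the <a href='/'>home page</a> or search.</p>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            body, status = b"<h1>Home</h1>", 200
        else:
            # Helpful content for the visitor, but still a *real* 404 status.
            body, status = HELPFUL_404, 404
        self.send_response(status)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence request logging for the demo
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def status_of(url):
    """HTTP status for a URL; urllib raises on 4xx/5xx, so unwrap the code."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

home_status = status_of(f"http://127.0.0.1:{port}/")
missing_status = status_of(f"http://127.0.0.1:{port}/some-unpublished-thread")
print(home_status, missing_status)  # 200 404
server.shutdown()
```

The same check works against a live site: fetch any nonsense URL and confirm the status is 404, not 200.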
-
Cyrus, thanks for hanging in there with my questions. If I just give back a 404, what am I showing them on the page?
I would think seeing the main questions page would be better than just sitting at the original URL looking at a 404 notice - seems like a bad user experience, if Google wants to get all user-experiency about it.
Thanks... Darcy
-
Yes, it's possible, but that could be considered cloaking. I'd say best to return a 404.
-
Hi Cyrus,
Have not experienced a dip, but things have been a little static.
Can you do both... forward the page and give back a 404?
What would you do?
Thanks... Darcy
-
Yes, I would think that at the point Google crawls it and finds it forwarded, it would drop it from the index and not waste resources crawling it again unless it's linked somewhere. I will keep an eye out for links, but I don't believe there are any.
Thanks, Dirk... Darcy
-
In that case, sounds like you should either:
- 404 them if you have evidence these have hurt your rankings/traffic (have you experienced a dip?)
- Ignore them and go about your day
-
Hi Cyrus,
Thanks for the info. These are forum pages where no one ever answered the question, so
there is no helpful info and very little content.
They were forwarded up to the main questions page (one / up the url structure).
The page they were forwarded to is like a questions category page, not specific to the subject of the
forwarded page. These forwarded pages don't get much/any traffic because they never ranked
and we didn't promote them.
If it doesn't hurt overall search on other pages, I'd rather not go to the substantial effort of finding subject-relevant pages to forward to, since no one will ever go to the original url and need to see something super relevant.
Your thoughts? Thanks! Best... Darcy
-
If Fetch as Google is also giving a 301, I would mark them as solved in WMT and check if they re-appear.
If you click on the i next to the redirect message in Fetch as Google, it shows the type of redirect and the page it's redirecting to. I assume you checked that this is also a 301.
I have a similar issue on one of my sites: if a user gets to a non-existing URL, the server first tries to find out if the page exists; if it doesn't, the user is redirected to a 404 page. Although technically it is a 301, WMT sees these as soft 404s, because the destination page is a "Page not found" type of page (called 404.php) which (quite ironically) returns a 200 status.
On the destination page, do you mention a message like "page not found" somewhere, or is it just a plain category page?
The SEO impact is difficult to assess. Google says these pages mainly waste the bot's time, because it keeps crawling pages that no longer exist; I'm not sure whether it also affects rankings. As you did the crawl with Screaming Frog, I guess you are also removing all internal links to these redirected pages? Once those links disappear - and since the content was thin, I suspect you don't have many external links pointing to them - the problem should clear up after a while.
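To make that pattern concrete, here is a self-contained Python sketch of exactly this setup - a 301 that lands on a "404.php"-style page answering 200 - plus a small checker that walks the hops the way a crawler would (all paths and names are placeholders):

```python
import threading
import http.client
from urllib.parse import urlsplit
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class Handler(BaseHTTPRequestHandler):
    """Mimic the setup above: any unknown URL 301s to a 'not found' page that answers 200."""
    def do_GET(self):
        if self.path == "/404.php":
            body = b"<h1>Page not found</h1>"
            self.send_response(200)  # the (quite ironic) 200
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(301)  # technically a clean 301...
            self.send_header("Location", "/404.php")  # ...that lands on a 404-style page
            self.send_header("Content-Length", "0")
            self.end_headers()

    def log_message(self, *args):
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def hops(url, limit=5):
    """Follow redirects one hop at a time; return [(status, path), ...] as a crawler sees it."""
    chain = []
    for _ in range(limit):
        parts = urlsplit(url)
        conn = http.client.HTTPConnection(parts.hostname, parts.port)
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        chain.append((resp.status, parts.path or "/"))
        location = resp.getheader("Location")
        conn.close()
        if resp.status in (301, 302, 307, 308) and location:
            url = f"http://{parts.hostname}:{parts.port}{location}"
        else:
            break
    return chain

chain = hops(f"http://127.0.0.1:{port}/some-deleted-thread")
print(chain)  # [(301, '/some-deleted-thread'), (200, '/404.php')]
server.shutdown()
```

If the last hop in the chain is a 200 on a "page not found" page, that is the soft 404 that WMT is complaining about, even though every individual redirect is a clean 301.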
rgds,
Dirk
-
If Google thinks the 301 leads to a page that isn't relevant enough, they may flag it as a "soft 404" even though it returns a 301. That's Google's way of saying they think you should 404 these pages instead.
How much will it hurt you? Probably not much, but it's hard to say.
Let's ask these questions:
- How much traffic goes to these pages? If not much, is it okay to 404 them?
- Are there more relevant pages you could redirect these to? (Ideally, something with a similar title to the original page.)
- Have you seen much traffic loss overall? If not, it's likely this isn't hurting you.
Hope this helps! Best of luck with your SEO.
-
Okay, that is extra weird. It could be that GWT hasn't updated your information since you made the changes. Since everywhere else -- especially the fetch tool -- is telling you it's correct, I'd wait a few more days and see if it updates.
-
Hi Erica,
I'm saying that the only place it shows a soft 404 is in GWT's errors. Screaming Frog, web-sniffer, and now Fetch as Google in GWT all show them as 301 redirects. I can't redirect them more than they are. So, is GWT just goofy?
Thanks... Darcy
-
Hi Darcy,
Yeah, if it's still showing as a soft 404, there's still something wrong. I'd try using fetch and render as Google bot and see what happens.
Best of luck!
-
Hi Dirk,
Thanks for the suggestion. As noted above, I put the whole list thru Screaming Frog and a few thru your suggestion of web-sniffer.net.
95% of the whole list shows as 301s, and 100% of the few I put thru web-sniffer one at a time come back as 301s.
My question remains "Is this a serious problem that may affect the site's overall organic results
and if so what should I do about it?"
Thanks... Darcy
-
Hi Erica,
I put the list through screaming frog and 95% of the urls are shown as 301s.
Do you think screaming frog has it right or is there something they wouldn't catch?
Thanks... Darcy
-
Maybe an obvious question, but did you check that the URLs are indeed properly redirected, by checking them with 'Fetch as Google' in WMT or with a tool like web-sniffer.net?
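If you prefer scripting it over a web tool, the same first-hop check (status code plus Location header, without following the redirect) can be sketched in Python - the little stand-in server and its paths are only there so the example runs on its own:

```python
import threading
import http.client
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Stand-in for the real site: one URL properly 301'd, one plain 404.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-question":
            self.send_response(301)
            self.send_header("Location", "/questions")
        else:
            self.send_response(404)
        self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, *args):
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def first_hop(host, port, path):
    """Status and Location of the *first* response only - no redirect following."""
    conn = http.client.HTTPConnection(host, port)
    conn.request("GET", path)
    resp = conn.getresponse()
    result = (resp.status, resp.getheader("Location"))
    conn.close()
    return result

results = {path: first_hop("127.0.0.1", port, path)
           for path in ["/old-question", "/broken-question"]}
print(results)  # {'/old-question': (301, '/questions'), '/broken-question': (404, None)}
server.shutdown()
```

Feed it your full list of unpublished URLs and you get the same status/destination report that Screaming Frog or web-sniffer would give you.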
rgds,
Dirk
-
I'd check to make sure your 301s were done correctly. If they are showing up as soft 404s, they were probably implemented incorrectly.