Can soft 404s hurt my rankings?
-
This post mainly pertains to soft 404s, but I recently dropped a few ranks for my main keyword, a position I had held at or near its best for well over 2 years. I participate in NO blackhat and obtain my links naturally. I want to describe a few issues that happened before my rankings dropped and see what you guys think.
About a week before my ranks dropped, I started receiving DNS errors in GWT. It was weird, because when I used Fetch as Google on those pages they returned just fine, so I was not sure what was happening there.
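A quick way to rule out basic resolution problems is to look up the hostname yourself. Here is a minimal Python sketch of that kind of check; the domain is a placeholder, and it only tests DNS from your own machine, not from Googlebot's side.

```python
# Minimal sketch: resolve the site's hostname to confirm DNS is answering.
# "www.example.com" is a placeholder for the real domain.
import socket

hostname = "www.example.com"

try:
    infos = socket.getaddrinfo(hostname, 80)
    addresses = sorted({info[4][0] for info in infos})
    print("%s resolves to: %s" % (hostname, ", ".join(addresses)))
except socket.gaierror as exc:
    print("DNS lookup failed for %s: %s" % (hostname, exc))
```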
I use Google's PageSpeed Service, which did in fact decrease my load time (YEAH!!), so that was cool. About 1 week before my rankings dropped I also enabled GoDaddy's Website Accelerator, thinking it could help even more. I wondered whether that had something to do with my DNS issues with Google, so I decided to turn off GoDaddy's accelerator and just leave the Google PageSpeed Service on. I figure I don't need two of them anyway, IMO.
Also, at the same time, I started receiving a ton (31,000+) of HTML errors for duplicate meta descriptions and titles. I discovered an error in my code that was outputting two different sets of descriptions and titles for each of these pages. I have since fixed the issue and am waiting for Google to re-index those pages.
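To catch this kind of thing before GWT does, a small crawl of your own URLs can flag duplicate titles and descriptions. Here is a minimal Python sketch, assuming the requests and BeautifulSoup libraries are installed; the URL list is a placeholder.

```python
# Minimal sketch: fetch a handful of pages and flag duplicate <title> tags
# and meta descriptions. The URLs below are placeholders.
import requests
from bs4 import BeautifulSoup
from collections import defaultdict

urls = [
    "http://www.example.com/page-1",
    "http://www.example.com/page-2",
]

titles = defaultdict(list)
descriptions = defaultdict(list)

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    desc = meta.get("content", "").strip() if meta else ""
    titles[title].append(url)
    descriptions[desc].append(url)

for title, pages in titles.items():
    if len(pages) > 1:
        print("Duplicate title %r on %d pages" % (title, len(pages)))
for desc, pages in descriptions.items():
    if len(pages) > 1:
        print("Duplicate description %r on %d pages" % (desc, len(pages)))
```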
Here is where I think the ranking drop may have come from. Some months ago (maybe 2) I decided to redirect my 404s to my homepage. Yes, I know now that this is not good, and I have since created a proper 404 page that returns the 404 status code. I recently started getting a ton of soft 404 errors in GWT, which is what brought my attention to this issue.
My question is: could redirecting users to my homepage in place of a 404, which obviously returned a 200 for pages that did not exist, be the culprit behind my ranks dropping?
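To confirm the fix is actually live, you can request a made-up URL and check the raw status code it returns. Here is a minimal Python sketch, assuming the requests library is installed; the domain and path are placeholders.

```python
# Minimal sketch: request a URL that should not exist and report whether the
# server returns a hard 404, a redirect, or a soft 404 (a 200 response).
import requests

test_url = "http://www.example.com/this-page-should-not-exist-xyz"

# allow_redirects=False so a 301/302 to the homepage is visible as-is
response = requests.get(test_url, allow_redirects=False, timeout=10)

print("Status code:", response.status_code)
if response.status_code == 404:
    print("Good: missing pages return a hard 404.")
elif response.status_code in (301, 302):
    print("Still redirecting missing pages (to %s) - a likely soft 404."
          % response.headers.get("Location"))
elif response.status_code == 200:
    print("Returning 200 for a non-existent page - a classic soft 404.")
```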
-
Thanks, very helpful article, and BTW, AWESOME AVATAR.
-
I forgot to add this reference about the 404s:
http://googlewebmastercentral.blogspot.com/2008/08/farewell-to-soft-404s.html
-
Quick response: yes, a large proportion of soft 404s is not going to make Google happy. They can be confusing to users, and Google knows that, so it doesn't like them on pages high in its results.
The real question is not whether soft 404s can hurt; it is whether the drop came from the soft 404s or from the duplicate content issue on a grand scale. I would be most worried about the duplicate titles at that scale.
In my opinion, yes, soft 404s can hurt you, but probably not as much as 31k duplicate titles.
It sounds like you have addressed both issues; nothing to do now but Fetch as Google, submit with links, and wait for the re-indexing to happen.
Good luck