Database Crash affecting our rankings
-
Here is one for you. We had a database crash and were down for an hour and 15 minutes on Dec 18th. During this time our system automatically sent out a 301 redirect to our product home page. (iboats.com/boat-parts-accessories/dm/)
On Dec 24th I noticed a huge ranking drop on a lot of our keywords, and some keywords were serving up the redirect page from our database crash in place of their normally ranked pages.
So my assumption is that Google was crawling our site during the database crash and cached the redirect pages in place of our regularly ranked pages. So when a search query is made that would normally display our ranked page, it now displays the redirect page from the crawled cache.
And for other keywords we normally rank for, Google no longer has that page cached, so it drops out of the rankings.
My question: Does this assumption sound accurate? If so, I'm assuming that over time our previously ranked pages will show up once again after Google recrawls them and saves them in its cache.
I have several sites that were affected by this:
Thanks in advance for your input
-
Glad to hear you appear to be back on the right track
-
Update:
After unblocking Google, almost all of our rankings are back up, and in some cases a little improved. I was surprised how quickly Google recrawled and ranked the pages that dropped.
I did do a Fetch and Submit on all my websites and pages. I'm not sure how Fetch requests are handled by Google, but it appears they are acted on.
I have an IT guy who doesn't feel a Fetch makes any difference. He says Google will crawl when they want to and probably does not pay attention to the Fetch request.
-
I wanted to get back to report the cause of our problem with so many of our rankings dropping. We originally thought it was due to a database crash, and that could still be affecting some things.
We moved our servers to a new location, and a couple of weeks ago we switched over to some new IP addresses. When we did, a program started blocking Google's IP addresses because they were hitting our servers heavily, and it put those addresses on a block list. So pages on some of our websites stopped getting crawled, and rankings dropped. We discovered the problem while exploring our Webmaster Tools Crawl Stats: the "kilobytes downloaded per day" figure had flat-lined. Doing a Fetch on selected pages confirmed that Google could not fetch them. After unblocking those IP addresses, the pages got crawled, our crawl stats went back up, and our rankings started returning to where they were. We still have some rankings that have not bounced back, but we assume they will as new crawls re-index those pages.
A lesson learned for me was to pay attention to the crawl stats. Any drastic variations could be a sign of a problem.
Brad
iboats.com -
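For what it's worth, the blocking problem Brad describes is avoidable: Google publishes a reverse-then-forward DNS procedure for verifying that a crawler claiming to be Googlebot really is Google, so a rate-limiter can skip verified crawlers instead of banning their IPs. A minimal Python sketch of that check (the function name and the idea of wiring it into a blocker are mine, not from this thread):

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Return True only if `ip` reverse-resolves to a googlebot.com or
    google.com hostname AND that hostname forward-resolves back to `ip`.
    This is Google's documented two-step crawler verification."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)        # reverse (PTR) lookup
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return socket.gethostbyname(host) == ip      # forward lookup must match
    except OSError:
        return False
```

A rate-limiter that calls this before blocking an IP would have left Googlebot alone while still banning abusive clients. The user-agent header alone is not enough, since anyone can spoof it; the DNS round trip is what makes the check trustworthy.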
I'm pretty sure that GWT is a little vague about the times Googlebot visited: it just shows the crawl rate on a day-by-day basis, with some lag between now and when it happened. cPanel will show when spiders visited under Latest Visitors, from memory. However, rather than looking back, I think it would be better to simply ensure this doesn't happen again by using a 302 rather than the 301. Refreshing your sitemap submissions in GWT might also help get some crawling going, with a ping or two to get Googlebot back soon.
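Since GWT's crawl stats are lagged and day-granular, your own access logs are the more precise record of when Googlebot visited. A quick sketch that buckets Googlebot hits per day (it assumes the common Apache/nginx combined log format, and matches on the user-agent string only, which can be spoofed, so treat it as a signal rather than proof):

```python
import re
from collections import Counter

# Matches the date portion of a combined-format timestamp, e.g. [18/Dec/2012
DATE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(log_lines):
    """Count access-log lines whose user agent mentions Googlebot,
    bucketed by day. A sudden flat-line to zero, like the crawl-stats
    drop described earlier in this thread, points at a block or outage."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" in line and (m := DATE.search(line)):
            hits[m.group(1)] += 1
    return hits
```

Running this over a few weeks of logs and eyeballing the daily counts gives you the same early-warning signal as the "kilobytes downloaded per day" chart, but without the reporting delay.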
-
Also, can't we use GWT to find out the exact times or at least approximate times of when Google crawled a site? Or can we not find that information so precisely? In this case, it does seem that the 301 vs. 302 is the key problem, but also the fact that the system for getting rid of the 301 didn't work out properly.
-
It seems extreme that Google would react within an hour and 15 minutes, but who knows nowadays. Keep an eye on your GWT account; if you have a decent crawl rate, things should remedy themselves. Stephen Salstrand made a good point: 302s rather than 301s should help. It's great to have something like a redirect in place for user experience in case of database failure; however, as Stephen said, it should be made clear to Google that it's only temporary.
-
Investigating it further, we have some permanent redirects that are still hanging around. I've got my IT guys working on it as we speak.
I've got to run, but I would like to revisit this soon. Once we have determined the problem, I will post it here.
Thanks for your opinions. They have helped.
-
Your assumptions are broadly right. However, I have never seen Google react that quickly to a 301 redirect or even a crash. I always tell my customers: if you have a temporary issue with 404 and 500 errors, don't worry about it too much; Google won't react unless it is a persistent issue that lasts over three or four crawls.
This is why I would also investigate a penalty that just happened to take effect shortly after the crash. To rule that out, could you please tell us if some of your major keywords are still ranking well?
I don't think you should worry too much about it; just wait a few days and see if the pages recover.
-
Good point on the 301's. I'll look into that.
Thanks for your input!
-
I think your assumptions are on track. However, make sure the 301 has been removed. Also, if you still want that auto-redirect to happen on a crash, make it a 302 and not a 301, as a 301 will suck a lot of life out of your site (since search engines see it as permanent).