Googlebot Crawl Rate causing site slowdown
-
I am hearing from my IT department that Googlebot is causing a massive slowdown, and at times crashing, our site. We get 3.5 to 4 million pageviews a month and add 70-100 new articles to the website each day. We provide daily stock research and market analysis, so it's all high-quality, relevant content. Here are the crawl stats from WMT:
I have not worked with many high-volume, high-traffic sites before, but these crawl stats do not seem out of line. My team is getting pressure from the sysadmins to slow down the crawl rate, or to block some or all of the site from Googlebot.
Do these crawl stats seem in line with similar sites? Would slowing down the crawl rate have a big effect on rankings?
Thanks
-
Similar to Michael, my IT team is saying Googlebot is causing performance issues - specifically during peak hours.
It was suggested that we consider using Apache rewrite rules to serve Googlebot a 503 during our peak hours to limit the impact. I found the Stack Overflow thread (link below) in which John Mueller seems to suggest this approach, but has anyone tried it?
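For reference, here is a rough sketch of what that rewrite approach might look like. This is untested and hypothetical: the hour range, and whether 08:00-18:00 is actually your peak window, are assumptions you'd need to adjust. It requires mod_rewrite, and TIME_HOUR uses the server's local, zero-padded hour.

```apache
# Hypothetical sketch: serve Googlebot a 503 during peak hours (08:00-17:59)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteCond %{TIME_HOUR} >08
RewriteCond %{TIME_HOUR} <18
# "-" leaves the URL unchanged; R=503 returns the status and stops processing
RewriteRule ^ - [R=503,L]
```

If you go this route, ideally the 503 response would also carry a Retry-After header so crawlers know when to come back.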
-
Blocking Googlebot is a quick and easy way to disappear from the index. Not an option if you want Google to rank your site.
For smaller sites or ones with limited technologies, I sometimes recommend using a crawl-delay directive in robots.txt
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=48620
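A minimal robots.txt sketch of that directive is below. One caveat worth hedging: Googlebot itself ignores Crawl-delay (its rate is set in Webmaster Tools instead), though other major crawlers like Bing and Yandex honor it. The 10-second value is just an illustrative assumption.

```
# Hypothetical robots.txt sketch: ask crawlers to wait between requests
User-agent: *
Crawl-delay: 10
```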
But I agree with both Shane and Zachary: this doesn't seem like the long-term answer to your problems. Your crawl stats don't seem out of line for a site of your size, and perhaps a better hardware configuration could help things out.
With 70 new articles each day, I'd want Google crawling my site as much as they pleased.
-
Whatever Google's default is in GWT - it sets it for you.
You can change it, but it is not recommended unless you have a specific reason (such as Michael Lewis's scenario), though I am not completely sold that Googlebot is what is causing the "dealbreaking" overhead.
-
What is the ideal setting for the crawler? I have been wondering about this for some time.
-
Hi,
Your admins saying that is like someone saying "we need to shut the site down, we are getting too much traffic!" A common sysadmin response (fix it somewhere else).
4GB a day downloaded is a lot of bot traffic, but you appear to be a "real time" site, one that is probably helped by, and maybe even reliant on, your high crawl rate.
I would upgrade hardware, or even look into some kind of off-site cloud redundancy for failover (hybrid).
I highly doubt that 4GB a day is a "dealbreaker", but of course that is just based off the one image, and your admins probably have resource monitors. Maybe Varnish is an answer for static content to help lighten the load? Or a CDN for file hosting to lighten the bandwidth load?
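To make the Varnish suggestion concrete, a sketch along these lines could cache static assets so bot traffic hits the cache instead of the backend. This assumes Varnish 4+ VCL syntax and an illustrative one-hour TTL; treat it as a starting point, not a drop-in config.

```
# Hypothetical Varnish VCL sketch: cache static assets aggressively
sub vcl_recv {
    if (req.url ~ "\.(css|js|png|jpe?g|gif|ico|woff2?)$") {
        unset req.http.Cookie;  # cookies would prevent caching
        return (hash);
    }
}
sub vcl_backend_response {
    if (bereq.url ~ "\.(css|js|png|jpe?g|gif|ico|woff2?)$") {
        set beresp.ttl = 1h;    # serve from cache for an hour
    }
}
```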
Shane
-
We are hosting the site on our own hardware at a big colo. I know that we are upgrading servers but they will not be online until the end of July.
Thanks!
-
I wouldn't slow the crawl rate. A high crawl rate is good so that Google can keep their index of your website current.
The better solution is to reconsider your hardware and networking setup. Do you know how you are being hosted? From my own experience with a website of that size, a load balancer on two decent dedicated servers should handle the load without problems. Google crawling your pages shouldn't create noticeable overhead on the right setup.
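As a rough illustration of that setup, an nginx-style reverse proxy in front of two dedicated servers might look like the sketch below. The upstream addresses are placeholders, and nginx is just one option for the balancer; this is an assumption about the stack, not a description of the poster's actual configuration.

```
# Hypothetical sketch: round-robin load balancing across two backend servers
upstream backend {
    server 10.0.0.1;
    server 10.0.0.2;
}
server {
    listen 80;
    location / {
        proxy_pass http://backend;
    }
}
```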
Related Questions
-
Redirecting old html site to new wordpress site
Hi, I'm currently updating an old (8 years old) HTML site to WordPress, and about a month ago I redirected some URLs to the new site (which is in a directory) like this:
Redirect 301 /article1.htm http://mysite.net/wordpress/article1/
Redirect 301 /article2.htm http://mysite.net/wordpress/article2/
Redirect 301 /article3.htm http://mysite.net/wordpress/article3/
Google has indexed these new URLs and they are showing in search results. I'm almost finished with the new version of the site, which is currently in a directory, /wordpress. I intend to move all the files from the directory to the root, so the new URL when this is done will be http://mysite.net/article1/ etc. My question is: what do I do about the redirects which are in place? Do I delete them and replace them with something like this?
Redirect 301 /wordpress/article1/ http://mysite.net/article1/
Redirect 301 /wordpress/article2/ http://mysite.net/article2/
Redirect 301 /wordpress/article3/ http://mysite.net/article3/
Appreciate any help with this.
Technical SEO | briandee
-
How long after disallowing Googlebot from crawling a domain until those pages drop out of their index?
We recently had Google crawl a version of the site that we thought we had already disallowed. We have corrected the issue of them crawling the site, but pages from that version are still appearing in the search results (the version we don't want them to index and serve is our .us domain, which should have been blocked to them). My question is this: how long should I expect that domain (the .us we don't want to appear) to stay in their index after disallowing their bot? Is this a matter of days, weeks, or months?
Technical SEO | TLM
-
Can the sudden addition of a new section of a website cause the site to be sandboxed?
Hi there, We have noticed a sudden drop in traffic for certain keywords. It seems to correlate precisely with a different department adding a whole new recruitment section to the site. However there doesn't appear to be any keyword cannibalization or duplicate content issues. Any ideas would be most appreciated. Thanks
Technical SEO | Blaze-Communication
-
Determining the Cause of a Penalty
I received a link removal request from a site who said that they were penalized. I confirmed that they were #1 for the competitive keyword phrase that is also their domain name, and now they are #10. Here are some things I noticed about the site:
- Over 2,500 linking domains.
- Dozens of high-quality linking domains like Huffington Post and Mashable.
- Some off-topic guest post links, e.g. on an SEO site.
- Guest post anchor text was usually their site name, which is an exact-match domain.
- Lots of top-100 resource pages that received good organic links.
- Infographics with links using their domain name as the anchor text.
- Relatively few spammy links according to Open Site Explorer.
Overall their site's links were engineered, but using tactics that most would consider "white hat." I don't think they violated any Google Webmaster Guidelines. Why were they penalized? What do you think?
Technical SEO | ProjectLabs
-
Open Site Explorer Question
In OSE I have 3 of my top 5 pages listed as store.com, store.com/Default.asp, and store.com/default.asp. I have a canonical version at store.com/default.asp. I have inbound links coming to all three URLs. Because OSE is listing these as separate pages, does that mean the link juice is not being consolidated? Or is this not something to worry about?
Technical SEO | IOSC
-
Googlebot take 5 times longer to crawl each page
Hello All. From about mid September, my GWMT has shown that the average time to crawl a page on my site has shot up from an average of 130ms to an average of 700ms, with peaks at 4000ms. I have checked my server error logs and found nothing there. I have checked with the hosting company, and there are no issues with the server or other sites on the same server. Two weeks after this, my ranking fell by about 950 places for most of my keywords. I am really just trying to eliminate this as a possible cause of these ranking drops. Or was it the Panda/EMD algo that has done it? Many thanks, Si
Technical SEO | spes123
-
Way to find how many sites within a given set link to a specific site?
Hi, Does anyone have an idea on how to determine how many sites within a list of 50 sites link to a specific site? Thanks!
Technical SEO | SparkplugDigital
-
How to setup tumblr blog.site.com to give juice to site.com
Is it possible to get a subdomain, blog.site.com, that is on Tumblr to count toward site.com? I hoped I could point it in Webmaster Tools like we do with www, but alas, no. Any help would be greatly appreciated.
Technical SEO | oznappies