Googlebot Crawl Rate causing site slowdown
-
I am hearing from my IT department that Googlebot is causing a massive slowdown/crash of our site. We get 3.5 to 4 million pageviews a month and add 70-100 new articles to the website each day. We provide daily stock research and market analysis, so it's all high-quality, relevant content. Here are the crawl stats from WMT:
I have not worked with a lot of high-volume, high-traffic sites before, but these crawl stats do not seem out of line. My team is getting pressure from the sysadmins to slow down the crawl rate, or to block some or all of the site from Googlebot.
Do these crawl stats seem in line with comparable sites? Would slowing down the crawl rate have a big effect on rankings?
Thanks
-
Similar to Michael, my IT team is saying Googlebot is causing performance issues - specifically during peak hours.
It was suggested that we consider using Apache rewrite rules to serve Googlebot a 503 during our peak hours to limit the impact. I found a Stack Overflow thread (link below) in which John Mueller seems to suggest this approach, but has anyone tried it?
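For anyone curious what that approach looks like in practice, here is a minimal sketch of such a rewrite rule (the hours, user-agent match, and rule are illustrative assumptions, not a tested production config):

```apache
# Sketch only: assumes mod_rewrite is enabled; peak hours here are 09:00-17:59 server time.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteCond %{TIME_HOUR} >08
RewriteCond %{TIME_HOUR} <18
# R=503 outside the 3xx range drops the substitution and stops rewriting.
RewriteRule .* - [R=503,L]
```

Ideally you would also send a Retry-After header so the crawler knows when to come back, and avoid returning 503 for robots.txt over long stretches, since a 503 on robots.txt can pause crawling of the whole site.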
-
Blocking Googlebot is a quick and easy way to disappear from the index. Not an option if you want Google to rank your site.
For smaller sites or ones with limited technologies, I sometimes recommend using a crawl-delay directive in robots.txt
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=48620
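As a minimal sketch, the directive looks like this (one caveat: Googlebot itself ignores Crawl-delay, and its rate is controlled through the WMT setting linked above, but other crawlers such as Bingbot do honor it; the 10-second value is just an example):

```
User-agent: bingbot
Crawl-delay: 10
```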
But I agree with both Shane and Zachary, this doesn't seem like the long term answer to your problems. Your crawl stats don't seem out of line for a site of your size, and perhaps a better hardware configuration could help things out.
With 70 new articles each day, I'd want Google crawling my site as much as they pleased.
-
Whatever Google's default is in GWT; it sets the rate for you.
You can change it, but that is not recommended except for a specific reason (such as Michael Lewis's scenario), and even then I am not completely sold that Googlebot is what is causing the "dealbreaking" overhead.
-
What is the ideal setting for the crawler? I have been wondering about this for some time.
-
Hi,
Your admins saying that is like someone saying "we need to shut the site down, we are getting too much traffic!" A common sysadmin response (fix it somewhere else).
4 GB a day downloaded is a lot of bot traffic, but you appear to be a "real time" site that probably benefits from, and may even rely on, your high crawl rate.
I would upgrade hardware, or even look into some kind of off-site cloud redundancy for failover (hybrid).
I highly doubt that 4 GB a day is a "dealbreaker", but of course that is just based off the one image, and your admins probably have resource monitors. Maybe Varnish is an answer for static content to help lighten the load? Or a CDN for file hosting to lighten the bandwidth load?
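If Varnish is on the table, a minimal VCL sketch of the static-content idea might look like this (Varnish 3-era syntax; the file extensions and TTL are assumptions to tune for your site):

```vcl
# Sketch: cache static assets so repeat requests never hit the backend.
sub vcl_recv {
    if (req.url ~ "\.(css|js|png|jpe?g|gif|ico)$") {
        unset req.http.Cookie;   # cookies would prevent caching
        return (lookup);
    }
}
sub vcl_fetch {
    if (req.url ~ "\.(css|js|png|jpe?g|gif|ico)$") {
        set beresp.ttl = 24h;    # serve from cache for a day
    }
}
```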
Shane
-
We are hosting the site on our own hardware at a big colo. I know that we are upgrading servers but they will not be online until the end of July.
Thanks!
-
I wouldn't slow the crawl rate. A high crawl rate is good so that Google can keep their index of your website current.
The better solution is to reconsider your hardware and networking setup. Do you know how you are being hosted? From my own experience with a website of that size, a load balancer on two decent dedicated servers should handle the load without problems. Google crawling your pages shouldn't create noticeable overhead on the right setup.