Googlebot Crawl Rate causing site slowdown
-
I am hearing from my IT department that Googlebot is causing a massive slowdown/crash on our site. We get 3.5 to 4 million pageviews a month and add 70-100 new articles to the website each day. We provide daily stock research and market analysis, so it's all high-quality, relevant content. Here are the crawl stats from WMT:
I have not worked with a lot of high volume high traffic sites before, but these crawl stats do not seem to be out of line. My team is getting pressure from the sysadmins to slow down the crawl rate, or block some or all of the site from GoogleBot.
Do these crawl stats seem in line with similar sites? Would slowing down the crawl rate have a big effect on rankings?
Thanks
-
Similar to Michael, my IT team is saying Googlebot is causing performance issues - specifically during peak hours.
It was suggested that we consider using Apache rewrite rules to serve Googlebot a 503 during our peak hours to limit the impact. I found the Stack Overflow thread (link below) in which John Mueller seems to suggest this approach, but has anyone tried this?
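For reference, a rough sketch of what those rewrite rules might look like in an .htaccess or vhost config, assuming mod_rewrite is enabled; the 09:00-17:59 peak window is purely an illustrative assumption:

```apache
# Serve Googlebot a 503 during assumed peak hours (09:00-17:59 server time).
# %{TIME_HOUR} is zero-padded, so lexicographic comparison works here.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteCond %{TIME_HOUR} >08
RewriteCond %{TIME_HOUR} <18
RewriteRule .* - [R=503,L]
```

You can test it from the command line with `curl -I -A "Googlebot" http://example.com/`. Note that the advice around this approach also stresses sending a Retry-After header and keeping the 503 windows short - pages that return 503 for extended periods can start dropping out of the index.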
-
Blocking Googlebot is a quick and easy way to disappear from the index. Not an option if you want Google to rank your site.
For smaller sites or ones with limited technologies, I sometimes recommend using a Crawl-delay directive in robots.txt:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=48620
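For illustration, the directive looks like this in robots.txt (the 10-second value is an arbitrary example). One caveat: Googlebot itself ignores Crawl-delay - its rate is adjusted through the Webmaster Tools setting in the link above - but Bingbot and other crawlers do honor it:

```text
User-agent: *
Crawl-delay: 10
```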
But I agree with both Shane and Zachary, this doesn't seem like the long-term answer to your problems. Your crawl stats don't seem out of line for a site of your size, and perhaps a better hardware configuration could help things out.
With 70 new articles each day, I'd want Google crawling my site as much as they pleased.
-
Whatever Google's default is in GWT - it sets it for you.
You can change it, but that is not recommended unless you have a specific reason (such as Michael Lewis's scenario). Even then, I am not completely sold that Googlebot is what is causing the "dealbreaking" overhead.
-
What is the ideal setting for the crawler? I have been wondering about this for some time.
-
Hi,
Your admins saying that is like someone saying, "We need to shut the site down, we are getting too much traffic!" A common sys-admin response (fix it somewhere else).
4GB a day downloaded is a lot of bot traffic, but it appears you are a "real time" site that is probably helped by, and maybe even reliant on, your high crawl rate.
I would upgrade hardware - or even look into some kind of off-site cloud redundancy for failover (hybrid).
I highly doubt that 4GB a day is a "dealbreaker", but of course that is just based off the one image, and your admins probably have resource monitors. Maybe Varnish is an answer for caching static content to help lighten the load? Or a CDN for file hosting to lighten the bandwidth load?
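As a sketch of the Varnish idea: a minimal VCL fragment (Varnish 3 syntax; the file extensions and one-hour TTL are illustrative assumptions) that strips cookies from static assets so they can be served from cache instead of hitting the app servers:

```vcl
sub vcl_recv {
    # Serve static assets from cache regardless of client cookies
    if (req.url ~ "\.(css|js|png|jpe?g|gif|ico)(\?.*)?$") {
        unset req.http.Cookie;
        return (lookup);
    }
}

sub vcl_fetch {
    # Keep static assets cached for an hour (illustrative TTL)
    if (req.url ~ "\.(css|js|png|jpe?g|gif|ico)(\?.*)?$") {
        unset beresp.http.Set-Cookie;
        set beresp.ttl = 1h;
    }
}
```

Even a short TTL on HTML pages can absorb a lot of bot traffic, since crawlers tend to re-request the same URLs.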
Shane
-
We are hosting the site on our own hardware at a big colo. I know that we are upgrading servers but they will not be online until the end of July.
Thanks!
-
I wouldn't slow the crawl rate. A high crawl rate is good so that Google can keep their index of your website current.
The better solution is to reconsider your hardware and networking setup. Do you know how you are being hosted? From my own experience with a website of that size, a load balancer on two decent dedicated servers should handle the load without problems. Google crawling your pages shouldn't create noticeable overhead on the right setup.
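For instance, a minimal nginx configuration fronting two app servers might look something like the sketch below (the IPs and hostname are placeholders); HAProxy or a hardware balancer would do the same job:

```nginx
upstream article_backend {
    # Two dedicated app servers (placeholder addresses)
    server 10.0.0.11;
    server 10.0.0.12;
}

server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://article_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $remote_addr;
    }
}
```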