Should I set a max crawl rate in Webmaster Tools?
-
We have a website with around 5,000 pages, and for the past few months we've had our crawl rate set to maximum (we'd just started paying for a top-of-the-range dedicated server at the time, so performance wasn't an issue).
Google Webmaster Tools alerted me this morning that the crawl rate setting has expired, so I'll have to set the rate manually again. In terms of SEO, is having a max rate a good thing?
I found this post on Moz, but it dates from 2008. Any thoughts on this?
-
At first I assumed that by manually setting the crawl rate to the maximum, Google would crawl my website faster and more frequently. Our website has tens of thousands of pages, so I didn't want Google missing any of them or taking a long time to index new content. We have new products added to the website daily, and others come off or change.
I'll let Google decide
-
Yep, they're a little vague here! But the answer is: Google will crawl your site at whatever rate it wants (it's probably crawling Amazon 24/7), unless you limit how much it can crawl in Google Webmaster Tools. In that case, Google will still crawl your site at whatever rate it wants, unless that rate is higher than the limit you set, in which case it will throttle itself to your limit.
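For illustration only (this is not Google's actual implementation), a crawl-rate cap like the one in Webmaster Tools boils down to enforcing a minimum delay between requests. At the maximum manual setting of 0.2 requests per second, that delay works out to 5 seconds:

```python
import time

class CrawlRateLimiter:
    """Toy rate limiter: allows at most `max_rps` requests per second
    by enforcing a minimum interval between consecutive requests."""

    def __init__(self, max_rps):
        self.min_interval = 1.0 / max_rps  # 0.2 rps -> 5 s between requests
        self.last_request = None

    def wait(self):
        """Block until enough time has passed since the last request."""
        now = time.monotonic()
        if self.last_request is not None:
            elapsed = now - self.last_request
            if elapsed < self.min_interval:
                time.sleep(self.min_interval - elapsed)
        self.last_request = time.monotonic()

limiter = CrawlRateLimiter(max_rps=0.2)
print(limiter.min_interval)  # 5.0 seconds between requests
```

A crawler would call `limiter.wait()` before each fetch; whenever its natural pace is slower than the cap, the limiter never actually sleeps, which is exactly the behavior described above.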
If you're anxious for Google to crawl your site more because a) you have something that's changed and you want Google to have it in their index, or b) because you're hoping it'll affect your rankings:
a) If there's specific information that you want Google to update its index with, submit the URL of the page that's new or changed into "Fetch as Googlebot" and then, once you fetch it, hit the "Submit to index" button to the right. I work on a site that's a DA 58 and fetching something as Googlebot updates the index within an hour.
b) How much Google crawls your site has to do with how important your site is; forcing Google to crawl your site more will not make it think your site is more important.
Hope this helps!
Kristina
-
Is selecting "Limit Google's maximum crawl rate" and then manually moving the slider to the highest setting (0.2 requests per second / 5 seconds between requests) a higher rate than selecting "Let Google optimize for my site (recommended)"? Google doesn't really expand on this! I want them to crawl at the absolute maximum, but they don't tell us how many requests per second, or how many seconds between requests, the optimized option uses.
-
You don't need to. Just let Google crawl at will. The only reason you would want to limit the crawl rate is if the crawling is causing performance issues on your server (too much traffic at once). If you're not having any issues, then allow Google to crawl as many pages as it can.
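If you suspect crawl-related load, a quick way to check is to count Googlebot hits in your server access logs. A minimal sketch (the log lines below are made-up examples in common log format; a real check should read your actual log file, and ideally verify the IPs via reverse DNS, since the user-agent string can be spoofed):

```python
import re
from collections import Counter

# Hypothetical sample lines; in practice, read from your web server's
# access log (e.g. an nginx or Apache log in common/combined log format).
log_lines = [
    '66.249.66.1 - - [10/Jan/2015:10:00:01 +0000] "GET /product/1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Jan/2015:10:00:06 +0000] "GET /product/2 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [10/Jan/2015:10:00:07 +0000] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]

def googlebot_hits_per_hour(lines):
    """Count requests whose user-agent mentions Googlebot, bucketed by hour."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        # Capture the "day/Mon/year:hour" portion of the timestamp.
        m = re.search(r"\[(\d{2}/\w{3}/\d{4}:\d{2})", line)
        if m:
            hits[m.group(1)] += 1
    return hits

print(googlebot_hits_per_hour(log_lines))  # Counter({'10/Jan/2015:10': 2})
```

If the hourly counts line up with your performance dips, that's when a crawl-rate limit (or fixing the server bottleneck) is worth considering; otherwise, leave Google to crawl at will.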