Should I set a max crawl rate in Webmaster Tools?
-
We have a website with around 5,000 pages, and for the past few months we've had our crawl rate set to maximum (we'd just started paying for a top-of-the-range dedicated server at the time, so performance wasn't an issue).
Google Webmaster Tools alerted me this morning that the crawl rate setting had expired, so I'd have to set the rate manually again. In terms of SEO, is having a max rate a good thing?
I found this post on Moz, but it dates from 2008. Any thoughts on this?
-
At first I assumed that by manually setting the crawl rate to the maximum, Google would crawl my website faster and more frequently. Our website has tens of thousands of pages, so I didn't want Google missing any of it or taking a long time to index new content. New products are added to the website daily, and others are removed or changed.
I'll let Google decide
-
Yep, they're a little vague here! But the answer is: Google will crawl your site at whatever rate it wants (it's probably crawling Amazon 24/7) unless you limit how much it can crawl in Google Webmaster Tools. With a limit in place, Google still crawls at whatever rate it wants, unless that rate is higher than the limit you set, in which case it throttles itself down to your limit.
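In other words, the setting is a cap, not a target: the effective rate is the smaller of Google's own preferred rate and your limit. A toy sketch of that logic (function and parameter names are hypothetical, just to illustrate the behavior):

```python
def effective_crawl_rate(googles_preferred_rate, your_limit=None):
    """Return the requests/second Googlebot would actually use.

    With no limit set, Google crawls at whatever rate it chooses.
    A limit can only ever slow crawling down; it never speeds it up.
    """
    if your_limit is None:
        return googles_preferred_rate
    return min(googles_preferred_rate, your_limit)
```

So raising the cap above what Google already wants to crawl changes nothing, which is why "max" isn't really faster than "let Google optimize" for most sites.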
If you're anxious for Google to crawl your site more because a) you have something that's changed and you want Google to have it in their index, or b) because you're hoping it'll affect your rankings:
a) If there's specific information that you want Google to update its index with, submit the URL of the page that's new or changed in "Fetch as Googlebot" and then, once you fetch it, hit the "Submit to index" button to the right. I work on a site with a DA of 58, and fetching something as Googlebot updates the index within an hour.
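Fetch as Googlebot works one URL at a time; if you have many new or changed product pages, pinging Google with an updated XML sitemap covers the same ground in bulk. A minimal sketch of building the ping request (the sitemap URL is illustrative; a 200 response only means the ping was received, not that anything was indexed):

```python
import urllib.parse

# Google's sitemap ping endpoint (also usable from a browser or cron job)
GOOGLE_PING = "https://www.google.com/ping?sitemap="

def sitemap_ping_url(sitemap_url):
    # URL-encode the sitemap address so its own query characters survive
    return GOOGLE_PING + urllib.parse.quote(sitemap_url, safe="")
```

You'd then fetch the returned URL with any HTTP client, ideally whenever the sitemap is regenerated.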
b) How much Google crawls your site has to do with how important your site is; forcing Google to crawl your site more will not make it think your site is more important.
Hope this helps!
Kristina
-
Is selecting "Limit Google's maximum crawl rate" and then manually moving the rate to the highest setting (0.2 requests per second / 5 seconds between requests) a higher rate than selecting "Let Google optimize for my site (recommended)"? Google doesn't really expand on this! I want them to crawl at the very maximum, but they don't tell us how many requests per second, or how many seconds between requests, the optimized option uses.
-
You don't need to. Just let Google crawl at will. The only reason you would want to limit the crawl rate is if you're having performance issues on the server you're on (too much traffic at once). If you're not having any issues, then allow Google to crawl as many pages as it can.
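If server load ever does become a problem, keep in mind the Webmaster Tools setting only throttles Googlebot. Other major crawlers are throttled via a Crawl-delay directive in robots.txt instead (a hypothetical example; Googlebot ignores this directive, which is exactly why Google offers the Webmaster Tools setting):

```
# robots.txt — Bing and Yandex honor Crawl-delay; Googlebot does not
User-agent: bingbot
Crawl-delay: 5
```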