Why might my website's crawl rate... explode?
-
Hi Mozzers,
I have a website with approximately 110,000 pages. According to Search Console, Google will usually crawl, on average, anywhere between 500 and 1,500 pages per day. However, lately the crawl rate seems to have increased rather drastically:
9/5/16 - 923
9/6/16 - 946
9/7/16 - 848
9/8/16 - 11072
9/9/16 - 50923
9/10/16 - 60389
9/11/16 - 17170
9/12/16 - 79809
I was wondering if anyone could offer any insight into why this may be happening and if I should be concerned?
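To put a number on the jump, here's a quick sketch (not from the original post) comparing the average of the baseline days to the average of the spike days, using the figures reported above:

```python
# Comparing the baseline crawl rate to the spike, using the
# Search Console numbers quoted in the question.
baseline = [923, 946, 848]                    # 9/5 - 9/7
spike = [11072, 50923, 60389, 17170, 79809]   # 9/8 - 9/12

baseline_avg = sum(baseline) / len(baseline)
spike_avg = sum(spike) / len(spike)

print(f"Baseline average: {baseline_avg:.0f} pages/day")  # ~906
print(f"Spike average: {spike_avg:.0f} pages/day")        # ~43,873
print(f"Increase: {spike_avg / baseline_avg:.1f}x")       # ~48x
```

So this is roughly a 48-fold increase over the baseline, not a modest fluctuation.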
Thanks in advance for all advice. -
Thank you Thomas.
-
Just to add to this, there is nothing inherently wrong with Google crawling more pages of your site. The only time I would modify the crawl rate is when the extra crawling is actually slowing your server down.
-
Hi There,
The crawl rate control was devised by Google to give users control, so that they can limit the server load created by constant crawling of the website.
So it's up to you to decide whether you want to lower or limit it.
https://support.google.com/webmasters/answer/48620?hl=en
Thanks,
Vijay
-
Thank you Vijay, your response is very helpful. Do you know if there are any guidelines for optimal crawl rates? I tend to look at average pages crawled per day and multiply by 90. If that number is equal to or greater than the number of pages on the site, then we'd be good, right? Or is there a flaw in that logic?
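There's no official "optimal" crawl rate, but the heuristic described above can be sketched as a quick check (the function name and numbers are illustrative, not an established formula; note it ignores that Googlebot often re-crawls the same popular URLs, so crawls per day can overstate unique coverage):

```python
def crawl_covers_site(avg_pages_per_day: float, total_pages: int,
                      window_days: int = 90) -> bool:
    """Rough heuristic: would the current crawl pace cover the whole
    site within the window? Ignores re-crawls of the same URLs."""
    return avg_pages_per_day * window_days >= total_pages

# At the pre-spike pace of ~900 pages/day against 110,000 pages:
print(crawl_covers_site(900, 110_000))   # 900 * 90 = 81,000 < 110,000 -> False
print(crawl_covers_site(1300, 110_000))  # 1,300 * 90 = 117,000 -> True
```

By that measure, the pre-spike crawl rate was actually on the low side for a site of this size.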
-
Hi Thomas,
Thank you for responding. Yes, kind of. There are 40 main categories, each of which has up to 100 links to subcategories, and then the same again for sub-subcategories.
I've spent the last year cleaning it up and removing pages that didn't need to be there (quite a lot of pages!) in order to help Google find and index the important ones.
I will run it through Screaming Frog now, just to be sure!
-
Hi there,
The following can be reasons for your crawl rate increase:
- You have updated the content of the website recently, or are doing so regularly.
- You or someone on your end resubmitted the sitemap.xml to Google, or is doing so over and over.
- Your robots.txt was changed to give access to previously blocked pages.
- You or someone else used ping services to let search engines know about your website. There are many manual ping services like Pingomatic, and in WordPress you can manually add more ping services to notify many search engine bots. You can find such a list in the WordPress ping list post (http://www.shoutmeloud.com/wordpress-ping-list.html).
- You can also monitor and adjust the Google crawl rate using Google Webmaster Tools: just go to Crawl Stats there and analyze the graphs. You can manually set your Google crawl rate faster or slower, though I would suggest using this with caution, and only when you are actually facing issues with bots not crawling your site effectively. You can read more about changing the Google crawl rate here: https://support.google.com/webmasters/?hl=en&answer=48620#topic=3309469
I hope this helps. If you have further queries, feel free to respond.
Regards,
Vijay
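As a cross-check on the Search Console graph, you can also count Googlebot requests per day directly from your server access logs. A minimal sketch, assuming a standard Apache/Nginx combined log format (the sample lines and log layout are assumptions, and a production version should verify Googlebot via reverse DNS rather than trusting the user-agent string):

```python
import re
from collections import Counter

# Matches the date portion of a combined-log timestamp, e.g. [08/Sep/2016:10:00:01 +0000]
LOG_DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(lines):
    """Tally requests per day whose user-agent mentions Googlebot."""
    counts = Counter()
    for line in lines:
        if "Googlebot" in line:
            match = LOG_DATE.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

sample = [
    '66.249.66.1 - - [08/Sep/2016:10:00:01 +0000] "GET /page1 HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [08/Sep/2016:10:00:05 +0000] "GET /page2 HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [08/Sep/2016:10:00:09 +0000] "GET /page1 HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (Windows NT 10.0)"',
]
print(googlebot_hits_per_day(sample))  # Counter({'08/Sep/2016': 2})
```

If the log counts match the Search Console spike, the extra crawling is real Googlebot activity rather than a reporting glitch or a spoofed bot.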
-
I wouldn't be concerned at all. Have you got one section that expands into a load of other links? It could be that Google hadn't crawled properly for a while, then found a section it hadn't seen before and just went mad.
Alternatively, have you crawled the site with Screaming Frog or a similar tool, in case there's an issue you weren't aware of?