What would cause a drastic drop in pages crawled per day?
-
The site didn't go down.
There was no drop in rankings or traffic.
But we went from averaging 150,000 pages crawled per day to ~1,000 pages crawled per day.
We're now back up to ~100,000 crawled per day, but we went more than a week with only 1,000 pages being crawled daily.
The question is, what could cause this drastic (but temporary) reduction in pages crawled?
-
I wish that were the case, but the site wasn't down.
I looked into the errors; they were redirecting to a subdomain that no longer exists.
-
So several times in one month the entire site couldn't be reached. That's pretty significant. Personally, I don't have any clients with that many downtimes, so I can only assume that's the cause, or at least a partial cause. More importantly, it's a red flag that would prompt me to find a better hosting provider if it were my site.
-
The drop happened March 28th.
There was a "domain name not found" error on March 30th (two more on the 22nd, 18th, 12th, and 10th).
-
There could be several factors. When did it occur? Did you see any other crawl errors reported? Unfortunately, the other unknown comes from the fact that Google's own systems are far from perfect, and crawl volume is sometimes affected by issues on their end.
Unless I see crawl errors or an increase in pages not found during or leading up to that period, or, more importantly, a corresponding significant drop in organic traffic, I personally just chalk it up to the complexity of the web.
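One more place to look before blaming Google's side: the raw server access logs show exactly how many pages Googlebot fetched each day, so the drop date can be matched against the DNS/hosting errors. A minimal sketch, assuming a standard "combined"-format Apache/Nginx log (the log path and format are assumptions; adjust for your server):

```python
import re
from collections import Counter

# Matches the date inside a combined-log timestamp, e.g. [28/Mar/2013:10:15:32 +0000]
LOG_DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(lines):
    """Return a Counter mapping date string -> number of Googlebot requests."""
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LOG_DATE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Usage sketch (path is an assumption):
#   with open("/var/log/apache2/access.log") as f:
#       for day, hits in sorted(googlebot_hits_per_day(f).items()):
#           print(day, hits)
```

A sudden fall from ~150k to ~1k hits on a specific day, lined up against the error dates, would tell you whether the slowdown started with the "domain name not found" events.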
-
Hi Alan!
There were no spikes in KB downloaded per day or in time spent downloading a page.
-
Have you checked Google Webmaster Tools for crawl errors and other metrics? I had a client recently who had a severe slowdown in their server network, which showed up in the crawl stats as a huge spike in page download time - pages loading five times slower than normal. They subsequently had a dip in pages crawled due to the bottleneck.
Related Questions
-
SEO Drop
Over the last few months my rank has dropped by around half and for the life of me I can’t see why. There are no warnings on Google Console. Am I missing something? Website: thespacecollective.com
Intermediate & Advanced SEO | | moon-boots0 -
SEO mobile app optimization: multi tag link alternate media per every devices is acceptable in the desktop page?
Hi all, I hope someone can answer this question, because I haven't found a clear solution online so far.

I have one desktop website (say www.example.com) and a different mobile website for each main device (say iphone.example.mobi, android.example.mobi, and winphone.example.mobi).

To optimize my mobile websites according to Google's guideline for the separate-URLs configuration, I should add a link rel="alternate" media tag on each desktop page and a canonical tag on the corresponding mobile page, to create a connection between them. But I need to keep a 1-to-1 connection between desktop page and mobile page (Google recommends having one desktop page linked to one mobile page and vice versa, and discourages 1-to-many connections).

In my case, I would have to add three link alternate media tags to a single desktop page (e.g. www.example.com/category1/): one for iphone.example.mobi, one for android.example.mobi, and one for winphone.example.mobi. Furthermore, I would have to add a canonical tag to every corresponding mobile page on the three mobile site versions, pointing back to my desktop page www.example.com/category1/.

Now my worries are: is a single desktop page with three different link alternate tags pointing to three different mobile websites (one each) aligned with Google's mobile SEO guideline or not? If not, how should I configure my desktop website and my three mobile web applications (iPhone, Android, Windows Phone) to follow Google's requirements for the separate-URLs configuration? Thanks, Massimliano
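For reference, the documented 1-to-1 annotation pair for the separate-URLs configuration looks like the sketch below (the URLs reuse the poster's examples; the three-alternates-per-desktop-page variant being asked about is the open question, not part of this documented pattern):

```html
<!-- On the desktop page www.example.com/category1/ -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://iphone.example.mobi/category1/">

<!-- On the corresponding mobile page iphone.example.mobi/category1/ -->
<link rel="canonical" href="http://www.example.com/category1/">
```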
Intermediate & Advanced SEO | | AdiRste0 -
Would you rate-control Googlebot? How much crawling is too much crawling?
One of our sites is very large - over 500M pages. Google has indexed 1/8th of the site, and they tend to crawl between 800k and 1M pages per day. A few times a year, Google will significantly increase their crawl rate, overnight hitting 2M pages per day or more. This creates big problems for us, because at 1M pages per day Google is consuming 70% of our API capacity, and the API overall is at 90% capacity. At 2M pages per day, 20% of our page requests are 500 errors.

I've lobbied for an investment/overhaul of the API configuration to allow for more Google bandwidth without compromising user experience. My tech team counters that it's a wasted investment, as Google will crawl to our capacity, whatever that capacity is.

Questions for enterprise SEOs:

* Is there any validity to the tech team's claim? I thought Google's crawl rate was based on a combination of PageRank and the frequency of page updates. That suggests there is some upper limit - which we perhaps haven't reached - but which would stabilize once reached.
* We've asked Google to rate-limit our crawl rate in the past. Is that harmful? I've always looked at a robust crawl rate as a good problem to have. Is 1.5M Googlebot API calls a day desirable, or something any reasonable enterprise SEO would seek to throttle back?
* What about setting a longer refresh rate in the sitemaps? Would that reduce the daily crawl demand? We could increase it to a month, but at 500M pages Google could still have a ball at the 2M pages/day rate.

Thanks
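On the sitemap question, the relevant knobs are the per-URL <lastmod> and <changefreq> values, though search engines treat both as hints that may be ignored rather than directives. A minimal sketch of one entry (the URL and date are placeholders):

```xml
<url>
  <loc>http://example.com/some-page/</loc>
  <lastmod>2013-04-01</lastmod>
  <changefreq>monthly</changefreq>
</url>
```

Because they are only hints, lowering changefreq across 500M entries may shave some recrawl demand but is unlikely to cap the 2M-pages/day spikes on its own.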
Intermediate & Advanced SEO | | lzhao0 -
Should we show(to google) different city pages on our website which look like home page as one page or different? If yes then how?
On our website, we show events from different cities. We have made different URLs for each city, like www.townscript.com/mumbai and www.townscript.com/delhi. But the pages for all the cities look similar; only the events change between the different city pages.

Even our home URL, www.townscript.com, shows the visitor the city he visited last time on our website (initially we show everyone Mumbai; the visitor then needs to choose his city). For every page visit, we save the last visited page of a particular IP address, and the next time he visits www.townscript.com, we show him the city he visited last time.

Now, since the content of the home page and the city pages is similar, should we show these pages to Google as one page, i.e. townscript.com? Can we do that with rel="canonical"? Please help me! I think all of these pages are competing with each other.
Intermediate & Advanced SEO | | sanchitmalik0 -
Duplicate Page Content Errors on Moz Crawl Report
Hi all, I seem to be losing a 'firefighting' battle with regard to various errors being reported on the Moz crawl report, relating to:

Duplicate Page Content
Missing Page Title
Missing Meta Description
Duplicate Page Title

While I acknowledge that some of the errors are valid (and we are working through them), I find some of them difficult to understand. Here is an example of a 'duplicate page content' error being reported: http://www.bolsovercruiseclub.com (which is obviously our homepage) is reported to have duplicate page content compared with the following pages:

http://www.bolsovercruiseclub.com/guides/gratuities
http://www.bolsovercruiseclub.com/cruise-deals/cruise-line-deals/holland-america-2014-offers/?order_by=brochure_lead_difference
http://www.bolsovercruiseclub.com/about-us/meet-the-team/craig

All three of those pages are completely different, hence my confusion. This is just a solitary example; there are many more! I would be most interested to hear what people's opinions are. Many thanks, Andy
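When a duplicate-content flag looks wrong, one sanity check is to measure how similar the pages' visible text actually is. A rough sketch of one way to do that with the standard library (this is an assumption about a reasonable measure, not how Moz computes duplication; the crude tag-stripping regex is only suitable for a quick check):

```python
import difflib
import re

def text_similarity(html_a, html_b):
    """Return a 0..1 ratio of how similar the tag-stripped word sequences are."""
    strip_tags = lambda html: re.sub(r"<[^>]+>", " ", html)
    words_a = strip_tags(html_a).split()
    words_b = strip_tags(html_b).split()
    return difflib.SequenceMatcher(None, words_a, words_b).ratio()
```

Fetching the homepage and one of the flagged URLs and comparing them this way would show whether the pages share mostly boilerplate (navigation, footer) rather than body content, which is a common cause of false duplicate-content flags.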
Intermediate & Advanced SEO | | TomKing0 -
Redirecting thin content city pages to the state page, 404s or 301s?
I have a large number of thin-content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages are accessed. But I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page. Something like:

    if (this city page should be removed) {
        header("HTTP/1.0 404 Not Found");
        header("Location: http://example.com/state-level-page");
        exit();
    }

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page? Also, these removed city-level pages collectively have very few to no inbound links from other sites. I suspect that any inbound links to these removed pages are from low-quality scraper-type sites anyway. Thanks in advance!
Intermediate & Advanced SEO | | rriot
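For comparison, the 301 alternative would look like the sketch below (the condition is a placeholder standing in for the poster's "this city page should be removed" logic). Worth noting: most clients will not follow a Location header on a 404 response, so the 404-plus-Location combination sends mixed signals, whereas a plain 301 both redirects the visitor and passes the removal signal to search engines:

```php
<?php
// Hedged sketch: permanently redirect a removed city page to its state page.
if ($city_page_removed) {  // placeholder condition
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: http://example.com/state-level-page");
    exit();
}
```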
Dynamic pages - ecommerce product pages
Hi guys, Before I dive into my question, let me give you some background.. I manage an ecommerce site and we're got thousands of product pages. The pages contain dynamic blocks and information in these blocks are fed by another system. So in a nutshell, our product team enters the data in a software and boom, the information is generated in these page blocks. But that's not all, these pages then redirect to a duplicate version with a custom URL. This is cached and this is what the end user sees. This was done to speed up load, rather than the system generate a dynamic page on the fly, the cache page is loaded and the user sees it super fast. Another benefit happened as well, after going live with the cached pages, they started getting indexed and ranking in Google. The problem is that, the redirect to the duplicate cached page isn't a permanent one, it's a meta refresh, a 302 that happens in a second. So yeah, I've got 302s kicking about. The development team can set up 301 but then there won't be any caching, pages will just load dynamically. Google records pages that are cached but does it cache a dynamic page though? Without a cached page, I'm wondering if I would drop in traffic. The view source might just show a list of dynamic blocks, no content! How would you tackle this? I've already setup canonical tags on the cached pages but removing cache.. Thanks
Intermediate & Advanced SEO | | Bio-RadAbs0 -
What to do with WordPress generated pages?
I'm an SEOmoz newbie and have a very specific question about the auto-generated WordPress pages. SEOmoz caught and labeled the auto-generated WP pages with crawl warnings like:

Long URL
302
Title Element Too Long
Missing Meta Description Tag
Too Many On-Page Links

So I have learned the lesson and have now made those pages "nofollow" / "noindex". HOWEVER, WHAT DO I DO WITH THE ONES THAT HAVE ALREADY BEEN INDEXED? Do I:

1. Just leave them as-is and hope they don't hurt me from an SEO perspective?
2. Redirect them all to a relevant page?

I'm sure many people have had this issue. What do you think? Thanks, Dominic
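If option 2 is the choice, a server-level 301 is one common way to do it. A hypothetical Apache sketch (the /tag/ pattern and target URL are assumptions for illustration; substitute whatever paths WordPress actually auto-generated on the site):

```apache
# Assumed example: send already-indexed auto-generated tag archives
# to a relevant page with a permanent redirect (mod_alias).
RedirectMatch 301 ^/tag/(.*)$ http://example.com/blog/
```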
Intermediate & Advanced SEO | | amorbis0