What would cause a drastic drop in pages crawled per day?
-
The site didn't go down.
There was no drop in rankings or traffic.
But we went from averaging 150,000 pages crawled per day to ~1,000 pages crawled per day.
We're now back up to ~100,000 crawled per day, but we went more than a week with only ~1,000 pages being crawled daily.
The question is, what could cause this drastic (but temporary) reduction in pages crawled?
-
I wish that were the case, but the site wasn't down.
I looked into the errors; they were redirects to a subdomain that no longer exists.
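One quick way to confirm that pattern is to scan a crawl-error export for redirect targets pointing at the dead host. A minimal sketch; the hostnames and the export format below are placeholders, not the actual site's data:

```python
from urllib.parse import urlparse

def flag_dead_subdomain_targets(redirects, dead_host):
    """Given (source URL, redirect target) pairs from a crawl-error export,
    return the sources whose target points at a host that no longer resolves."""
    return [src for src, target in redirects
            if urlparse(target).hostname == dead_host]

# Hypothetical crawl-error export rows (source URL, redirect destination).
errors = [
    ("http://example.com/old-page", "http://gone.example.com/old-page"),
    ("http://example.com/other", "http://example.com/new"),
]

print(flag_dead_subdomain_targets(errors, "gone.example.com"))
# → ['http://example.com/old-page']
```

Any sources this flags are pages still linking (or redirecting) into the retired subdomain, which is worth fixing regardless of whether it explains the crawl drop.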
-
So several times in one month the entire site couldn't be reached. That's pretty significant. Personally, I don't have any clients with that many downtimes, so I can only assume that's the cause, or at least a partial cause. More importantly, it's a red flag that would prompt me to find a better hosting provider if it were my site.
-
The drop happened March 28th.
There was a "domain name not found" error on March 30th (with others on the 22nd, 18th, 12th, and 10th).
-
There could be several factors. When did it occur? Did you see any other crawl errors reported? Unfortunately, the other unknown is that Google's own system is far from perfect, and crawl volume is sometimes affected by issues on their end.
Unless I see crawl errors or an increase in pages not found during or leading up to that period, or, more importantly, a corresponding significant drop in organic traffic, I personally just chalk it up to the complexity of the web.
-
Hi Alan!
There were no spikes in kilobytes downloaded per day or in time spent downloading a page.
-
Fatwallet
Have you checked Google Webmaster Tools for crawl errors and other metrics? I had a client recently who had a severe slowdown in their server network, which showed up in the crawl stats as a huge spike in page download time, with pages loading five times slower than normal. They subsequently had a dip in pages crawled due to the bottleneck.
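If the WMT/GWT charts are inconclusive, server access logs give an independent view of crawl volume. A minimal sketch that counts Googlebot requests per day from common-log-format lines (the entries below are made up); note that a production check should verify Googlebot by reverse DNS, since user-agent strings can be spoofed:

```python
import re
from collections import Counter

# Made-up access-log lines in common log format, for illustration only.
LOG_LINES = [
    '66.249.66.1 - - [28/Mar/2014:10:00:01 +0000] "GET /a HTTP/1.1" 200 1234 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [28/Mar/2014:11:30:45 +0000] "GET /b HTTP/1.1" 200 2345 "-" "Googlebot/2.1"',
    '66.249.66.2 - - [29/Mar/2014:09:12:00 +0000] "GET /c HTTP/1.1" 200 3456 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [29/Mar/2014:09:13:00 +0000] "GET /c HTTP/1.1" 200 3456 "-" "Mozilla/5.0"',
]

def googlebot_hits_per_day(lines):
    """Count requests whose user agent mentions Googlebot, keyed by date."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)  # pull the date out of [dd/Mon/yyyy:...]
        if m:
            hits[m.group(1)] += 1
    return dict(hits)

print(googlebot_hits_per_day(LOG_LINES))
# → {'28/Mar/2014': 2, '29/Mar/2014': 1}
```

Plotting those daily counts against the dates of the DNS errors would show whether the crawl drop lines up with the outages.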
Related Questions
-
New page not being picked up
Hello, We have created a new page for truck rentals, but for some reason it does not seem to be picked up. See this report: http://screencast.com/t/npYqeoa5gq The page is: https://www.globecar.com/en/montreal-truck-rentals, but our main site is being picked up instead, versus the competition, which has their truck page showing up. Can anyone help me understand? Thanks, Karim
Intermediate & Advanced SEO | GlobeCar
-
Pages are being dropped from index after a few days - AngularJS site serving "_escaped_fragment_"
My URL is: https://plentific.com/
Hi guys,
About us: We are running an AngularJS SPA for property search.
Being an SPA and an entirely JavaScript application has proven to be an SEO nightmare, as you can imagine.
We are currently implementing the "_escaped_fragment_" approach and serving a pre-rendered version using PhantomJS.
Unfortunately, pre-rendering of the pages takes some time and, even worse, on separate occasions the pre-rendering fails and the page appears to be empty.
The problem: When I manually submit pages to Google using the Fetch as Google tool, they get indexed and actually rank quite well for a few days, and after that they just get dropped from the index.
Not getting lower in the rankings but totally dropped.
Even the Google cache returns a 404.
The questions:
1.) Could this be because of serving an "_escaped_fragment_" version to the bots (keeping in mind it is identical to the user-visible one)?
2.) Could using an API to get our results lead to this being considered "duplicate content"? And shouldn't that just lower the SERP position instead of causing a drop?
3.) Could this be a technical problem with how we serve the content, or does Google just not trust sites served this way?
Thank you very much!
Pavel Velinov
SEO at Plentific.com
Intermediate & Advanced SEO | emre.kazan
-
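For context, the AJAX crawling scheme this question refers to (since deprecated by Google) maps "#!" URLs to "_escaped_fragment_" requests, and the server is expected to answer those with a pre-rendered HTML snapshot. A minimal sketch of that URL mapping; the URLs are illustrative:

```python
from urllib.parse import quote, unquote

def to_escaped_fragment(url):
    """Map http://host/path#!state -> http://host/path?_escaped_fragment_=state
    (the fragment is URL-encoded, per the AJAX crawling scheme)."""
    if "#!" not in url:
        return url
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="")

def from_escaped_fragment(url):
    """Recover the user-facing #! URL from a crawler request URL."""
    marker = "_escaped_fragment_="
    if marker not in url:
        return url
    base, fragment = url.split(marker, 1)
    return base.rstrip("?&") + "#!" + unquote(fragment)

print(to_escaped_fragment("https://plentific.com/#!/search?city=london"))
# → https://plentific.com/?_escaped_fragment_=%2Fsearch%3Fcity%3Dlondon
```

The key implication for the question above: if the snapshot served for the `_escaped_fragment_` URL is sometimes empty (failed pre-render), Google sees an empty page for that URL, which by itself would explain pages being dropped rather than demoted.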
Product with two common names: A separate page for each name, or both on one page?
This is a real-life problem on my ecommerce store for the drying rack we manufacture: some people call it a Clothes Drying Rack, while others call it a Laundry Drying Rack, but it's really the same thing. Search volume is higher for the clothes version, so we give it the most attention. I currently have 2 separate pages with the on-page optimization focused on each name (URL, title, h1, img alts, etc.). Here are the two drying rack pages: the clothes-focused page and the laundry-focused page. But the ranking of both pages is terrible. The fairly generic homepage shows up instead of the individual pages in Google searches for "clothes drying rack" and for "laundry drying rack". But I can get the individual page to appear in a long-tail search like "round wooden clothes drying rack". So my thought is maybe I should just combine both of these pages into one page that will hopefully be more powerful. We would have to set up the on-page optimization to cover both "clothes & laundry drying rack", but that seems possible. Please share your thoughts. Is this a good idea or a bad idea? Is there another solution? Thanks for your help! Greg
Intermediate & Advanced SEO | GregB123
-
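If the two pages were consolidated, the retired URL would normally 301 to the surviving page so its links and history transfer. A minimal sketch of that mapping (the paths are hypothetical, not the store's real URLs):

```python
# Hypothetical 301 map for consolidating the two drying-rack pages into one.
REDIRECTS = {
    "/laundry-drying-rack": "/clothes-drying-rack",
}

def resolve(path):
    """Return the (status, location) a server-side redirect hook might emit."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/laundry-drying-rack"))
# → (301, '/clothes-drying-rack')
```

In practice the same mapping would live in the web server config (e.g. rewrite rules) rather than application code, but the logic is the same: one permanent redirect per retired URL, pointing at the combined page.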
Best way for Google and Bing not to crawl my /en default English pages
Hi guys, I just transferred my old site to a new one and now have subfolder TLDs. My default pages from the front end and sitemap don't show /en after www.mysite.com. The only translation I have is Spanish, where Google will crawl www.mysite.com/es. 1. In the SERPs of Google and Bing, every URL that is crawled shows the extra "/en" in my TLD. I find that very weird, considering there is no physical /en in my URLs. When I select the link it automatically redirects to its default and natural page (no /en). None of the canonical tags show /en either, ONLY the SERPs. Should robots.txt be updated to "Disallow: /en"? 2. During the site transfer, we altered some of the category URLs in our domain. So we've had a lot of 301 redirects, but while searching specific keywords in the SERPs, the #1 ranked URL shows up as our old URL that redirects to a 404 page, and our newly created URL shows up as #2 and goes to the correct page. Is there any way to tell Google to stop showing our old URLs in the SERPs? And would the "Fetch as Google" option in GWT be a good way to submit all of my URLs so Google bots crawl only the right pages? Direct Message me if you want real examples. Thank you so much!
Intermediate & Advanced SEO | Shawn124
-
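Before adding a rule like that, it's worth sanity-checking what it would actually match; Python's `urllib.robotparser` can test rules offline. Two cautions: a bare `Disallow: /en` is a prefix match, so it also blocks paths like /enquiry (`Disallow: /en/` is usually safer), and blocking crawling does not by itself remove URLs that are already indexed. A sketch with placeholder hostnames:

```python
from urllib.robotparser import RobotFileParser

# Sketch: test what a proposed "Disallow: /en" rule blocks, before
# shipping it in robots.txt. The hostname is a placeholder.
rules = """\
User-agent: *
Disallow: /en
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://www.mysite.com/en/page"))   # blocked, as intended
print(rp.can_fetch("*", "https://www.mysite.com/enquiry"))   # also blocked, prefix match!
print(rp.can_fetch("*", "https://www.mysite.com/es/page"))   # still allowed
```

Running the proposed rule through a check like this catches the prefix-match surprise before any real URLs get blocked.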
Home Page or Internal Page
I have a website that deals with personalized jewelry, and our main keyword is "Name Necklace".
Three months ago I added a new page: http://www.onecklace.com/name-necklaces/
Since then, Google indexes only this page for my main keyword, and not our home page.
Because the page is new and we don't have many links to it, our ranking is not so good. I'm considering removing this page (301 to the home page), because I think it would be better if Google indexed our home page for this keyword. I'm not sure if this is a good idea, but I know that our home page has a lot of good links, so maybe our ranking would be higher. Another thing: because Google indexes this internal page for this keyword, it looks like our home page has no main keyword at all. BTW, before I added this page, Google indexed our main page for this keyword. Please advise...
Intermediate & Advanced SEO | Tiedemann_Anselm
-
Webmaster Tools - Structured Data 100% drop. Many people with same issue, nobody seems to understand what might have caused it.
WMT showed a significant drop in structured data markup on June 7th, with a steep recovery by June 21st. Now the same thing has happened on August 9th, with no signs of recovery, and we have lost 45% of our search traffic. There are many people with the same problem, and nobody seems to know what caused it. Here are a few links to some forums: #1 Google Groups, #2 Google Groups, #3 Google Groups, #4 "70% drop on GWT on June 7" in the Google SEO News and Discussion forum at WebmasterWorld. On our end we see a 100% drop in breadcrumbs and a 100% drop in hCards, leading to a 45% search traffic drop. Any ideas why this might have happened and how to fix it?
Intermediate & Advanced SEO | PhilippGreitsch
-
What constitutes a duplicate page?
Hi, I have a question about duplicate page content and wondered if someone is able to shed some light on what actually constitutes a "duplicate". We publish hundreds of bus timetable pages that are similar, but technically have unique URLs and content. For example: http://www.intercity.co.nz/travel-info/timetable/lookup/akl The template of the page is obviously duplicated, but the vast majority of the content is unique to each page, with data being refreshed each night. Our crawl shows these as duplicate page errors, but is this just a generalisation because the URLs are very similar? (Only the last three characters change for each page, in this case /akl.) Thanks in advance.
Intermediate & Advanced SEO | BusBoyNZ
-
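Crawl tools typically flag near-duplicates by overall page similarity, so a large shared template plus a thin unique payload can trip the threshold even when the actual data differs on every page. A rough illustration of the effect (the page text here is invented, and real tools use more sophisticated measures than this):

```python
from difflib import SequenceMatcher

# Two "pages" sharing a big template but carrying small unique payloads
# can look near-identical to a naive whole-page similarity check.
TEMPLATE = "Header nav footer booking-widget " * 50
page_akl = TEMPLATE + "Auckland departures 06:00 08:30 11:00"
page_wlg = TEMPLATE + "Wellington departures 07:15 09:45 12:30"

ratio = SequenceMatcher(None, page_akl, page_wlg, autojunk=False).ratio()
print(round(ratio, 2))  # close to 1.0: the pages are mostly shared template
```

This is why such pages get flagged as duplicates even though the timetable data is unique: the unique portion is a tiny fraction of the total page. Increasing the proportion of unique, crawlable content per page (or accepting the warning as a generalisation) are the usual responses.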
How do you transition a keyword rank from a home page to a sub-page on the site?
We're currently ranking #1 for a valuable keyword, but the result on the SERP is our home page. We're creating a new product page focused on this keyword to provide a better user experience and create more relevant content. What is the best way to make a smooth transition to make the product page rank #1 for the keyword instead of the home page?
Intermediate & Advanced SEO | buildasign