Crawl rate drop
-
Hi Guys,
I have a crawl rate drop in Webmaster Tools and can't figure out why. In the last month I removed a lot of duplicate pages that we don't need anymore; there were at least 1.5 million pages. Could this be the reason?
-
Hi there.
Of course it can! Crawl rate is measured in pages per day, so if you remove pages (especially 1.5 million of them), there simply isn't as much left to crawl.
It can also happen if your pages are static and the crawler has already crawled them all. Google doesn't crawl every page all the time; its resources are limited. So if you launched or updated your website recently and aren't really updating it now, you can see a change in crawl rate.
However, if you need Google to crawl your website constantly (usually useful for sites undergoing reconstruction), you can change the crawl rate setting; it holds for 90 days. Details here: https://support.google.com/webmasters/answer/48620?hl=en
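Since Webmaster Tools only charts the crawl rate for you, it can help to verify the drop independently in your own server logs. A minimal sketch, assuming a standard combined-format Apache/Nginx access log (the function name and log layout are assumptions, and matching by user-agent string alone can be spoofed):

```python
import re
from collections import Counter

def googlebot_hits_per_day(lines):
    """Count access-log requests per day whose user agent mentions Googlebot.

    Assumes the common Apache/Nginx log format, where the timestamp
    looks like [01/Jan/2024:00:00:01 +0000].
    """
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = re.search(r"\[(\d{2}/\w{3}/\d{4}):", line)
        if m:
            counts[m.group(1)] += 1  # key is the date part, e.g. 01/Jan/2024
    return counts
```

Feeding it an open log file and printing the sorted result gives a day-by-day crawl count to compare against the Webmaster Tools chart.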
Related Questions
-
Ranking drop
Hello there, based on Moz's rank tracker, our keyword rankings have been dropping. Does anyone know what might be the cause? We have been building quality "white hat" links that are very relevant to our niche. Thanks, Robert
Intermediate & Advanced SEO | roberthseo
-
After Server Migration - Crawling Gets Slow and Content Changes on Dynamic Pages Are Not Getting Updated
Hello, I performed a server migration two days ago. All is well, and traffic has moved to the new servers. However, with the previous host, a newly submitted article would get indexed within minutes. Now, even after submitting a page for indexing, it takes a while to appear in search engines, and on pages where content is updated daily, the changes are not being reflected despite being submitted for indexing. The site is http://www.mycarhelpline.com. I have checked robots, meta tags, and URL structure; all remain intact, and no unknown errors are reported in Google Webmaster Tools. Could someone advise: is this normal due to the name server and IP address change, and can I expect it to correct itself automatically, or am I missing something? Thanks
Intermediate & Advanced SEO | Modi
-
Google crawling different content--ever ok?
Here are a couple of scenarios I'm encountering where Google will crawl different content than my users see on an initial visit to the site, and which I think should be OK. Of course, it is normally NOT OK; I'm here to find out if Google is flexible enough to allow these situations:

1. My mobile-friendly site has users select a city, and then it displays a location-options div, which includes an explanation of why they may want the program to use their GPS location. The user must choose GPS, the entire city, a zip code, or a suburb of the city, which then goes to the chosen link. On the other hand, the program is written so that a Googlebot doesn't get a meaningless 'choose further' page; instead, the crawler sees the page of results for the entire city (as you would expect from the URL). So basically the program defaults to the entire city's results for Googlebot, but the user first gets the ability to choose GPS.

2. A user comes to mysite.com/gps-loc/city/results. The site, seeing the literal words 'gps-loc' in the URL, fetches the GPS coordinates for the user's location and returns results dependent on it. If Googlebot comes to that URL, there is no way the program can return the same results, because it can't get the same latitude and longitude as that user.

So, what do you think? Are these scenarios a concern for getting penalized by Google? Thanks, Ted
Intermediate & Advanced SEO | friendoffood
-
Dropping dramatically in keyword rankings
One of my clients has always ranked well for the keyword "janitorial services, nh". Then, within a week, they dropped out of the top fifty in Google, and now in other major search engines as well. My first thought is that the drop in rankings is due to duplicate listings within online directories. The previous marketing person on staff listed the company more than once in these directories, and it wasn't discovered until later in the link-building process. Sometimes the company was listed with "janitorial services" as part of the company name, and then listed again with "carpet cleaning" as part of the company name, sometimes with a duplicate address, or using the PO box instead, as if there were two companies.

The odd thing in all this is that while they dropped in ranking for this keyword, they still usually come in 1st in Google Places for it, with 12 excellent reviews. And yet when I check their Google Places account, it says that it needs to be reverified; again, it doesn't meet the terms. (The company is a family-owned business of over 30 years; they have a lot of potential.)

So all this duplication needs to be fixed, but how serious are duplicate listings on sites like Manta, YellowPages, SuperPages, Yahoo Business Local, and Bing Business Directory? And now that "forensics" seems to be my task, any suggestions on how to start? Any processes I should go through in Google Webmaster Tools? _Cindy

And, too, if I could add: the site ranks very poorly for this keyword, and while I have provided recommendations, and they understand the onsite issue, they have yet to go forward with implementation, making this a little more difficult.
Intermediate & Advanced SEO | CeCeBar
-
Bounce rate and average visit time in an e-commerce site
Dear all, I am managing a Belgian online pharmacy (www.pharma2go.be). The site has a quite high bounce rate (+/- 79%) and a low average visit time (< 1 minute). This could be related to choices made in the past: the site was also built in English, but without English product texts available (only Dutch and French), so that people all over Europe could order. Another possible reason is that prescription-only products are shown even though they cannot be ordered; this was chosen so the site could still offer visitors the product leaflets as a service. I am wondering if it would be beneficial for SEO to remove the English version and the prescription-only products, at least if this would lower the bounce rate and increase the average visit time. Thanks for your input. Kindest regards, Stefaan
Intermediate & Advanced SEO | stefaanva
-
Can you explain why the site is dropping off Google every other week?
Can anyone offer any insight into why, since the Google Panda update, www.bedandbreakfastsguide.com has been fluctuating on Google so much? One week it's ranked as it used to be; the next it's nowhere to be seen. If you take a look at the screenshot of our traffic (taken after a 75% loss, which happened in two stages), you'll see we get traffic for a week and then nothing. This has been happening for months. Some points that might be involved:

- Around the same time, the SEO guys suggested setting the canonical URL to www.bedandbreakfastsguide.com (before, there wasn't one, so traffic was coming from both www and non-www).
- A lot of the original URLs have been consolidated and rel="canonical" added throughout.
- The "pages" of results have all had rel="canonical" set to page 1.

Could it be that the www is competing with the non-www despite the 301 redirects? We're doing everything we can to help this client (and have reduced their site errors from the millions to the low tens of thousands), so it's not filling them with confidence when their site just keeps plummeting! What's also irritating/odd is that some of their competitors, who used to be ranked lower and have sites which contradict every rulebook, still rank high. Hopefully you can spot something we've missed. Tim
Intermediate & Advanced SEO | TimGaunt
-
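One way to rule out the www vs. non-www concern above is to confirm that every non-www URL 301s to its exact www twin. A small helper for computing the expected redirect target, a sketch only (the host comes from the question; checking the live redirect and Location header would be done separately with an HTTP client):

```python
from urllib.parse import urlsplit, urlunsplit

def expected_canonical(url, canonical_host="www.bedandbreakfastsguide.com"):
    """Return the URL a request should 301 to under a single canonical host,
    or None if the URL is already on that host (no redirect expected)."""
    parts = urlsplit(url)
    if parts.netloc == canonical_host:
        return None
    # Preserve path and query; only the host changes.
    return urlunsplit((parts.scheme, canonical_host, parts.path, parts.query, ""))
```

Running a crawl's URL list through this and comparing each result against the actual redirect (with redirects disabled in the client) quickly shows whether any non-www URLs are still resolving on their own.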
I try to apply best duplicate content practices, but my rankings drop!
Hey, an audit of a client's site revealed that, due to their shopping cart, all their product pages were being duplicated: http://www.domain.com.au/digital-inverter-generator-3300w/ and http://www.domain.com.au/shop/digital-inverter-generator-3300w/. The easiest solution was to just block all /shop/ pages in Google Webmaster Tools (redirects were not an easy option). This was about 3 months ago, and in months 1 and 2 we undertook some great marketing (soft social bookmarking, updating the page content, Flickr profiles with product images, product manuals on SlideShare, etc.). Rankings went up and so did traffic. In month 3, the change in robots.txt finally hit, and rankings have decreased quite steadily over the last 3 weeks. I'm so tempted to take off the robots restriction on the duplicate content... I know I shouldn't, but it was working so well without it. Ideas, suggestions?
Intermediate & Advanced SEO | LukeyJamo
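Before removing the restriction, it can be worth confirming exactly what the Disallow rule blocks, e.g. with Python's standard-library robots.txt parser. Worth noting: a robots.txt block stops Googlebot from crawling /shop/ pages at all, so a rel="canonical" on them can never be seen, and signals pointing at those URLs are dropped rather than consolidated onto the main product pages. A sketch using the two URLs from the question:

```python
from urllib.robotparser import RobotFileParser

# Reproduce the rule that was used to block the duplicate /shop/ pages.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /shop/",
])

# The duplicate is blocked from crawling; the main product page is not.
blocked = rp.can_fetch("Googlebot", "http://www.domain.com.au/shop/digital-inverter-generator-3300w/")
allowed = rp.can_fetch("Googlebot", "http://www.domain.com.au/digital-inverter-generator-3300w/")
```

If the cart software allows it, a rel="canonical" from each /shop/ URL to its non-/shop/ twin (with the robots block lifted) would let Google consolidate the duplicates instead of simply losing them.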