Suggested crawl rate in Google Webmaster Tools?
-
Hey Moz peeps, got a general question:
What is the suggested custom crawl rate in Google Webmaster Tools? Or is it better to "Let Google determine my crawl rate (recommended)"?
If you guys have any good suggestions on this, and can cite why, that would be very helpful. Thanks, guys!
-
Hi David,
According to Google's Webmaster Help page, the crawl rate option only controls how fast Google crawls your site, not how often. Check out the link below for more information.
Also, I wouldn't stress too much over the settings, because the search engines are going to do what they do. Just make sure to create good content and get it out there; the search engines will see this, and depending on how well it does, they will come more often on their own.
Hope this helps!
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=48620
-
Thanks for that, Alan. Yeah, I'd like to see what others have to say about this as well; I'm all ears!
-
I believe it is much better to allow the search engines to handle it themselves. Check your logs to see what they are doing, so you know if they are hammering your server. If they are, you can slow them down using the robots.txt file.

On a large site, Google will often fetch a few hundred thousand pages per day. Yahoo will do more, but their crawler isn't always friendly or smart; sometimes different Slurp crawlers will request the same pages. Bing emailed me six months ago and asked that we remove the 2-second delay I had added. I added it because they were hammering the server but only sending a few thousand readers per day. Google doesn't usually hammer the server, but when 5 or 6 bots are hitting you at the same time, it can affect your site's response time, and then some of them want to penalize you for giving visitors a poor experience.
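For reference, slowing crawlers down via robots.txt is done with the Crawl-delay directive. A minimal sketch with hypothetical values (note that Googlebot ignores Crawl-delay entirely; for Google you'd use the Webmaster Tools crawl-rate setting instead, while Slurp and Bingbot have historically honored it):

```
# robots.txt at the site root
# Ask Yahoo's and Bing's crawlers to wait 2 seconds between requests.
# Googlebot ignores Crawl-delay; use the GWT crawl-rate setting for Google.
User-agent: Slurp
Crawl-delay: 2

User-agent: bingbot
Crawl-delay: 2
```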
Related Questions
-
Should I use the Google disavow tool?
Hi, I'm a bit new to SEO and am looking for some guidance. Although there is no indication in Webmaster Tools that my site is being penalised for bad links, I have noticed that I have over 200 spam links for "Pay Day Loans" pointing to my site. (This was due to a hack on my site several years ago.) So my question is twofold. Firstly, is it normal to have spammy links pointing to your site? And secondly, should I bother to do anything about it? I did some research into the Disavow tool in Webmaster Tools and wonder if I should use it to block all these links. Thanks
Technical SEO | | hotchilidamo0 -
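For anyone weighing up the tool: the disavow file itself is just a plain-text list of URLs and `domain:` lines, uploaded through the Disavow Links tool in Webmaster Tools. A sketch with placeholder domains:

```
# disavow.txt - uploaded via Google's Disavow Links tool
# Lines starting with "#" are comments.
# Disavow one specific spammy page:
http://spam.example.com/payday-loans-link.html
# Disavow every link from an entire domain:
domain:spammy-directory.example.net
```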
Google Indexing - what did I miss?
Hello, all SEOers~ I just renewed my web site about 3 weeks ago, and in order to preserve SEO value as much as possible, I set up 301 redirects, an XML sitemap, and so on to minimize any possible data loss. But the problem is that about a week after the site renewal, my team somehow made a mistake and removed all the 301 redirects. So now my old site URLs are all gone from Google's index and my new site is not getting indexed by Google. My traffic and rankings are also gone... OMG. I checked Google Webmaster Tools, but it didn't show any special message other than that Googlebot found an increase in 404 errors, which is obvious. Also, I used "Fetch as Googlebot" from Webmaster Tools to increase the chance of indexing, but it doesn't seem to be doing much. I am re-implementing the 301 redirects today, but I am not sure it means anything anymore. Any advice or opinions? Thanks in advance~!
Technical SEO | | Yunhee.Choi0 -
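For reference, a minimal 301 redirect on an Apache server can be declared in .htaccess with mod_alias. A sketch; the old and new paths are hypothetical placeholders:

```
# .htaccess at the site root (Apache, mod_alias)
# Permanently redirect old URLs to their new equivalents.
Redirect 301 /old-page.html /new-page/
Redirect 301 /old-section/ /new-section/
```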
Suggestions on Website Recovery
Hello Mozzers! I have been tasked with recovering a site from a partial link penalty that was previously brought to my attention for this website, www.active8canada.com. Upon reviewing the site's backlinks and the reporting info in Google Webmaster Tools, I found there was no penalty showing. Could it have expired? We spent the last few months doing link cleanup, as we recognized that there were some bad links that needed to be addressed. After spending time categorizing them, we requested removal of all the bad links, targeting commercial anchor text and bringing those numbers back to acceptable levels. Following this, we did a disavow of the bad links which could not be removed through requests. We are actively building out additional content for the website, as we recognize that some pages have thin content. We have also earned some links to show positive signals during the cleanup, but have seen no change for better or worse. My question is: does anyone else see anything we could be missing here? Should I revisit the links again? Some of the links we disavowed are still showing in our backlink reports, but I cross-referenced our disavows with the existing backlink profile to try to get an accurate sense of the remaining links. We never saw a further decline in rankings after the disavow, so I'm led to believe that the links we removed had little, if any, impact. I am a little hesitant to begin earning new links through content and partnership outreach, as I still feel something is off that I can't quite put my finger on. It was previously confirmed that there was a penalty, but without that showing now in Google Webmaster Tools, I'm grasping at any possible angle I may have missed. If anyone has a couple of minutes to spare to shed some light on this situation, it would be greatly appreciated!
Technical SEO | | toddmumford0 -
Google Dancing?
Hello, I was wondering why my website, for some keywords, goes from the 2nd or 3rd page in Google to the 7th or even further sometimes? This has been happening for a while. Any suggestions? Thanks. Eugenio
Technical SEO | | socialengaged0 -
Google Webmasters Quality Issue Message
I am a consultant who works for the website www.skift.com. Today we received an automated message from Google Webmasters saying our site has quality issues. Since the message is very vague and obviously automated, I was hoping to get some insight into whether it is something to be very concerned about, and what can be done to correct the issue. From reviewing the Webmaster Quality Guidelines, the site is not in violation of any of them. I am wondering if this message was generated as a result of licensing content from Newscred, as I have other clients who license content from Newscred and are getting the same message from Google Webmasters. Thanks in advance for any assistance.
Technical SEO | | electricpulp0 -
How Often is Site Crawled
Good morning. I saw some errors in my first crawl and immediately removed the pages from my website. I then re-created my XML sitemap and uploaded it to Google. The question I have is: will the site be crawled again, so that the changes are recognized, in the next day or so? The pages were just placed on the site as test pages and should never have stayed up. The initial crawl that notified me when it was done found the errors, and the pages have since been removed. Thanks for your help. Peter
Technical SEO | | VT_Pete0 -
CDN Being Crawled and Indexed by Google
I'm doing an SEO site audit, and I've discovered that the site uses a Content Delivery Network (CDN) that's being crawled and indexed by Google. Two sub-domains from the CDN are being crawled and indexed, and a small number of organic search visitors have come through them. So the CDN-based content is out-ranking the root domain in a small number of cases. It's a huge duplicate content issue (tens of thousands of URLs being crawled). What's the best way to prevent the crawling and indexing of a CDN like this? Exclude via robots.txt? Additionally, the use of relative canonical tags (instead of absolute) appears to be contributing to this problem as well. As I understand it, these canonical tags are telling the SEs that each sub-domain is the "home" of the content/URL. Thanks! Scott
Technical SEO | | Scott-Thomas0 -
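To illustrate the relative-vs-absolute canonical point above: a relative canonical URL is resolved against whichever host served the page, so on a CDN sub-domain it ends up pointing back at the CDN copy. A sketch with a hypothetical domain:

```html
<!-- Relative: resolved against the serving host, so on
     cdn1.example.com this canonicalizes the CDN copy to itself. -->
<link rel="canonical" href="/widgets/blue-widget/">

<!-- Absolute: points at the root domain no matter which host served the page. -->
<link rel="canonical" href="http://www.example.com/widgets/blue-widget/">
```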
Crawl Errors In Webmaster Tools
Hi Guys, I've searched the web for an answer on the importance of crawl errors in Webmaster Tools but keep coming up with different answers. I have been working on a client's site for the last two months (and have just completed one month of link building); however, it seems I have inherited issues I wasn't aware of from the previous guy that did the site. The site is currently at page 6 for the keyphrase 'boiler spares', with a keyword-rich domain and a good on-page plan. Over the last couple of weeks it has been as high as page 4, only to be pushed back to page 8, and it has now settled at page 6. The only issue I can seem to find with the site in Webmaster Tools is crawl errors. Here are the stats:

In sitemaps: 123
Not found: 2,079
Restricted by robots.txt: 1
Unreachable: 2

I have read that e-commerce sites can often give off false positives in terms of crawl errors from Google; however, these not-found crawl errors are being linked from pages within the site. How have others solved the issue of crawl errors on e-commerce sites? Could this be the reason for the bouncing around in the rankings, or is it just a competitive niche where I need to be patient? Kind Regards Neil
Technical SEO | | optimiz10
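One practical way to chase down not-found errors like those above is to mechanically cross-reference the sitemap's URLs against the 404 list exported from Webmaster Tools. A rough sketch in Python; the file contents and URLs here are hypothetical placeholders:

```python
# Sketch: find sitemap entries that Webmaster Tools reports as 404s,
# so they can be fixed or dropped from the sitemap.
import xml.etree.ElementTree as ET

def sitemap_urls(xml_text):
    """Extract the <loc> values from a sitemap XML string."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}

def broken_sitemap_entries(xml_text, not_found_urls):
    """Return sitemap URLs that also appear in the 404 report."""
    return sitemap_urls(xml_text) & set(not_found_urls)

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/boiler-spares/</loc></url>
</urlset>"""

# URLs copied from the "Not found" export in Webmaster Tools.
reported_404s = [
    "http://www.example.com/boiler-spares/",
    "http://www.example.com/discontinued-part/",
]

print(broken_sitemap_entries(sitemap, reported_404s))
```

Any URL this prints is one the sitemap is actively telling Google to crawl even though it 404s; those are the ones worth fixing first.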