Googlebot takes 5 times longer to crawl each page
-
Hello All
Since about mid-September, Google Webmaster Tools (GWMT) has shown that the average time to crawl a page on my site has jumped from around 130ms to around 700ms, with peaks of 4000ms.
I have checked my server error logs and found nothing there, and I have checked with the hosting company; there are no issues with the server or with other sites on the same server.
Two weeks after this, my rankings fell by about 950 places for most of my keywords. I am really just trying to eliminate this as a possible cause of the ranking drops, or to work out whether it was the Panda / EMD algorithm updates that did it.
Many Thanks
Si
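As one way to go beyond the error logs mentioned above, the access logs can show how long Googlebot's own requests actually took. Below is a minimal Python sketch; it assumes an Apache-style access log with the request duration in microseconds (the %D format specifier) appended as the last field, which is a configuration choice, not a default, so adjust the parsing to your own log format:

```python
import re

# Assumption: each log line ends with the request time in microseconds (%D).
TIME_FIELD = re.compile(r'(\d+)\s*$')

def googlebot_times_ms(lines):
    """Response times (ms) for requests whose user agent mentions Googlebot."""
    times = []
    for line in lines:
        if "Googlebot" in line and (m := TIME_FIELD.search(line)):
            times.append(int(m.group(1)) / 1000.0)  # microseconds -> ms
    return times

# Shortened example lines; real entries would come from your access log file.
sample = [
    '66.249.66.1 - - [01/Oct/2012] "GET / HTTP/1.1" 200 5120 "-" "Googlebot/2.1" 702345',
    '10.0.0.5 - - [01/Oct/2012] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0" 90000',
]
times = googlebot_times_ms(sample)
print(f"{len(times)} Googlebot hits, avg {sum(times) / len(times):.0f} ms")
```

If the averages here roughly match the GWMT graph, the slowdown is real and server-side; if they don't, the issue is more likely network or reporting.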
-
Thank you for having a look
I made no structural changes around the time the issues started.
On the third graph in GWMT, yes, there was a spike in time spent downloading, and it is still a lot higher than before. I have added an image of it below.
There were two Google updates about two weeks later: the latest Panda and the new EMD update.
Most of the content was written by me from my own experience. There are some pages, identical to those on other sites, that I am in the process of removing or changing.
Until four months ago the layout used fixed-size nested tables; I am just getting my head around CSS to try to drag the site into the 21st century.
-
Hi.
Based on the site's size (number of pages) and format (code, elements and structure), plus two speed tests I just ran on it and a traceroute (from Austria), it looks like you don't have any issues with it from a technical point of view.
One thing you still need to check is the "time spent downloading a page" graph (the third one) within GWMT. Did it spike at the same time the number of pages crawled went down?
A few other questions you should consider:
-
Did you make any changes, especially structural changes, around the same time you noticed the issues?
-
Were there any public Google updates in the same timeframe as the changes you noticed? (You can check them here: http://www.seomoz.org/google-algorithm-change)
-
Is your content duplicated? (Duplicated from external sources, I mean, not internally.)
Please don't get me wrong: I would be OK with the format of the site if it were very old, from before 2000. But the domain is from 2008; you should get on track with current trends in layout, content format, and website structure in general.
Hope it helps.
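Along the lines of the speed tests mentioned above, you can also time full page downloads yourself from another network and compare against the GWMT graph. Here is a minimal Python sketch; the timing approach is generic, the default fetcher is just urllib, and the commented-out URL is the one mentioned in this thread:

```python
import time
import urllib.request

def fetch_time_ms(url, attempts=3, fetch=None):
    """Average wall-clock time (ms) to download a URL over a few attempts."""
    if fetch is None:
        def fetch(u):
            with urllib.request.urlopen(u, timeout=30) as resp:
                resp.read()  # read the full body, as a crawler would
    total = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        fetch(url)
        total += time.perf_counter() - start
    return total / attempts * 1000.0

# Example usage against a live site:
# print(fetch_time_ms("http://www.growingyourownveg.com/"))
```

Averaging several attempts smooths out one-off network blips; a consistent reading far above the old ~130ms baseline would point at the server rather than at Googlebot's reporting.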
-
-
Hi
I am as sure as I can be, but not being a full expert on these things, I may have missed something technical.
I have been making changes to the site since then, mainly to the CSS layout.
The site is www.growingyourownveg.com
Thanks
-
Hi,
As far as I know, a low crawl rate won't cause bad rankings, but bad rankings will lead to a lower crawl rate.
If you are sure, and I mean really sure, that you don't have any technical issues on your side that would influence the crawl rate and possibly also the rankings, then you should consider that you may actually have a -950 penalty causing your rankings to drop: Google doesn't consider your site an authority, and for that reason it won't crawl your site as often as it used to.
Can you share the URL of the site? Just to have a look and see if, at first glance, there is any obvious reason for Google to dislike your site.
Cheers !