Crawl rate dropped to zero
-
Hello, I recently moved my site on GoDaddy from cPanel hosting to Managed WordPress. I bought this transfer directly from GoDaddy customer service. In the process they accidentally changed my domain from www to non-www. I changed it back after the migration, but as a result the site's crawl rate in Search Console fell to zero and has not risen at all since then.
Beyond this, the website does not display any other errors; I can ask Google to manually fetch my pages and it works as before, only the crawl rate seems to have dropped permanently. GoDaddy customer service also claims they do not see any errors, but I think they caused this during the migration when the URL changed, since the timing matches perfectly. Also, when they accidentally removed the www, the crawl rate of the non-www version of my site went up, but fell back to zero when I changed it back to the www version. Now the crawl rate of both the www and non-www versions is zero. How do I get it to rise again? Customer service also said the problem may be related to the FTP data of Search Console, but they were not able to help any more than that. Would someone here be able to help me with this in any way, please?
-
Hello, answers to the bolded questions:
- At this rate, how long would it take Google to crawl all of your pages, (maybe it feels 10-15 is fast enough)? Over 50 days. I still cannot believe it would be just a coincidence that the crawl rate dropped so suddenly only because Google suddenly decided my pages should not be crawled that often. After all, the amount of new content, the quality of new links, and all the other factors keep improving on my site, and before the drop the crawl rate increased steadily. It has to be some technical issue?
- Has the average response time increased? If so, maybe Google feels it's overloading the server & backing off. No, it has actually gone down a little bit (not much, though).
-
Interesting. I have 2 more thoughts:
- At this rate, how long would it take Google to crawl all of your pages, (maybe it feels 10-15 is fast enough)?
- Has the average response time increased? If so, maybe Google feels it's overloading the server & backing off.
-
The crawl rate is still extremely slow, averaging 10-15 pages per day, except when I send pages to be manually crawled; then it crawls those pages. Before the drop the crawl rate was never under 200 per day and was usually over 1,000. Anything more I can do? It seems to have no effect on my rankings or anything else as far as I can see, but I would still like this to be fixed. It has to be something to do with the fact that I changed my hosting to GoDaddy Managed WordPress hosting, but they have no clue what could cause this. The robots.txt change seemed to have no effect, or a very minimal one.
-
Not that I'm aware of, unfortunately. Patience is an important skill when dealing with Google.
-
Thanks! I will try that. I see that Search Console shows crawl rates with a few days' delay; is there somewhere I could check whether it is working instantly?
-
I thought of one other possibility: Your sitemap.xml is probably auto-generated, so this shouldn't be a problem, but check to make sure that the URLs in the sitemap.xml have the www.
Other than that I'm out of ideas - I would wait a few days to see what happens, but maybe someone else with more experience watching Google will have seen this before. If it does resolve, I'd like to know what worked.
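If you'd rather script that sitemap check than eyeball the file, here's a minimal sketch in Python. It parses the sitemap XML and flags any `<loc>` entries missing the www; the sample XML and example.com domain are placeholders, not your real sitemap:

```python
from urllib.parse import urlparse
from xml.etree import ElementTree

# Standard sitemap namespace, per the sitemaps.org protocol.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def non_www_locs(sitemap_xml: str) -> list[str]:
    """Return every <loc> URL in the sitemap whose host lacks the www prefix."""
    root = ElementTree.fromstring(sitemap_xml)
    locs = [el.text.strip() for el in root.iter(NS + "loc")]
    return [u for u in locs if not urlparse(u).netloc.startswith("www.")]
```

Feed it the downloaded text of your sitemap.xml; an empty result means every URL carries the www.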
-
I'm not convinced that robots.txt is causing your problem, but it can't hurt to change it back. In fact, while looking for instructions on how to change it I came across this blog post by Joost de Valk (aka Yoast) that pretty much says you should remove everything that's currently in your robots.txt - and his arguments hold for every line of it:
- Blocking wp-content/plugins will stop Google from loading JS and/or CSS resources that it might need to render the page properly.
- Blocking wp-admin is redundant, because if wp-admin is linked it can still be found anyway, and the important pages already send an X-Robots-Tag HTTP header that says not to index them.
If you're using Yoast SEO, here are instructions on how to change the robots.txt file.
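Following that advice, the stripped-down file would be little more than this (a sketch of the minimal form, not something the post prescribes verbatim):

```
User-agent: *
Disallow:
```

An empty Disallow line allows everything, so Google can fetch the CSS and JS it needs to render your pages.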
-
Hi, one more thing. Are you 100% sure that the robots.txt file has nothing to do with this? It changed at the same time the problems started to occur. It used to be:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

But now it is:
User-agent: *
Crawl-delay: 1
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

At the same time, "blocked resources" notifications started to appear in Search Console:
Blocked Resources: "Rendering without certain resources can impair the indexing of your web pages." Status as of 3/19/16: 152 pages with blocked resources.
This has to have something to do with it right?
-
Thank you for your answer; my answers are bolded below.
- Do you see any crawl errors in the Google Search Console? **Nothing new after the crawl rate dropped, just some old soft 404 errors and old not-found errors.**
- If you search for your site on Google, what do you see, (does your snippet look normal)? Yes, everything looks perfectly normal, just like before the crawl rate dropped.
- How many pages does Google say it has indexed? Is it possible it's indexed everything and is taking a break, (does it even do that?) I don't think this is possible, since the crawl rate dropped almost instantly from an average of 400 to zero after the site migration.
One theory is: When you moved to the non-www version of the site, Google started getting 301s redirecting it from www to non-www, and now that you've gone back to www it's getting 301s redirecting it from non-www to www, so it's got a circular redirect. If this is the problem, how should I start getting it fixed?
Here's what I would do to try to kick-start indexing, if you haven't already:
- Make sure you have the "Preferred Domain" set to the www version of your site in both the www and non-www versions of your site in Google Search Console. Yes, that is how it has been all the time.
- In the Search Console for the www-version of your site, re-submit your sitemap. Done.
- In the Search Console for the www-version of your site, do a Fetch as Google on your homepage, and maybe a couple of other pages, and when the Fetch is done use the option to submit those pages for indexing, (there's a monthly limit on how much of this you can do). I have done this many times since I noticed the problem; Fetch as Google works normally without any issues.
Is there anything more I can do? If I want to hire someone to fix this, are there any recommendations? I am not a tech guy, so this is quite a difficult task for me.
-
I don't know why this is happening, but this is what I would check:
- Do you see any crawl errors in the Google Search Console?
- If you search for your site on Google, what do you see, (does your snippet look normal)?
- How many pages does Google say it has indexed? Is it possible it's indexed everything and is taking a break, (does it even do that?)
One theory is: When you moved to the non-www version of the site, Google started getting 301s redirecting it from www to non-www, and now that you've gone back to www it's getting 301s redirecting it from non-www to www, so it's got a circular redirect.
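To see whether a circular redirect is actually happening, you can walk the Location headers yourself. This is a rough sketch: the example.com URLs are placeholders, and in practice `location_of` would issue a HEAD request with redirects disabled and return the Location header (or None when the response isn't a redirect):

```python
def follow_redirects(start: str, location_of, max_hops: int = 10):
    """Follow a redirect chain; return (final_url, looped)."""
    seen = [start]
    url = start
    for _ in range(max_hops):
        nxt = location_of(url)
        if nxt is None:        # no redirect: the chain ends normally
            return url, False
        if nxt in seen:        # already visited this URL: circular redirect
            return nxt, True
        seen.append(nxt)
        url = nxt
    return url, True           # too many hops: treat as a loop
```

A healthy setup should go non-www to www in one hop and stop; if the function reports a loop for either hostname, the server-side redirect rules are fighting each other.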
Here's what I would do to try to kick-start indexing, if you haven't already:
- Make sure you have the "Preferred Domain" set to the www version of your site in both the www and non-www versions of your site in Google Search Console.
- In the Search Console for the www-version of your site, re-submit your sitemap.
- In the Search Console for the www-version of your site, do a Fetch as Google on your homepage, and maybe a couple of other pages, and when the Fetch is done use the option to submit those pages for indexing, (there's a monthly limit on how much of this you can do).
Good luck!
-
That's not so horrible - it just says not to crawl the plugins directory or the admin, and to delay a second between requests. You probably don't want your plugins or admin directories being indexed, and according to this old forum post Google ignores the crawl-delay directive, so the robots.txt isn't the problem.
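For what it's worth, you can confirm that reading with Python's built-in robots.txt parser. A quick sketch, with example.com standing in for the real site:

```python
from urllib import robotparser

robots_txt = """User-agent: *
Crawl-delay: 1
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Regular content is still crawlable; only the plugins and admin paths are blocked.
print(rp.can_fetch("Googlebot", "https://www.example.com/some-post/"))               # True
print(rp.can_fetch("Googlebot", "https://www.example.com/wp-content/plugins/x.js"))  # False
```

So nothing in that file tells Google to stop crawling the site's actual pages.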
-
Hi, my robots.txt file looks like this:
User-agent: *
Crawl-delay: 1
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
This is not how it is supposed to look, right? Could this cause the problem?