Googlebot cannot access your site
-
At the end of July I received a message in Google Webmaster Tools saying "Googlebot can't access your site".
We checked our robots.txt file, removed a stray line break from it, and then had Google fetch the file again. I have not received any more messages since then.
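For reference, this is the kind of stray line break that can cause trouble in a robots.txt file; the rules shown are hypothetical, not the site's actual file:

```
# Broken: under the original robots.txt spec, a blank line ends a
# record, so some parsers see a Disallow rule with no User-agent.
User-agent: *

Disallow: /admin/

# Fixed: the rule sits directly under its User-agent line.
User-agent: *
Disallow: /admin/
```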
When we created the website I wrote all of the content and optimized each page for roughly one local keyword. A few weeks later I checked my keywords and had a few on the first page of Google. Since then almost all of them have completely disappeared. Because we made no link-building effort I would not expect to still be on the first page, but I should definitely be seeing them before the fifth, or even tenth, page of Google.
The address is http://www.tile-pompanobeach.com
I'm not sure if these horrible results have something to do with the message from Google or something else.
The problem is this client now wants to sign an SEO contract with us, and I really have no idea what happened or whether I will be able to figure it out.
The main keyword for my home page is "tile pompano beach",
and I was also using "Pompano Beach tile store" for the About page, which was previously on the first page of Google.
Does anyone have some input?
-
Have you tried going to Google Webmaster Tools and performing a Fetch as Googlebot? Do you still have the old Google Analytics (GA) tracking code from before the asynchronous snippet was introduced?
If the answer to both questions is yes, try updating your GA code.
http://blog.jitbit.com/2012/08/fixing-googlebot-cant-access-your-site.html
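As a quick sanity check outside of Webmaster Tools, you can parse a robots.txt body with Python's standard library and confirm that Googlebot would be allowed to crawl. A minimal sketch, using hypothetical rules rather than the site's actual file:

```python
# Parse a robots.txt body and ask whether Googlebot may fetch
# given paths. The rules here are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots_body = [
    "User-agent: *",
    "Disallow: /admin/",
]

parser = RobotFileParser()
parser.parse(robots_body)

print(parser.can_fetch("Googlebot", "/about/"))   # True: allowed
print(parser.can_fetch("Googlebot", "/admin/x"))  # False: blocked
```

If the parser blocks paths you expect to be open, the file itself is the problem rather than the server configuration.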
-
Hi Anderson,
Just took a very quick look at your source code, and my first reaction is that there is something "unique" happening with the naming within your site structure.
To be more precise ... you have multiple instances like this in the code:
http://www.tile-pompanobeach.com/sites/tile-pompanobeach.com/files/logo.png
Since the keyword you are targeting for that page is "tile pompano beach", it is not much of a stretch to think the repetition could be an attempt to manipulate rankings.
Given that you are saying rankings have plummeted to as low as page 10, I would guess that the repetition of your keyword term (exact match domain) in the source code is being seen as keyword stuffing.
It would be interesting to know exactly when the rankings went into freefall, but my guess would be perhaps around the end of April.
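A rough way to quantify the repetition Sha describes is to count how often the keyword term appears in the downloaded page source. A sketch, where the HTML snippet is a hypothetical stand-in for the live page:

```python
# Count case-insensitive occurrences of the keyword term in a
# page's HTML source. The snippet is a hypothetical stand-in.
import re

html = """
<img src="http://www.tile-pompanobeach.com/sites/tile-pompanobeach.com/files/logo.png">
<a href="http://www.tile-pompanobeach.com/sites/tile-pompanobeach.com/files/page.html">
"""

keyword = "tile-pompanobeach"
count = len(re.findall(re.escape(keyword), html, flags=re.IGNORECASE))
print(f"'{keyword}' appears {count} times")  # 4 in this snippet
```

Every asset URL under that /sites/ path doubles the keyword count, which is why the repetition adds up quickly across a full page.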
Hope that helps,
Sha
-
I don't think the issue is your domain or the notifications. I received a similar notification, looked into it, and realized my site had gone offline for a short time.
The site looks pretty new. I suspect the lack of links, plus the fact that Google actually runs several different data centers (which update their indexes at different times), is the factor... just my guess, though.
-
Oh, and I have been reading a few things about using hyphens in URLs. I had previously read that hyphens were good because Google reads the keywords separately, but that was a while ago. Now when I look into it, I see that some people think (and have tested) that using a hyphen in a site's main URL can hurt SEO.
Do you think my domain could be the problem?