WMT - Googlebot can't access your site
-
Hi
On our new website, which is just a few weeks old, I get the following message when I log into Webmaster Tools:
"Googlebot can't access your site - The overall error rate for DNS queries is 50%."
What do I need to do to resolve this? I have never had this problem before with any of my sites. The domains are with Fasthosts (UK) and hosting is with DreamHost. What is the recommended course of action? Google mentions contacting your host, in my case DreamHost, but what do you need to ask them in a support ticket? Oddly, when I do a fetch in WMT, the fetch status is a success.
-
Yes, it is most likely a configuration entry in the DNS records held with your website hosting company.
-
Your hosting company: DreamHost. (If a WMT fetch is OK, that means the DNS settings are basically fine, and something might be wrong at the hosting level.)
-
Which one should I contact?
The domain company - Fasthosts
or
The hosting company - dreamhost
-
Yes, that's right. They can check what's wrong.
-
Should I raise a ticket with the host, or with the company the domain is registered and managed with?
-
Google or a particular server could have the DNS cached, or there might be multiple DNS servers, which would explain why it sometimes works with no issues.
If you have another domain with no similar issues, it could be a minor misconfiguration within the DNS entry for that domain only. Raise a ticket with your host, include a screenshot of the issue from Google, and their techs should know where to look.
I hope this helps.
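Before opening the ticket, you can sanity-check the DNS side yourself. A minimal sketch using only Python's standard library; "example.com" is a placeholder for the affected domain:

```python
import socket
import time

def dns_failure_rate(domain, attempts=20, pause=0.0):
    """Resolve `domain` repeatedly and return the percentage of failed lookups."""
    failures = 0
    for _ in range(attempts):
        try:
            socket.getaddrinfo(domain, 80)  # the same kind of lookup a crawler performs
        except socket.gaierror:
            failures += 1
        time.sleep(pause)  # small pause so results aren't all one cached answer
    return 100.0 * failures / attempts

# Placeholder domain; substitute the affected site.
print(dns_failure_rate("example.com", attempts=5, pause=0.5))
```

A rate anywhere near the 50% Google reports is strong evidence for the ticket. A clean 0% from your own machine may just mean your local resolver has the record cached, which is why testing from several networks helps.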
-
Hi
I have other sites hosted with DreamHost and there is no mention of this message in WMT for them.
Could it be something within the actual DNS of the domain itself? It has domain privacy set, but surely that wouldn't be causing it.
It's also weird that when I do a fetch in Google WMT, it always brings back a success message.
-
Do you have any other sites hosted with the same DNS? It looks like a DNS issue for sure. If I were you, I would move my hosting. Maybe I am being paranoid, but I don't like DNS issues. It's not an SEO issue, it's a site uptime issue. If bots can't access my site 50% of the time, could it be that a lot of my users can't access my website either? 50% is a very big number. If possible, I would transfer my domain name to a different registrar and a different hosting company, just to be safer.
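The "could users be failing too" worry can be measured crudely from any machine. A sketch using only the standard library; the URL is a placeholder for the affected site:

```python
import urllib.request
import urllib.error

def failed_fetches(url, attempts=5):
    """Request `url` repeatedly and count DNS, connection, and HTTP failures."""
    failures = 0
    for _ in range(attempts):
        try:
            # HTTPError (4xx/5xx) is a subclass of URLError, so it counts too.
            urllib.request.urlopen(url, timeout=10).close()
        except (urllib.error.URLError, OSError):
            failures += 1
    return failures

# Placeholder URL; point this at the affected site from a few different networks.
print(failed_fetches("https://example.com/", attempts=3))
```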
Related Questions
-
My company bought another company. How long do I keep the purchased company's site live?
How long do I wait before redirecting the purchased company's webpages to the equivalent pages on our site? Do I need to keep some sort of announcement about the merger live on their website for a while first?
Technical SEO | movingbusinessestothecloud
-
'domain:example.com/' - is this line, with a '/' at the end of the domain, valid in a disavow file?
Hi everyone. Just out of curiosity, what would happen if my disavow file contained the line domain:example.com/ instead of domain:example.com as recommended by Google? I was just wondering whether adding a / at the end of a domain would automatically render the line invalid and ignored by Google's disavow backlinks tool. Many thanks for your thoughts
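Nobody outside Google can say for certain how the tool treats a trailing slash, but you can lint your own file against the documented format before uploading. A minimal sketch; the validation logic here is my own, not Google's:

```python
import re

# A `domain:` line should carry a bare hostname: no scheme, path, or trailing slash.
DOMAIN_LINE = re.compile(r"^domain:[a-z0-9.-]+$", re.IGNORECASE)

def disavow_line_ok(line):
    """Return True if a disavow-file line looks well-formed."""
    line = line.strip()
    if not line or line.startswith("#"):      # blank lines and comments are fine
        return True
    if line.startswith("domain:"):
        return bool(DOMAIN_LINE.match(line))
    return line.startswith("http")            # anything else should be a full URL

print(disavow_line_ok("domain:example.com"))   # True
print(disavow_line_ok("domain:example.com/"))  # False: trailing slash
```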
Technical SEO | | LabeliumUSA0 -
Why are only PDFs on my client's site being indexed, and not actual pages?
My client has recently built a new site (we did not build this), which is a subdomain of their main site. The new site is: https://addstore.itelligencegroup.com/uk/en/. (Their main domain is: http://itelligencegroup.com/uk/) This new Addstore site has recently gone live (in the past week or so) and so far, Google appears to have indexed 56 pdf files that are on the site, but it hasn't indexed any of the actual web pages yet. I can't figure out why though. I've checked the robots.txt file for the site which appears to be fine: https://addstore.itelligencegroup.com/robots.txt. Does anyone have any ideas about this?
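When pages stay unindexed while PDFs get picked up, it is worth checking what the robots rules actually permit rather than eyeballing the file. A sketch using Python's standard-library parser; the rules below are hypothetical stand-ins, not the site's real file:

```python
import urllib.robotparser

def is_allowed(robots_lines, user_agent, url):
    """Parse robots.txt content and report whether `user_agent` may fetch `url`."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_lines)
    return rp.can_fetch(user_agent, url)

# Hypothetical rules for illustration; use rp.set_url(...) / rp.read()
# to test the live file instead.
rules = ["User-agent: *", "Disallow: /private/"]
print(is_allowed(rules, "Googlebot", "https://addstore.itelligencegroup.com/uk/en/"))
```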
Technical SEO | mfrgolfgti
-
How to create a sitemap for a large site (ecommerce type) that has thousands, if not hundreds of thousands, of pages.
I know this is kind of a newbie question but I am having an amazing amount of trouble creating a sitemap for our site Bestride.com. We just did a complete redesign (look and feel, functionality, the works) and now I am trying to create a site map. Most of the generators I have used "break" after reaching some number of pages. I am at a loss as to how to create the sitemap. Any help would be greatly appreciated! Thanks
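Rather than fighting generators that break, large sites usually ship a sitemap index that points at several child sitemaps, since the protocol caps each file at 50,000 URLs. A rough sketch of the splitting, assuming you can already enumerate your URLs (the file names and base URL are placeholders):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_sitemap(urls, path):
    """Write one child sitemap file (the protocol caps each file at 50,000 URLs)."""
    root = ET.Element("urlset", xmlns=NS)
    for u in urls:
        ET.SubElement(ET.SubElement(root, "url"), "loc").text = u
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

def write_sitemap_index(all_urls, base="https://example.com/sitemap"):
    """Split a large URL list into 50k-URL child sitemaps plus one index file."""
    chunks = [all_urls[i:i + 50000] for i in range(0, len(all_urls), 50000)]
    index = ET.Element("sitemapindex", xmlns=NS)
    for n, chunk in enumerate(chunks, 1):
        write_sitemap(chunk, f"sitemap-{n}.xml")
        ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = f"{base}-{n}.xml"
    ET.ElementTree(index).write("sitemap-index.xml", encoding="utf-8", xml_declaration=True)

# Example: 5 URLs fit in one child sitemap plus an index file.
write_sitemap_index([f"https://example.com/page/{i}" for i in range(5)])
```

You then submit only sitemap-index.xml in Webmaster Tools; Google discovers the child files from it.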
Technical SEO | BestRide
-
Can JavaScript affect Google's index/ranking?
We changed our website template about a month ago and since then we have experienced a huge drop in rankings, especially for our home page. We kept the same URL structure across the entire website, pretty much the same content, and the same on-page SEO. We expected some rank drop, but not one this big. The homepage used to rank at the top of the second page, and now we have lost about 20-25 positions. What we changed is the homepage structure: it is more user-friendly, with much better organized information, and we also have a slider presenting our main services. About 80% of our homepage content sits inside the slideshow and 3 tabs, but all these elements are JavaScript. The content is unique and SEO optimized, but when I disable JavaScript it becomes completely unavailable. Could this be the reason for the huge rank drop? I used Webmaster Tools' Fetch as Googlebot tool and it looks like Google reads what's inside the JavaScript slideshow perfectly, so I did not worry until I found this on SEOmoz: "Try to avoid ... using javascript ... since the search engines will ... not index them ...". One more weird thing: although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is from the old website. All main URLs are the same; we removed some old ones that we don't need anymore, so we kept all the inbound links, and the 301 redirects are properly set. But still, we have a huge rank drop. Also (not sure if this is important or not), the robots.txt file disallows some folders like images, modules, templates... (Joomla components). We still have some HTML errors and warnings, but far fewer than the old website had. Any advice would be much appreciated, thank you!
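One way to see roughly what a non-JS crawler gets is to strip everything inside script blocks from the raw HTML and look at what text remains. A stdlib-only sketch, with a made-up snippet standing in for the slider markup:

```python
from html.parser import HTMLParser

class VisibleTextParser(HTMLParser):
    """Collect text that appears in the raw HTML outside <script> blocks --
    roughly what a crawler sees without executing JavaScript."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True
    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False
    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.chunks.append(data.strip())

def visible_text(html):
    parser = VisibleTextParser()
    parser.feed(html)
    return " ".join(parser.chunks)

# Hypothetical markup: plain text survives, slide copy held in a JS array does not.
sample = "<div>Our services</div><script>slides=['Hidden slide copy'];</script>"
print(visible_text(sample))  # Our services
```

Content that only exists inside script variables never shows up this way, which is exactly the risk the quoted SEOmoz advice is about.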
Technical SEO | echo1
-
I have a WordPress site with 30+ categories and about 2k tags. I'd like to bring that number down for each taxonomy. What is the proper practice for doing that?
I want to bring my categories down to about 8 or so, and the tags... they're just a mess and I'd really like to bring that figure down significantly and set up a standard for usage. My thought was to remove the un-needed tags and categories and set up 301 redirects for the ones that I'm removing. Is that even necessary? Are there tools that can assist with this? What are the "gotchas" I should be aware of? Thanks!
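The 301s themselves are worth automating once you've decided which tags merge into which. A sketch that emits Apache-style redirect lines from a merge map; the tag names and the /tag/ path are hypothetical, and the same map would feed a WordPress redirection plugin just as well:

```python
# Hypothetical merge map: retired tag -> surviving tag.
tag_merges = {
    "seo-tips": "seo",
    "search-engine-optimization": "seo",
    "wordpress-howto": "wordpress",
}

def redirect_rules(merges, base="/tag"):
    """Yield one 301 redirect line per retired tag, suitable for an .htaccess file."""
    for old, new in sorted(merges.items()):
        yield f"Redirect 301 {base}/{old}/ {base}/{new}/"

for rule in redirect_rules(tag_merges):
    print(rule)
```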
Technical SEO | digisavvy
-
Used the SEOmoz top 100 directories, my site ranking dropped - what can we do to fix this?
We have made a big mistake... so what can we do to fix it? A trainee member of staff used the SEOmoz list of 100 top directories and submitted to about 25 sites, from PR10 to PR6, using keywords where possible instead of the website URL, which I now know was stupid. Our website rankings have dropped big time for all the keywords used, e.g. from 1st to 10th, and some have even disappeared from the top 100. We are contacting all the directories asking for the title link to be changed to the URL instead of a keyword. Will this help? I understand that Google gives sites a penalty for this, but what can I do to put it right, and how long would the penalty last? Any advice would be highly appreciated... Thanks, Dean
Technical SEO | deanpallatt
-
Should I have a 'more' button for links?
I have a website that has a page for each town. Rather than listing all the towns with a link to each, I want to show only the most popular towns and have a 'more' button that shows all of them when you click it. I know that the search engines can always see the full list of links, and even though the visitor can't, this doesn't go against Google's guidelines because there is no deception involved; the 'more' button is quite clear. However, my colleague is concerned that this 'makes life hard' for the search engines, and so the pages are less likely to be indexed. I disagree. Is he right to worry about this?
Technical SEO | mascotmike