WMT - Googlebot can't access your site
-
Hi
On our new website, which is just a few weeks old, I am getting the following message upon logging into Webmaster Tools:
Googlebot can't access your site - The overall error rate for DNS queries is 50%. What do I need to do to resolve this? I have never had this problem before with any of my sites. The domains are with Fasthosts (UK) and the hosting is with Dreamhost. What is the recommended course of action? Google mentions contacting your host (in my case Dreamhost), but what do you need to ask them in a support ticket? Oddly, when doing a fetch in WMT, the fetch status is a success.
-
Yes, it's most likely a configuration entry in the DNS zone file on your hosting company's side.
-
Your hosting company: Dreamhost. (If a WMT fetch is OK, that means the DNS settings are basically working, and something might be wrong at the hosting level.)
-
Which one should I contact?
The domain company - Fasthosts
or
The hosting company - Dreamhost
-
Yes, that's right. They can check what's wrong...
-
Should I raise a ticket with the host, or with the company that the domain is registered and managed through?
-
Google, or a particular server, could have the DNS response cached, and they may query multiple DNS servers, which would explain why fetches sometimes work with no issues.
If you have another domain with no similar issues, it could be a minor misconfiguration in the DNS entry for that domain only. Raise a ticket with your host with a screenshot of the issue from Google, and their techs should know where to look.
I hope this helps.
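Before (or alongside) filing that ticket, a short script can show whether the domain resolves intermittently from your own machine. This is only a rough sketch: the hostname below is a placeholder for your own domain, and because lookups go through your local resolver's cache, this check can miss errors that Googlebot's resolvers see.

```python
import socket

def dns_error_rate(hostname, attempts=20):
    """Resolve `hostname` repeatedly and return the fraction of
    failed lookups. Anything consistently above 0.0 points at
    flaky DNS rather than a problem with the web server itself."""
    failures = 0
    for _ in range(attempts):
        try:
            socket.getaddrinfo(hostname, 80)
        except socket.gaierror:
            failures += 1
    return failures / attempts

# Replace with your own domain, e.g.:
# print(dns_error_rate("www.your-domain.example"))
```

If this reports failures, the screenshot plus your own numbers make a much stronger support ticket.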
-
Hi
I have other sites hosted with Dreamhost and there is no mention of this message in WMT for them.
Could it be something within the actual DNS of the domain itself? It has domain privacy set on the domain, but surely it wouldn't be that causing it?
It's also weird that when I do a fetch in Google WMT it always brings back a success message.
-
Do you have any other sites hosted with the same DNS? It looks like a DNS issue for sure. If I were you, I would move my hosting. Maybe I am being paranoid, but I don't like DNS issues. It's not an SEO issue, it's a site uptime issue. If bots can't access my site 50% of the time, could it be that a lot of my users can't access my website either? 50% is a very big number. If possible, I would transfer my domain name to a different registrar and a different hosting company. Just trying to be safer.
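To make that "50% is a very big number" point concrete, here's a back-of-envelope model (purely illustrative, not how Webmaster Tools computes its figure): if each DNS lookup fails independently with some probability, a visit only fails outright when the first lookup and every retry all fail, so retries help but can't rescue an error rate this high.

```python
def failed_visit_probability(error_rate, retries=1):
    """Probability a visit fails completely, assuming each DNS
    lookup fails independently with `error_rate` and the client
    retries the lookup `retries` extra times before giving up."""
    return error_rate ** (1 + retries)

# At the 50% error rate from the question, even one retry still
# leaves roughly a quarter of fresh visits failing:
# failed_visit_probability(0.5, retries=1) -> 0.25
```

In other words, at 50%, a meaningful share of real users with empty DNS caches would be hitting errors too, which is why this is an uptime problem first and an SEO problem second.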