I always get this error "We have detected that the domain or subfolder does not respond to web requests." I don't know why. PLEASE help
-
Thumbs up to Kane's advice.
If you checked with your host, were told "everything is fine," and ten minutes later the site was working again, you might want to get an uptime tracker.
-
Yup, your site is back up now. All 4 pages are loading for me.
I'm willing to bet the host had some downtime and just got it fixed in the last couple of hours, then proceeded to tell you it was fine (since by then it was). Short downtimes happen to a lot of hosts once in a while.
If it happens frequently - verify that it's not your fault, and then switch hosts. Should not be happening more than once or twice a year at a good host, and they should fix it promptly. They should also honor their 99.9% uptime guarantee if they have one.
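For a sense of scale (my own back-of-the-envelope math, not from the thread), even a 99.9% guarantee leaves room for a surprising amount of downtime per year:

```python
# A 99.9% uptime guarantee still permits 0.1% downtime.
minutes_per_year = 365 * 24 * 60            # 525,600 minutes
allowed_downtime = minutes_per_year * 0.001
print(round(allowed_downtime))              # -> 526 minutes, i.e. almost 9 hours a year
```

So even a host honoring its guarantee can be down for a couple of short stretches without breaching it.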
Use an uptime tracker like http://www.monitor.us/website-monitoring or https://www.pingdom.com/ or http://www.siteuptime.com/ if it becomes a problem. (I have used the first one and it works fine, can't vouch for the others).
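If you'd rather run a quick check yourself before signing up for one of those services, a minimal sketch in Python (my own illustration, not affiliated with any of the trackers above) could look like this:

```python
import urllib.request

def is_up(url, timeout=10):
    """Return True if the URL answers with an HTTP status below 400."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        # Timeouts, DNS failures, connection refusals, and 4xx/5xx
        # responses (urlopen raises HTTPError for those) all count as down.
        return False

# Run this from cron every few minutes and alert whenever it returns False.
```

A hosted tracker is still the better option long-term, since it checks from outside your own network.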
-
I did check with our hosting company and everything was working fine on their end, so I am not exactly sure what is happening. I will try this again and see if the issue continues. Thank you for your help.
-
Perfect advice, sir. Thumbs up
-
Where is this error coming from? Your browser? Google Webmaster Tools?
I can't get any pages on your site to load, including indexed pages from Google - if you haven't spoken with your website hosting company or webmaster yet, that should be your first stop. More likely than not your host will be able to determine what is wrong.
Related Questions
-
Google Search Console "Text too small to read" Errors
What are the guidelines / best practices for clearing these errors? Google has some pretty vague documentation on how to handle this sort of error. User behavior metrics in GA are pretty much in line with desktop usage and don't show anything concerning. Any input is appreciated! Thanks!
Technical SEO | | Digital_Reach2 -
URL Structure On Site - Currently it's domain/product-name NOT domain/category/product-name. Is this bad?
I have an eCommerce site and the site structure is domain/product-name rather than domain/product-category/product-name. Do you think this will have a negative impact SEO-wise? I have seen that some of my individual product pages get better rankings than my categories.
Technical SEO | | the-gate-films0 -
"HTTP error: 404 not found" submitting YOAST SITEMAP
When I upload the Yoast sitemap to Google Webmaster Tools I get "HTTP error: 404 not found" just for the portfolio tags and categories. For other things I don't get any kind of errors. Is it because I don't have any portfolio tags and categories? I have to say, in my template I have the portfolio post option but I'm not using it. Tx
Technical SEO | | tourtravel0 -
Can't get Google to Index .pdf in wp-content folder
We created an in-depth case study/survey for a legal client and can't get Google to crawl the PDF, which is hosted on WordPress in the wp-content folder. It is linked to heavily from nearly all pages of the site by a global sidebar. Am I missing something obvious as to why Google won't crawl this PDF? We can't get much value from it unless it gets indexed. Any help is greatly appreciated. Thanks!
Here is the PDF itself: http://www.billbonebikelaw.com/wp-content/uploads/2013/11/Whitepaper-Drivers-vs-cyclists-Floridas-Struggle-to-share-the-road.pdf
Here is the page it is linked from: http://www.billbonebikelaw.com/resources/drivers-vs-cyclists-study/
Technical SEO | | inboundauthority0 -
Correct linking to the /index of a site and subfolders: what's the best practice? link to: domain.com/ or domain.com/index.html ?
Dear all, starting with my .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]

RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/ [R=301,L]

1. I redirect all URL requests with www. to the non-www version...
2. All requests for "index.html" are redirected to "domain.com/".

My questions are:
A) When linking from a page to my frontpage (home), the best practice is "http://domain.com/" and NOT "http://domain.com/index.php"?
B) When linking to the index of a subfolder "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not append the index.php, right?
C) When I define the canonical URL, should I define it just as "http://domain.com/products/", or in this case should I point to the actual file "http://domain.com/products/index.php"?
Are A) and B) the best practice? And C)? Thanks for all replies! 🙂
Holger
Technical SEO | | inlinear0 -
OMG!! 1300 404 Errors. HELP ME!!!
Day by day Google is detecting more 404-error URLs; it has now crossed 1,340 URLs. Please help me get out of this mess. You can see the screenshot here: http://img856.imageshack.us/img856/429/954b503e0781462c8a15774.png Please check the website: www.plugnbuy.com. Kindly help me. I use the nofollow tag but still don't know why Google is detecting those errors.
Technical SEO | | chandubaba1 -
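Before acting on a report like that, it helps to confirm which of the flagged URLs really return 404 rather than relying on Google's (possibly stale) crawl data. A small sketch of a status checker, with the URL list purely illustrative:

```python
import urllib.request
import urllib.error

def status_of(url, timeout=10):
    """Return the HTTP status code a URL responds with (e.g. 404 for a missing page)."""
    req = urllib.request.Request(url, method="HEAD")  # HEAD: status only, no body
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # urlopen raises for 4xx/5xx; the code is on the exception
```

Feed it the URLs from the Webmaster Tools export and fix or redirect the ones that genuinely come back 404.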
Google insists robots.txt is blocking... but it isn't.
I recently launched a new website. During development, I'd enabled the option in WordPress to prevent search engines from indexing the site. When the site went public (over 24 hours ago), I cleared that option. At that point, I added a specific robots.txt file that only disallowed a couple directories of files. You can view the robots.txt at http://photogeardeals.com/robots.txt Google (via Webmaster tools) is insisting that my robots.txt file contains a "Disallow: /" on line 2 and that it's preventing Google from indexing the site and preventing me from submitting a sitemap. These errors are showing both in the sitemap section of Webmaster tools as well as the Blocked URLs section. Bing's webmaster tools are able to read the site and sitemap just fine. Any idea why Google insists I'm disallowing everything even after telling it to re-fetch?
Technical SEO | | ahockley0 -
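One way to double-check what a robots.txt actually blocks, independent of Google's fetcher, is Python's stdlib parser. The file body below is a made-up example with the shape the poster describes (a couple of disallowed directories, no blanket "Disallow: /"); the real file lives at photogeardeals.com/robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "http://photogeardeals.com/"))           # True
print(rp.can_fetch("Googlebot", "http://photogeardeals.com/private/x"))  # False
```

If the local parse allows the site root but Google still reports a "Disallow: /", the likely culprit is caching on Google's side; re-fetching the file in Webmaster Tools after a day or two usually clears it.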
We registered with Yahoo Directory. Why won't this show up as a linking root domain in our link analysis?
We recently checked the link analysis report for 2 of our campaigns that are registered in dir.yahoo.com (the Yahoo Directory). For some reason, it doesn't show up as a domain linking to our website. Why is this?
Technical SEO | | MMP0