How is this possible? A 200 response and 'nothing' to be seen? Need help!
-
On checking this website, http://dogtraining.org.uk/, I get a 200 response. But Chrome shows "Oops! Google Chrome could not find dogtraining.org.uk", and Firefox shows "Server not found".
Obviously there is a problem; I just don't know where to start investigating to spot the error.
Can someone help me?
Thank you!
-
Fantastic, guys, thank you!
Yes, I have a slow connection (getting it sorted soon). That's why I couldn't see that there is a website responding correctly, just slowly.
OK, this gives me an excellent starting point. I'll run both Pingdom and Google's speed check and see what comes up.
Thanks again!
Happy Friday!
-
I ran a speed test on the domain, as Chris mentioned it was running slow. I did get the domain to load, but it took a lot of time to get a visual of the site's design. Try using the following tool to run some speed tests and determine where things might be slowing down (host, server, number of files loading, number of image files loading, quality of images, resolutions, remote files and CSS scripts, etc.). It could be a number of things, but this is a good place to start investigating. Just enter your domain here and run the test.
It will also help you identify areas that might need looking at to help speed things up. Hope this is a good jumping-off point.
Cheers
-
I got the page, but it was slow to load. Maybe it is timing out for you, which could be an issue with the quality of your host.
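If you want to see what the checkers are seeing, you can also time the response yourself. Below is a minimal sketch using only Python's standard library; the 30-second budget and the "slow" threshold are arbitrary example values, not anything Pingdom or Google's tool uses:

```python
import time
import urllib.request

def diagnose(status, elapsed, timeout=30.0):
    """Turn an HTTP status and response time into a rough verdict."""
    if status != 200:
        return "error"
    if elapsed > timeout / 2:
        return "slow"  # well on the way to timing out; browsers/users may give up
    return "ok"

def fetch(url, timeout=30.0):
    """Fetch a URL, returning (status code, seconds until the response arrived)."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.status, time.monotonic() - start

# Example (hits the network, so shown as a comment):
#   status, elapsed = fetch("http://dogtraining.org.uk/")
#   print(status, round(elapsed, 1), diagnose(status, elapsed))
```

A 200 paired with a long elapsed time matches what everyone above is reporting: the site is up, just slow enough that an impatient browser or a flaky connection gives up first.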
Related Questions
-
301 Re-direct help
Hello Mozzers, I have a technical question that perhaps someone has experience with and can help with. I currently have two e-commerce websites: SITE-A.COM (the original site) and SITE-B.COM (the new site). SITE-B.COM is the newer site that has a lot of new products, new features, and great content, and is very user friendly. We are thinking about funneling all of our visitors and traffic to SITE-B.COM since it is the better experience for users. The question is this: if we want to 301 redirect all traffic from Site-A.com to Site-B.com, where do we set up those redirects? Would it be on the server for Site-A.com? If so, would I have to keep that server up and running forever if I don't want to lose the redirects? Also, how do I do this properly without violating Google's guidelines? Any help is appreciated. Thanks
Technical SEO | Prime850
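On the mechanics: the 301 rules live wherever site-a.com's DNS points, so that hostname has to keep resolving to something that serves them. The old domain must stay registered and pointed at a server, but it can be a minimal hosting plan whose only job is redirecting. A sketch of what the rules could look like, assuming Site-A runs on Apache with mod_rewrite (the domain names are the placeholders from the question):

```apache
# .htaccess at the document root that site-a.com resolves to
# (assumes Apache with mod_rewrite enabled; adjust domains to your setup)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?site-a\.com$ [NC]
RewriteRule ^(.*)$ https://site-b.com/$1 [R=301,L]
```

Because the rule captures the path with `$1`, each old URL maps to the matching path on the new domain, which is generally considered friendlier to both users and Google than pointing everything at the new homepage.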
Possible scraper reusing content. Should I be concerned?
I've noticed a few overseas sites seem to be repurposing content from our blog. The process to report for DMCA seems lengthy. Should I be concerned enough to pursue this, or just write it off as something that happens? Here's an original: http://www.martinsprocket.com/sprocket-sense/sprocket-sense/2015/12/11/free-sprocket-CAD-models Here's an example: http://ptech.in/silica-crushing/free-martin-sprocket-autocad-drawing-download-martin.html Thanks!
Technical SEO | sprockets0
Is this going to be seen by Google as duplicate content?
Hi All, Thanks in advance for any help you can offer with this. I have been conducting a bit of analysis of our server access file to see what Googlebot is doing, where it is going, etc. Firstly, I am not an SEO but have an interest. What I am seeing a lot of is that we have URLs with a query-string extension that sets the currency displayed on the products, so that we can run AdWords campaigns in other countries. These show as follows: feedurl=AUD, feedurl=USD, feedurl=EUR, etc. What I can see is that Googlebot hits a URL such as /some_product, then /someproduct?feedurl=USD, then /someproduct?feedurl=EUR, and then /someproduct?feedurl=AUD, all one after another. Now this is the same product page, just with the price shown slightly differently on each. Would this count as a duplicate content issue? Should I disavow feedurl? Any assistance that you can offer would be greatly appreciated. Thanks, Tim
Technical SEO | timsilver0
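Parameter-only variants like these are the classic case for rel=canonical: each ?feedurl= page can point at the parameter-free URL. A hedged sketch of deriving that canonical value with Python's standard library (feedurl is the parameter name from the question; the function name and example URLs are illustrative):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def canonical_url(url, ignored_params=("feedurl",)):
    """Drop display-only query parameters so all variants map to one URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignored_params]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("https://example.com/someproduct?feedurl=USD"))
# -> https://example.com/someproduct
```

The resulting bare URL is what each currency variant would declare in its `<link rel="canonical" ...>` tag, which tells Google the variants are one page rather than duplicates.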
GWT returning 200 for robots.txt, but it's actually returning a 404?
Hi, Just wondering if anyone has had this problem before. I'm just checking a client's GWT and I'm looking at their robots.txt file. In GWT, it's saying that it's all fine and returns a 200 code, but when I manually visit (or click the link in GWT) the page, it gives me a 404 error. As far as I can tell, the client has made no changes to the robots.txt recently, and we definitely haven't either. Has anyone had this problem before? Thanks!
Technical SEO | White.net0
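One way to see what a crawler would get right now is a direct request, since GWT's report can reflect an older cached fetch of the file. A minimal sketch using only Python's standard library (the domain in the comment is hypothetical):

```python
import urllib.request
import urllib.error

def robots_url(base):
    """Build the robots.txt URL for a site root."""
    return base.rstrip("/") + "/robots.txt"

def live_status(base, timeout=10):
    """Return the HTTP status a crawler would see for robots.txt right now."""
    try:
        with urllib.request.urlopen(robots_url(base), timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # urlopen raises on 4xx/5xx; the code itself is the answer

# Example (network call, so shown as a comment):
#   print(live_status("https://www.example.com"))
```

If this returns 404 while GWT still shows 200, the likeliest explanation is that GWT is reporting a fetch from before the file disappeared, and the next crawl should update it.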
Canonicalization help
Hi Moz Community, If I have two different sub-category pages, http://www.example.com/rings/anniversary-rings/ and http://www.example.com/wedding/anniversary-rings/, and the first one is ranking for all KWs, should I add a rel=canonical to the second URL, or leave it since it's slightly different? Or should I try to create different unique content for the second URL? Everything in terms of content is the same on both these pages except for the URLs, which aren't that different to begin with. Thanks for your help! -Reed
Technical SEO | IceIcebaby
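If the pages really are near-identical, the canonical tag on the second URL is a one-liner. A sketch using the URLs from the question:

```html
<!-- In the <head> of /wedding/anniversary-rings/ -->
<link rel="canonical" href="http://www.example.com/rings/anniversary-rings/" />
```

Writing genuinely different content for the second page is the alternative, and only worth it if the two categories are meant to rank separately.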
Need help optimizing Windows IIS server for SEO
My web site, www.nhfinehomes.com, is running on IIS7, and I read a great post on SEOMoz.org about how to optimize IIS for SEO, in particular redirecting URLs to lowercase properly. However, I lack the technical skills to do this and am looking for someone who has done this before who can consult on it. Can anyone help, or recommend a consultant with actual IIS SEO experience?
Technical SEO | LinkMoser0
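For reference, the lowercase redirect on IIS is usually done declaratively with the URL Rewrite module rather than with code. A sketch of the commonly used rule (assumes the URL Rewrite module is installed; worth testing on staging first, since it 301s every URL containing an uppercase letter):

```xml
<!-- web.config fragment; requires the IIS URL Rewrite module -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Lowercase redirect" stopProcessing="true">
        <match url="[A-Z]" ignoreCase="false" />
        <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

The `redirectType="Permanent"` part is what makes it a 301, so the mixed-case duplicates consolidate rather than just disappearing.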
What if 404 Error not possible?
Hi Everyone, I get a 404 error on my page if the URL is simply wrong, but in some cases, like when a page has been deleted or has expired, I get an error page indicating that the ID is wrong, but no 404 status. It is very difficult for me to program a function in PHP that solves the problem and modifies the .htaccess with mod_rewrite. I asked the developer of the system to take a look, but I am not sure I will get an answer soon. I can control the content of the deleted/expired page, but the URL will be very similar to the ones that are OK (actually the URL could have been fine, but is now expired). Thinking of solutions: I can set the expired/deleted pages to noindex; would that help avoid the duplicated title/description/content problem? If a user goes to, e.g., mywebsite.com/1-article/details.html, I can set the head section to noindex if it has expired. Would that be good enough? Another question: is it possible to set the pages as 404 without having to do it directly in the .htaccess, thereby avoiding the mod_rewrite problems I am having? Some magical tag in the head section of the page? Many thanks in advance for your help, Best Regards, Daniel
Technical SEO | te_c0
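On the "without .htaccess" part: PHP itself can send the 404 status, with no mod_rewrite involved, as long as it runs before any output is sent. A sketch (the expiry flag and template name are hypothetical placeholders, not anything from the question's system):

```php
<?php
// Early in the page template, before any HTML is sent:
if ($item_has_expired) {       // hypothetical flag from your own ID lookup
    http_response_code(404);   // available since PHP 5.4; no .htaccess change needed
    include 'not-found.php';   // hypothetical friendly error template
    exit;
}
```

A real 404 is generally the cleaner signal for expired items, though adding noindex to those pages does no harm in the meantime.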
Removing a site from Google's index
We have a site we'd like to have pulled from Google's index. Back in late June, we disallowed robot access to the site through the robots.txt file and added a robots meta tag with "noindex,nofollow" directives. The expectation was that Google would eventually crawl the site and remove it from the index in response to those tags. The problem is that Google hasn't come back to crawl the site since late May. Is there a way to speed up this process and communicate to Google that we want the entire site out of the index, or do we just have to wait until it's eventually crawled again?
Technical SEO | issuebasedmedia0
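One well-documented subtlety may explain the wait: a robots.txt Disallow stops Googlebot from fetching the pages at all, so it can never see the noindex tags that were added. The usual recommendation is to leave crawling open and let the noindex do the removal; a sketch of the robots.txt for that approach:

```
# robots.txt: an empty Disallow permits crawling, so the per-page
# noindex meta tags can actually be fetched and honored
User-agent: *
Disallow:
```

GWT's "Remove URLs" tool can also speed things up for individual URLs, rather than waiting for the whole site to be recrawled.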