Server is taking too long to respond - What does this mean?
-
A client has 3 sites that he would like me to look at. Whenever I attempt to visit them on my home internet I get this message:
The connection has timed out
The server is taking too long to respond.
When I take my iPhone off Wi-Fi and use AT&T, the sites come up fine. What is going on here?
-
More than likely it was one of 3 things: a DNS issue, a peering issue, or a temporary ban.
If you were FTP'ing into the site at the time with too many threads open (usually more than 4 or 5, though it all depends on the server settings), the server can issue a temporary ban on your IP address. Depending on how the server is set up, you either get an explicit message, which is bad, or you just get a generic error like yours, which is good: it means the server is shedding load.
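As a rough illustration only (the host and credentials are hypothetical placeholders), a short Python sketch using the standard library's ftplib can probe that per-IP connection limit by opening sessions one at a time until the server refuses. Be aware that running something like this against a live server can itself trigger the temporary ban described above:

```python
# Rough sketch: probe an FTP server's per-IP concurrent-connection limit.
# HOST and the login credentials are hypothetical placeholders.
# WARNING: running this against a live server can itself trigger a temp ban.
from ftplib import FTP, all_errors

HOST = "ftp.example.com"  # placeholder

connections = []
try:
    for n in range(1, 11):  # try up to 10 simultaneous sessions
        ftp = FTP(HOST, timeout=10)
        ftp.login("user", "password")  # placeholder credentials
        connections.append(ftp)
        print(f"connection {n}: accepted")
except all_errors as exc:
    # A 421 response or a refused TCP connection marks the server's limit.
    print(f"connection {len(connections) + 1}: refused ({exc})")
finally:
    for ftp in connections:
        ftp.quit()
```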
A DNS issue could be that a name server is down somewhere or having other problems. You generally cannot do anything about this, and such issues are usually fixed quickly because the volume of sites and information that depends on those servers is vital.
A peering problem, like a DNS issue, is usually spotty, and more than likely that is what was happening here. A peering issue means you cannot reach the "chunk" of the internet that a particular peer routes traffic through, so you might still be able to access 99.9% of everything you want because that traffic does not pass through the peer with the issues.
The best tools you can use to diagnose these problems are Tor, a SOCKS proxy that routes your traffic through another network, so you effectively access the site through a different ISP, one that may not share your ISP's peering or DNS issues with the host. You can also use http://www.whatsmydns.net/, which shows what different DNS servers around the world are returning and will tell you if a major DNS server is having an issue. For a general check, http://www.downforeveryoneorjustme.com/ works as well.
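If you prefer to script this kind of check, here is a minimal Python sketch (it assumes the third-party dnspython package is installed and uses a placeholder hostname) that does roughly what whatsmydns.net does: it asks several public resolvers for the site's A record, then attempts a plain TCP connection to the resolved address. If resolution succeeds everywhere but the connection times out, the problem is more likely routing/peering or an IP-level block than DNS:

```python
# Minimal sketch: separate DNS failures from connectivity/peering failures.
# Assumes the third-party dnspython package (pip install dnspython).
# HOST is a hypothetical placeholder for the client's domain.
import socket
import dns.resolver

HOST = "example.com"  # placeholder
PUBLIC_DNS = {"Google": "8.8.8.8", "Cloudflare": "1.1.1.1", "Quad9": "9.9.9.9"}

def resolve_via(server, host):
    """Ask one specific DNS server for the A record of `host`."""
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [server]
    resolver.lifetime = 5  # give up after 5 seconds
    return resolver.resolve(host, "A")[0].to_text()

for name, server in PUBLIC_DNS.items():
    try:
        ip = resolve_via(server, HOST)
        print(f"{name} ({server}): {HOST} -> {ip}")
    except Exception as exc:
        print(f"{name} ({server}): resolution failed ({exc})")
        continue
    try:
        # DNS worked; now test whether we can actually reach the host.
        socket.create_connection((ip, 80), timeout=10).close()
        print("  TCP connect to port 80: OK")
    except OSError as exc:
        print(f"  TCP connect failed ({exc}) -- likely routing, not DNS")
```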
-
Check with the IT folks or hosting service for your client. I think this is an outside chance, but if you have been running spiders from your home computer to check the sites, you may have been hitting them too hard, slowing them down, and the server may now be blocking your IP because you look like a spammer. That would explain why, when you change ISPs, you are golden: you are seen as a different "user".
I took down one of our sites once with a spidering tool. They were pushing new code right when I hit the site, and the number of requests per second that I thought was fine turned out to coincide with peak traffic time. (DOH!)
I adjusted my crawl rate down and everything was OK. Again, this is just a guess, but it's worth checking given your symptoms.
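For what it's worth, the fix really can be that simple: space the requests out. A minimal sketch of a throttled crawl (assuming the third-party requests package; the URLs are placeholders):

```python
# Minimal sketch: a throttled crawl so the server isn't hit too hard.
# Assumes the third-party requests package; URLS are placeholders.
import time
import requests

URLS = ["https://example.com/", "https://example.com/about"]  # placeholders
CRAWL_DELAY = 2.0  # seconds between requests; raise during peak hours

for url in URLS:
    resp = requests.get(
        url,
        timeout=10,
        headers={"User-Agent": "site-audit-bot/1.0"},  # identify your crawler
    )
    print(url, resp.status_code, f"{resp.elapsed.total_seconds():.2f}s")
    time.sleep(CRAWL_DELAY)  # the throttle itself
```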
Good luck!
-
Yeah they all work for me too.
So this remains one of the weirder topics on here, but for different reasons than I first suspected... I'm really not sure what to tell you. Sorry.
-
They all work for me; the topsmagic site takes a while to load, though.
-
That's weird. What are the domains? Let's see if I can access them.
-
Wait, are you saying this is just for your client's sites? You can access other sites just fine? That's how you posted this question?
Sorry, I'm confused.
-
My internet is working fine. I'm on moz.org right now using my internet. It's only when I attempt to visit those 3 websites.
-
Your internet and/or router is down? Yeah, I'd power-cycle the router and modem and try again, or contact your cable company.
No offense but this is one of the weirdest Q&A posts I've seen here. I'm having a weird morning though so it totally fits.