Server is taking too long to respond - What does this mean?
-
A client has 3 sites that he would like for me to look at. Whenever I attempt to visit them on my home internet, I get this message:
The connection has timed out
The server is taking too long to respond.
When I take my iPhone off Wi-Fi and use AT&T, the sites come up fine. What is going on here?
-
More than likely it was one of 3 things: a DNS issue, a peering issue, or a temporary ban.
If you were FTPing into the site and had too many connections open at once (usually more than 4 or 5, though it all depends on the server settings), the server can issue a temporary ban on your IP address. Depending on how the server is set up, you either get an explicit message, which is bad, or you just get a generic error like yours, which is good: it means the server is shedding load.
A DNS issue could mean that a name server is down somewhere or having other problems. You generally cannot do anything about this, and such problems are usually fixed quickly, because the amount of sites and information hosted on those servers makes them vital.
A peering problem, like a DNS issue, is usually spotty, and more than likely that is what was happening here. A peering issue means you cannot access the "chunk" of the internet that the affected peer routes traffic through, so you can still reach 99.9% of everything you want, because that traffic does not pass through the peer with the problem.
The best tools for diagnosing these problems: Tor, a SOCKS proxy that routes your traffic so that you essentially access the site through another ISP, one that may not be having peering or DNS issues with the hosting ISP. You can also use http://www.whatsmydns.net/, which shows what different DNS servers around the world are returning and will tell you if a major DNS server is having an issue. For a general check, there is http://www.downforeveryoneorjustme.com/.
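You can also do a rough first-pass triage yourself, separating "the name won't resolve" (a DNS problem) from "it resolves but won't connect" (a possible peering issue or IP block). A minimal sketch, where the port and timeout are just illustrative defaults:

```python
import socket

def diagnose(host, port=80, timeout=5):
    """Rough triage: DNS failure vs. connection trouble vs. reachable."""
    try:
        addr = socket.gethostbyname(host)
    except socket.gaierror:
        return "dns"  # the name didn't resolve at all: a DNS problem
    try:
        with socket.create_connection((addr, port), timeout=timeout):
            return "ok"  # resolved and connected
    except OSError:
        # resolved but no connection: possible peering issue or IP block
        return "unreachable"
```

Running this from home and again from a connection on another ISP (or through Tor) tells you whether the problem is on your path or the server's end.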
-
Check with the IT folks or hosting service for your client. I think this is an outside chance, but if you have been running spiders from your home computer to check the sites, you may have been hitting them too hard, slowing them down, and the server may be blocking your IP because you are seen as a spammer. That is why, when you change ISPs, you are golden: you are seen as a different "user".
I took down one of our sites once with a spidering tool. They were pushing new code right when I hit the site, and the number of requests per second that I thought was fine turned out to be too many during peak traffic time. (D'oh!)
I adjusted my crawl rate down and everything was OK. Again, this is just a guess, but worth checking given your symptoms.
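For reference, "adjusting the crawl rate down" amounts to enforcing a minimum gap between requests. A minimal sketch of that kind of throttle; the rate value is a made-up example, not the one I actually used:

```python
import time

class CrawlThrottle:
    """Enforce a minimum delay between successive requests."""

    def __init__(self, requests_per_second=1.0):
        self.min_interval = 1.0 / requests_per_second
        self.last = 0.0

    def wait(self):
        # Sleep just long enough that requests are at least
        # min_interval seconds apart, then record this request's time.
        now = time.monotonic()
        sleep_for = self.min_interval - (now - self.last)
        if sleep_for > 0:
            time.sleep(sleep_for)
        self.last = time.monotonic()
```

Calling `throttle.wait()` before each fetch caps the crawler at the configured rate no matter how fast pages come back, which keeps you from looking like an attack during someone else's peak traffic.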
Good luck!
-
Yeah they all work for me too.
So this remains one of the weirder topics on here, but for different reasons than I first suspected. I'm really not sure what to tell you. Sorry.
-
They all work for me.
The topsmagic site takes a while to load, though.
-
-
That's weird. What are the domains? Let's see if I can access them.
-
Wait, are you saying this is just for your client's sites, and you can access other sites just fine? Is that how you posted this question?
Sorry, I'm confused.
-
My internet is working fine. I'm on moz.org right now using my internet. It's only when I attempt to visit those 3 websites.
-
Is your internet and/or router acting up? Yeah, I'd power-cycle the router and modem and try again, or contact your cable company.
No offense, but this is one of the weirdest Q&A posts I've seen here. I'm having a weird morning though, so it totally fits.