Thousands of 503 Errors
-
I was just checking Google Webmaster Tools for one of the first times (I know this should already be a regular habit).
I noticed that on Feb 8th we had almost 80K errors of type 503. This is obviously very alarming because as far as I know our site was up and available that whole day. This makes me wonder if there is a firewall issue or something else that I'm not aware of.
Any ideas for the best way to determine what's causing this?
Thanks,
Chris
-
Cyrus,
Thanks for the props, and also thanks for the crawl-delay link. I wish I could say I knew about it before this answer, but I didn't; cool stuff for bigger, frequently updated sites.
Always appreciate what you have to say as I learn a lot from you.
Best
-
Hi Chris,
This is a really hard problem to diagnose from the outside like this, so I'll just give you my thoughts.
1. Are the URLs throwing the 503 errors real pages? Can they be accessed normally by human visitors through the site? I only mention this because sometimes software generates a bunch of random links that go nowhere, and weird stuff starts to happen when Google crawls those URLs. Normally you'd see these show up as 404s, however.
2. Is the date in Google Webmaster Tools for the 503 errors recent? Sometimes they log those for a long time after the problem is actually solved, especially for URLs they don't visit much.
3. How often does your site go down?
4. Try performing a "Fetch as Googlebot" test on some of the affected URLs.
5. I doubt Googlebot is crashing your site, but you could always try a crawl delay.
6. If nothing else, you'll find the problem at the serving/hosting level. Can't be much help there, unfortunately.
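On point 5, a crawl delay is usually set in robots.txt. One caveat: Googlebot itself ignores the Crawl-delay directive (for Google you set the crawl rate inside Webmaster Tools instead), but Bing and Yandex do honor it. A minimal sketch — the 10-second value is just an illustration, not a recommendation:

```
User-agent: *
Crawl-delay: 10
```

If the 503s really are load-related, though, the Webmaster Tools crawl-rate setting is the lever that actually affects Google.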
-
It turns out that the Magento patch did NOT fix the problem. We are still receiving tens of thousands of 503 errors when Googlebot requests a page. The site is not down. I can look in the access_log and see that Googlebot's requests are being answered with a 503.
Any ideas? This has to be killing our chances for organic traffic until this gets resolved.
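To quantify what the access_log shows, you can count responses to Googlebot by status code. This is a minimal sketch assuming an Apache combined log format; the regex, field positions, and sample lines are assumptions — adjust them to whatever LogFormat your server actually uses:

```python
# Count HTTP status codes for requests whose user agent mentions Googlebot,
# assuming Apache "combined" log lines. Adjust LINE_RE to your LogFormat.
import re
from collections import Counter

LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def count_googlebot_statuses(lines):
    """Return a Counter of status codes for Googlebot requests."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m and "Googlebot" in m.group("agent"):
            counts[m.group("status")] += 1
    return counts

# Hypothetical sample lines for illustration only.
SAMPLE = [
    '66.249.66.1 - - [08/Feb/2013:10:00:01 +0000] "GET /product.html HTTP/1.1" 503 299 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [08/Feb/2013:10:00:02 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [08/Feb/2013:10:00:03 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

print(count_googlebot_statuses(SAMPLE))
```

Pointed at the real access_log (one line per list entry), this tells you at a glance whether the 503s are specific to Googlebot's user agent or hitting everyone — which is the key question for a Magento bot-handling bug.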
-
Hi Robert,
Thanks for the response. It turns out that this is due to a bug in our ecommerce platform, Magento, that results in Googlebot not being handled correctly. Apparently there's a patch that's being tested now.
Thanks,
Chris
-
Do you know where you are hosted? Have you called them to see if the server is down or intermittently down?
Here is a how-to link on resolving it.
Look at the bottom and follow the directions regarding using the Wayback Machine to see if it is temporary or the server is down for maintenance.
That said, if you give us a URL, it is easier to assist you.
Best, let us know.