5XX (Server Error) on all URLs
-
Hi
I created a couple of new campaigns a few days back and waited for the initial crawl to complete. I have just checked, and both are reporting 5XX (Server Error) on all the pages the crawler tried to look at (one site has 110 of these; on the other it only crawled the homepage). This is very odd: I have checked both sites on my local PC, on an alternative PC, and via the browser on my Windows VPS located in the US (I am in the UK), and everything works fine.
Any idea what could be causing this failure to crawl? I have pasted a few examples from the report:
500 : TimeoutError
http://everythingforthegirl.co.uk/index.php/accessories.html | 500 | 1 | 0 | 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/bags.html | 500 | 1 | 0 | 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/gloves.html | 500 | 1 | 0 | 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/purses.html | 500 | 1 | 0 | 500 : TimeoutError
http://everythingforthegirl.co.uk/index.php/accessories/sunglasses.html | 500 | 1 | 0 |
I'm extra puzzled that the messages say timeout. The dedicated server has 8 cores and 32 GB of RAM, and the pages respond for me in about 1.2 seconds. What is the Rogerbot crawler timeout?
Many thanks
Carl
-
You're absolutely correct; hopefully this answered your question!
-
Thanks, I will check out that plugin. So, in other words, the pages are loading fine for the user but returning an error status code to bots instead of a 200 OK. That doesn't sound good!
On the plus side, at least it has stopped Roger from noticing that some of the pages have up to 600 links on them because of all the retailer and manufacturer filtering options!
Many thanks, Carl
-
Hi Carl,
You're a lucky man (sarcastically speaking): your pages are loading normally but are indeed returning the wrong status code, a 500, when I check them. This is probably caused by a setting in Magento or on your server, as the normal status code for a working page should be 200 OK.
That is probably also why Rogerbot didn't time out on the pages but got a 500 while the pages were working. Good luck fixing this!
By the way, I highly recommend the Redirect Path plugin for Chrome by Ayima.
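The symptom in this thread, a page that renders normally in a browser while the HTTP status line says 500, can be simulated locally. This is a minimal sketch using only the Python standard library; the server and page content are hypothetical stand-ins, not Carl's actual Magento setup:

```python
import http.server
import threading
import urllib.error
import urllib.request

class Misconfigured(http.server.BaseHTTPRequestHandler):
    """Serves a perfectly good HTML body, but with a 500 status line."""
    def do_GET(self):
        body = b"<html><body>Accessories page</body></html>"
        self.send_response(500)  # wrong status code...
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)   # ...but a complete, renderable page
    def log_message(self, *args):
        pass  # silence request logging

server = http.server.HTTPServer(("127.0.0.1", 0), Misconfigured)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

try:
    urllib.request.urlopen(url)
except urllib.error.HTTPError as e:
    status, page = e.code, e.read()

print(status)                   # 500
print(b"Accessories" in page)   # True: the body is fine anyway
server.shutdown()
```

A crawler records exactly this status code regardless of how complete the body looks, which is why Rogerbot reports a server error on pages that appear fine in a browser.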
Related Questions
-
WWW used in research URL, or not to WWW
Long-time user, infrequent poster... thanks for taking my question. When I gather a series of data elements on a company's URL, the data changes (sometimes dramatically) depending on whether the 'www.' is added to the URL, and it seems related more to Page data than to Domain data. My question is: which data should I be using to assess the real strength of the site/page? Is there a best practice here, is it personal preference, or is there an actual difference in the performance of the www vs the non-www version?
Moz Pro | SWGroves
-
What is the best approach to handling 404 errors?
Hello all, I'm new here and working on the SEO of my site www.shoottokyo.com. When I find 4xx (Client Errors), what is the best way to deal with them? I am finding an error like this, for example: http://shoottokyo.com/2010/11/28/technology-and-karma/. This may have been caused when I updated my permalinks from shoottokyo.com/2011/09/postname to shoottokyo.com/postname. I was using the plugin 'Permalinks Moved Permanently' to fix them. Sometimes I find http://shoottokyo.com/a-very-long-week/www.newscafe.jp, and I can tell that I simply have a bad link to News Cafe and can go to the post and correct it, but in the case of the first one I can't find out where the crawler even found the problem. I'm using WordPress. Is it best to just use a plugin like 'Redirection' for the rest of the errors where I cannot find the source of the issue? Thanks, Dave
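For the dated-permalink case described above, the old-to-new mapping is mechanical, so a redirect target can be derived rather than maintained by hand. A minimal sketch (the function name is illustrative, and this merely stands in for what a redirect plugin does internally):

```python
import re

# Old WordPress permalinks looked like /YYYY/MM/DD/postname/;
# the new structure is just /postname/. Stripping the date prefix
# yields the 301 redirect target.
OLD_PERMALINK = re.compile(r"^/\d{4}/\d{2}/\d{2}/(?P<slug>[^/]+)/?$")

def redirect_target(path):
    m = OLD_PERMALINK.match(path)
    return f"/{m.group('slug')}/" if m else None

print(redirect_target("/2010/11/28/technology-and-karma/"))  # /technology-and-karma/
print(redirect_target("/a-very-long-week/www.newscafe.jp"))  # None: not a dated permalink
```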
Moz Pro | ShootTokyo
-
Why is Moz saying I have a 404 error?
I recently asked a question on here about how to fix my 404 error. I figured it out, and now Webmaster Tools says it's not there any more. This has been confirmed by keeping watch for about a week now. But Moz just crawled my site today and is for some reason still showing the error. Am I missing something here, or is Moz a little slow sometimes?
Moz Pro | NateStewart
-
Crawl Errors and Notices drop to zero
Hi all, after setting up a campaign in Moz, the crawl completed successfully and showed the Errors and Warnings in Crawl Diagnostics (each had about 40-50), but after a few days the numbers dropped to zero. Only the Notices seem to stay normal, with a slight drop since the campaign was set up, but not dropping to zero. I set this campaign up in a colleague's account and the same thing happened shortly after set-up. I didn't find any Q&A already posted on this, so any insight is appreciated!
Moz Pro | Vanessa12
-
Canonical URLs and Duplicate Page Content
My website (a doctor directory) is getting a lot of duplicate page content and duplicate page title warnings from SEOmoz. The pages getting the warnings are doctor profiles, which can be accessed at three different URLs. The problem is that this should be handled by the canonical tag on the pages. In the example below, all three open the same page:
https://www.arzttermine.de/arzt/dr-sara-danesh/
https://www.arzttermine.de/arzt/dr-sara-danesh/gkv
https://www.arzttermine.de/arzt/dr-sara-danesh/pkv
Here's our canonical tag (on line 34): <link rel="canonical" href="http://www.arzttermine.de/arzt/dr-sara-danesh" /> So why is SEOmoz crawling the pages? We are getting hundreds of errors from this, and yet Google doesn't have any of the duplicate URLs indexed...
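One way to sanity-check a setup like this is to confirm that every URL variant declares the same canonical target. A rough sketch (the regex is a simplification of real HTML parsing, and the snippets are illustrative stand-ins for the live pages):

```python
import re

# Extract the canonical URL from a page's HTML, if one is declared.
def canonical_of(html):
    m = re.search(r'<link\s+rel="canonical"\s+href="([^"]+)"', html)
    return m.group(1) if m else None

# Illustrative HTML for each URL variant of the same profile page.
variants = {
    "/arzt/dr-sara-danesh/":    '<link rel="canonical" href="http://www.arzttermine.de/arzt/dr-sara-danesh" />',
    "/arzt/dr-sara-danesh/gkv": '<link rel="canonical" href="http://www.arzttermine.de/arzt/dr-sara-danesh" />',
    "/arzt/dr-sara-danesh/pkv": '<link rel="canonical" href="http://www.arzttermine.de/arzt/dr-sara-danesh" />',
}
targets = {canonical_of(h) for h in variants.values()}
print(len(targets) == 1)  # True: all variants agree on one canonical
```

Note that a canonical tag is a hint, not a directive: crawlers may still fetch and report the duplicate URLs even when they all point at one canonical, and a protocol mismatch (https pages declaring an http canonical, as in the question) can weaken the hint further.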
Moz Pro | thomashillard
-
Dead links/URLs
What is the quickest way to get Google to clean up dead links? I have 74,000 dead links reported back. I added a robots.txt disallow and submitted them to the URL removal tool in Google Webmaster Tools 4 months ago. The same dead links also show in Open Site Explorer. Thanks
Moz Pro | 1step2heaven
-
On-Page URL
Hopefully I am missing something basic... I can't see how to specifically add and delete On-Page reports. It seems that running a report adds one, but how do I delete it? Also, how does one change the URL for a report? I have reorganized some pages and can't seem to get the On-Page report to keep my URL change. Here is what I tried: from the On-Page report card for a keyword, I changed the URL and ran the test. The test runs OK, but if I navigate back to the summary, my old, bad URL is still there.
Moz Pro | Banknotes
-
How come when I export an error list I can only export the first page?
I am working on fixing the 4xx errors. I have found the easiest way to do this is to export the list, print it out, and check off the ones I've fixed. The site only lets me export the first page. Any help is appreciated. Thanks, Ryan D. Gran (Not sure what category this question belongs in, so I selected SEOmoz Tools.)
Moz Pro | dggusmc