5XX (Server Error) on all URLs
-
Hi
I created a couple of new campaigns a few days back and waited for the initial crawl to complete. I have just checked and both are reporting 5XX (Server Error) on all the pages the crawler tried to look at (on one site I have 110 of these; on the other it only crawled the homepage). This is very odd, as I have checked both sites on my local PC, on an alternative PC, and via the browser on my Windows VPS, which is located in the US (I am in the UK), and everything works fine.
Any idea what could be the cause of this failure to crawl? I have pasted a few examples from the report:
http://everythingforthegirl.co.uk/index.php/accessories.html | 500 | 1 | 0 | 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/bags.html | 500 | 1 | 0 | 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/gloves.html | 500 | 1 | 0 | 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/purses.html | 500 | 1 | 0 | 500 : TimeoutError
http://everythingforthegirl.co.uk/index.php/accessories/sunglasses.html | 500 | 1 | 0 | 500 : TimeoutError
I am extra puzzled as to why the messages say time out. The dedicated server has 8 cores and 32 GB of RAM, and the pages respond for me in about 1.2 seconds. What is the Rogerbot crawler timeout?
Many thanks
Carl
-
You're absolutely correct. Hopefully this answered your question!
-
Thanks, I will check out that plugin. So, in other words, the pages are loading fine for the user but sending an error status code to the bots instead of the 200 OK message. That doesn't sound good!
On the plus side, at least it has stopped Roger noticing that some of the pages have up to 600 links on them because of all the retailer and manufacturer filtering options!
Many thanks, Carl
-
Hi Carl,
You're a lucky man (sarcasm intended): your pages are loading normally but are indeed returning the wrong status code; I get a 500 as well. This is probably caused by one of the settings in Magento or on your server, as the normal status code for working pages should be 200 OK.
That's probably also why Rogerbot didn't time out on the pages but got a 500 while the pages were working. Good luck fixing this!
By the way, I highly recommend the Redirect Path plugin for Chrome by Ayima.
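If you want to confirm this from a script rather than a browser plugin, here is a minimal sketch using only Python's standard library (the helper name and the User-Agent values are my own illustration, not anything from Moz). It shows how a page can return a full HTML body while still sending a 5xx status line, which is exactly what browsers gloss over and crawlers report:

```python
from urllib import request
from urllib.error import HTTPError, URLError


def check_status(url, timeout=10, user_agent="Mozilla/5.0"):
    """Return (status_code, body_bytes) for a URL.

    A page can render a complete HTML body while still sending a 5xx
    status line: browsers display the body anyway, but crawlers record
    the error. Some servers also answer differently depending on the
    User-Agent, which can explain a crawler seeing 500s while a browser
    sees 200s -- try a bot-like UA here to reproduce.
    """
    req = request.Request(url, headers={"User-Agent": user_agent}, method="GET")
    try:
        with request.urlopen(req, timeout=timeout) as resp:
            return resp.status, len(resp.read())
    except HTTPError as err:
        # urllib raises on 4xx/5xx, but the error response is still
        # file-like, so the body the browser would have shown is readable.
        return err.code, len(err.read())
    except URLError as err:
        raise RuntimeError(f"Could not reach {url}: {err.reason}")


# Example (URL taken from the report in this thread):
# code, size = check_status(
#     "http://everythingforthegirl.co.uk/index.php/accessories.html"
# )
```

Running this against the URLs in the report with both a browser-like and a bot-like User-Agent should show whether the server is sending Rogerbot a different status code than it sends you.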