5xx Server Errors
-
Hey all,
Recently my site has seen a rise in 5xx errors, all of which have popped up since the first of the year.
Our company made a big announcement on 1/4 that drove a lot of traffic to the site, which I thought might explain the sudden errors, but the errors have continued since. Also, the crawl from last week showed no errors at all (which I'm sure was a fluke).
Taking a closer look in GA, I can see that the pages receiving errors haven't been visited in months.
Is this a random occurrence or should I consult someone to investigate our server?
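One way to test the traffic-spike theory yourself is to bucket the server's raw access log by hour and compare total hits against 5xx responses; if the errors track the traffic, load is the likely culprit. This is only a sketch, and it assumes your server writes the combined log format that Apache and Nginx use by default (the actual log location and format depend on your host):

```python
import re
from collections import defaultdict

# Assumes combined log format lines such as:
# 1.2.3.4 - - [04/Jan/2015:10:15:32 +0000] "GET /page HTTP/1.1" 500 612 "-" "UA"
HOUR = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2})')   # date plus hour bucket
STATUS = re.compile(r'" (\d{3}) ')                   # status code after the request

def errors_by_hour(lines):
    """Return {hour: (total_requests, server_errors)} from raw log lines."""
    buckets = defaultdict(lambda: [0, 0])
    for line in lines:
        when = HOUR.search(line)
        status = STATUS.search(line)
        if not (when and status):
            continue  # skip malformed lines
        pair = buckets[when.group(1)]
        pair[0] += 1
        if status.group(1).startswith("5"):
            pair[1] += 1
    return {hour: tuple(pair) for hour, pair in buckets.items()}
```

If the 5xx counts rise only in the hours with heavy traffic, the announcement is a plausible cause; if they show up in quiet hours too, something on the server itself is more likely.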
Thanks,
Chris -
The crawl shows the errors happening on very specific pages, so the fix may be simpler, like what you're describing.
-
One client I recently worked with had CloudFlare installed, and it was causing tons of 500 errors. We removed it and they went away. Like I said, see if you can identify the specific pages it's happening on and isolate the code that only those pages share. Good luck!
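To see which pages the errors cluster on, the raw server access log can be grouped by request path. A minimal sketch, again assuming the combined log format Apache and Nginx write by default (adjust the regex to whatever your host actually logs):

```python
import re
from collections import Counter

# Pulls "METHOD /path HTTP/x.x" STATUS out of a combined-format log line.
REQ = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')

def top_error_pages(lines, limit=10):
    """Count 5xx responses per URL path; the worst offenders come first."""
    counts = Counter()
    for line in lines:
        m = REQ.search(line)
        if m and m.group(2).startswith("5"):
            counts[m.group(1)] += 1
    return counts.most_common(limit)
```

If the resulting list is a handful of old URLs that all share a template, script, or plugin, that shared code is the first thing to check.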
-
Thanks for the input, guys.
Looks like the move right now is to double-check the code, and if the problem persists, dig into server issues.
-
This could also be something as simple as bad code on some pages. Look at those old pages and see if there is a template or something that only they share. The fix might be as simple as isolating a script or disabling a plugin.
-
5xx errors refer to server errors and can be caused by serious issues with the way your server is configured. I would definitely investigate it further.