Crawl Diagnostics Report 500 error
-
How can I find out what is causing my website to return 500 errors, and how do I locate and fix them?
-
500 errors can be caused by a multitude of reasons, and for the non-technical they can be very hard to track down and fix.
The first thing I would check is whether it's a repeating problem in Google Webmaster Tools or a one-time issue. These errors will show up in GWT for a long time, but if it's not a recurring problem it's probably nothing you need to worry about.
Wait, I assumed you found the problems in GWT, but you may have found them using the SEOmoz crawl report instead. Either way, you should log into the Google Webmaster Tools Crawl Errors report and see whether Google is experiencing the same problems.
Sometimes 500 errors are caused by over-aggressive robots and/or improperly configured servers that can't handle the load. In this case, a simple crawl delay directive in your robots.txt file may do the trick. It would look something like this:
User-agent: *
Crawl-delay: 5
This requests that robots wait at least 5 seconds between page requests. Note, though, that this doesn't necessarily solve the underlying problem of why your server was returning 500s in the first place.
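If you want to sanity-check that the directive is written the way you intend, Python's standard library can parse it for you. A minimal sketch; the bot name is just a placeholder:

```python
from urllib.robotparser import RobotFileParser

# Parse the same two lines shown in the robots.txt snippet above
# and confirm the crawl delay comes back as expected.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-delay: 5",
])

print(rp.crawl_delay("SomeBot"))  # 5
```

Because the rule is under `User-agent: *`, the same delay is reported for any crawler name you ask about.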
You may need to consult your hosting provider for advice. For example, Bluehost has this excellent article on dealing with 500 errors from their servers: https://my.bluehost.com/cgi/help/594
Hope this helps! Best of luck with your SEO.
-
Thank you Corey for your advice. I can see which links are affected in Google Webmaster Tools, but I can't reproduce the error and don't know the best way to fix it.
-
Thomas, thank you so much for your advice, and Keri, thanks for offering help.
My problem is that I can't reproduce the 500 error, so the host can't help me figure out how to fix it.
Any help?
-
Hey Keri, how are you? Merry Christmas! I believe 500 errors are almost always server-related, and unless he tells me about the host, or some other strange, unique problem with the computer's registry, I don't have enough to go on. It would be interesting to find out what it is. All the best, Tom
-
Hi Yoseph,
Did you get this figured out, or would you still like some assistance?
-
HTTP Error 500 is an Internal Server Error. It's a server-side error, which means there's a problem either with your web server or with the code it's trying to interpret. It may not happen in every scenario, so you may not always see it yourself, but when it does occur it prevents the page from loading. Obviously, that's bad for both search engines and users.
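Since the error may only show up intermittently or only for crawlers, one way to check what a bot actually sees is to request the page yourself and look at the raw status code. A minimal Python sketch; the local test server and the `/broken-page` path are invented for the demo, so in practice you would point `check_status` at your own URLs:

```python
import threading
import urllib.request
import urllib.error
from http.server import BaseHTTPRequestHandler, HTTPServer

def check_status(url, user_agent="Mozilla/5.0 (compatible; Googlebot/2.1)"):
    """Request a URL and return the HTTP status code, even for error responses."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 4xx/5xx responses land here instead of returning

# Demo against a throwaway local server that always returns 500,
# standing in for a misbehaving page on your site.
class AlwaysFailing(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_error(500, "Internal Server Error")
    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), AlwaysFailing)
threading.Thread(target=server.serve_forever, daemon=True).start()

status = check_status(f"http://127.0.0.1:{server.server_port}/broken-page")
print(status)  # 500
server.shutdown()
```

Sending a crawler-like User-Agent matters because some misconfigured servers only fail for bot requests, which is why you may never see the error in your own browser.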
Your best bet for tracking down this error is to go through your web server's error logs. Or, if you can replicate the error in a browser, you could enable error reporting and see what errors come up. That should tell you how to fix the issue, whatever it may be.
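If you do have access to the raw access log, even a short script can surface which URLs are throwing 500s. A minimal sketch, assuming the common Apache/Nginx combined log format; the sample lines and paths below are invented for illustration:

```python
import re
from collections import Counter

# Made-up access-log lines in combined format, for demonstration only.
SAMPLE_LOG = """\
203.0.113.5 - - [20/Dec/2011:10:01:02 +0000] "GET /index.html HTTP/1.1" 200 5120
203.0.113.5 - - [20/Dec/2011:10:01:07 +0000] "GET /shop/cart.php HTTP/1.1" 500 312
66.249.66.1 - - [20/Dec/2011:10:02:11 +0000] "GET /shop/cart.php HTTP/1.1" 500 312
66.249.66.1 - - [20/Dec/2011:10:03:45 +0000] "GET /about.html HTTP/1.1" 200 2048
"""

# Pull the request path and the status code out of each log line.
LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) [^"]+" (?P<status>\d{3})')

def count_500s(log_text):
    """Return a Counter of request paths that produced a 500 response."""
    hits = Counter()
    for line in log_text.splitlines():
        m = LINE_RE.search(line)
        if m and m.group("status") == "500":
            hits[m.group("path")] += 1
    return hits

print(count_500s(SAMPLE_LOG))  # Counter({'/shop/cart.php': 2})
```

In practice you would read the text from your server's access log file rather than a sample string; the paths that show up repeatedly are the first places to look in the error log.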
-
I have Googled it for you, and I definitely think you should contact your web host. Here's what comes up: https://my.bluehost.com/cgi/help/594
-
Go into the campaign section on SEOmoz and run your site through it. You will then see where the errors are. When you see the errors lit up, click them, use the drop-down to select 500 errors, and you will see exactly which links are causing the error.
There is no way I can guess what is causing your website not to work correctly; however, a 500 error is a very serious one, most likely involving a problem with the server.
If you give me your domain I might be able to help more. However, if your site is just giving 500 errors, you might want to call your web host, as it sounds like it is not so much an SEO problem as a hosting issue.