500 errors
-
We are accumulating a significant number of 500 errors, now reaching 3,000 URLs only 2 months after the re-coding of our site in ExpressionEngine.
I haven't gotten a straight answer about the implications or solutions; the default response suggests it's of no consequence.
History: The site was initially developed in EE (prior to that, an HTML platform) with a host of site issues. We then contracted an EE specialist to properly code the site. The 'new' site was released Sept 21st.
I'd appreciate some guidance and recommendations, so I can go back to the developer well informed.
What are the considerations or consequences, if any, for ignoring the 500 errors?
What are strategies or solutions for removing them from Google Webmaster Tools and preventing future 500 errors?
Thanks.
Alan
-
Hi Alan,
500 errors are a generic "internal server error" - they mean something went wrong on the server, but the code itself doesn't tell you what went wrong on the page. It's kind of like if you went to the doctor and said "I'm sick" and didn't give any other details.
In order to help, can you either add your site details in this form or, if you can't do that, email me at erica@seomoz.org and I'll take a look and see if I can figure it out. (Or find someone else on staff with some time to take a look.)
Thanks
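As a first diagnostic step, a small script can replay the URLs from the crawl-error report and confirm which ones still answer with a 5xx status. This is a minimal sketch using only Python's standard library; the buckets mirror how crawl reports group errors, and any URL you pass in would come from your own Search Console export.

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def classify(status):
    """Map an HTTP status code to a crawl-error bucket."""
    if 500 <= status <= 599:
        return "server error"   # the server or application failed
    if status in (404, 410):
        return "not found"      # the page is gone
    if 300 <= status <= 399:
        return "redirect"
    return "ok"

def check(url):
    """Fetch a URL and report which bucket its status falls into."""
    try:
        with urlopen(url, timeout=10) as resp:
            return classify(resp.status)
    except HTTPError as err:    # urlopen raises on 4xx/5xx responses
        return classify(err.code)
    except URLError:
        return "unreachable"
```

Running `check(...)` over the exported URL list separates errors that still reproduce from ones that were transient.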
Related Questions
-
Are these Search Console crawl errors a major concern for a new client site?
We recently (4/1) went live with a new site for a client of ours. The client site was originally on Point2 before they made the switch to a template site with Real Estate Webmasters. Now when I look into the Search Console I am getting the following crawl errors:
- 111 Server Errors (photos)
- 104 Soft 404s (blogs, archives, tags)
- 6,229 Not Found (listings)
I have a few questions. I don't know a lot about the server errors, so I generally ignore them; my main concerns are the Soft 404s and the Not Founds. The Soft 404s are mostly tags and blog archives, and I wonder if I should leave them alone or do 301s for each to /blog. The Not Founds are all the previous listings from the IDX. My assumption is these will naturally fall away after some time, as the new ones have already been indexed, but I wonder what I should be doing here and which errors will be affecting me. When we launched the new site there was a large spike in clicks (a 250% increase), which has now tapered off to an average of ~85 clicks versus ~160 at time of launch. Not sure if the crawl errors have any effect; I'm guessing not so much right now. I'd appreciate your insights, Mozzers!
Reporting & Analytics | localwork
-
New website server code errors
I launched a new website at www.cheaptubes.com and have since recovered my search engine rankings after Penguin & Panda devastation. I'm continuing to improve the site, but Moz Analytics says I have 288 medium issues, and I see the warning "45% of site pages served 302 redirects during the last crawl". I'm not sure how to fix this. I'm on WP using Yoast SEO, so all the redirects I set up are 301s, not 302s. I do have SSL; could it be HTTP vs. HTTPS?
Reporting & Analytics | cheaptubes
-
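One way to chase down unexpected 302s like the ones above is to look at only the first redirect hop for a sample of flagged URLs, without following it: if the http:// version of a page answers with a 302 pointing at https://, the redirect is coming from the server or SSL layer rather than from Yoast's 301s. A minimal stdlib sketch under that assumption; `describe_hop` and its status table are illustrative helpers, not part of any Moz or Yoast tool.

```python
import http.client
from urllib.parse import urlparse

# Redirect statuses and whether each is permanent or temporary.
REDIRECTS = {301: "301 permanent", 308: "308 permanent",
             302: "302 temporary", 303: "303 temporary", 307: "307 temporary"}

def describe_hop(status, location):
    """Human-readable summary of a single response's redirect behaviour."""
    kind = REDIRECTS.get(status)
    return f"{kind} -> {location}" if kind else f"no redirect ({status})"

def first_hop(url):
    """Issue one request and return (status, Location) without following redirects."""
    parts = urlparse(url)
    Conn = (http.client.HTTPSConnection if parts.scheme == "https"
            else http.client.HTTPConnection)
    conn = Conn(parts.netloc, timeout=10)
    try:
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()
```

Checking `describe_hop(*first_hop("http://example.com/some-page"))` for a few of the flagged URLs shows immediately whether the 302s sit at the http-to-https boundary.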
Crawl errors for pages that no longer exist
Hey folks, I've been working on a site recently where I took a bunch of old, outdated pages down. In the Google Search Console "Crawl Errors" section, I've started seeing a bunch of "Not Found" errors for those pages. That makes perfect sense. The thing that I'm confused about is that the "Linked From" list only shows a sitemap that I ALSO took down. Alternatively, some of them list other old, removed pages in the "Linked From" list. Is there a reason that Google is trying to inform me that pages/sitemaps that don't exist are somehow still linking to other pages that don't exist? And is this ultimately something I should be concerned about? Thanks!
Reporting & Analytics | BrianAlpert78
-
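For pages that were taken down on purpose, like those above, one option is to answer 410 Gone instead of 404: a 410 is a firmer signal that the removal was deliberate, and crawlers tend to drop such URLs from error reports sooner. A minimal WSGI sketch under that assumption; the retired paths are placeholders for your own removed URLs.

```python
RETIRED = {"/old-page", "/old-sitemap.xml"}  # placeholder paths you removed

def gone_middleware(app):
    """Wrap a WSGI app so retired paths answer 410 Gone instead of 404."""
    def wrapper(environ, start_response):
        if environ.get("PATH_INFO") in RETIRED:
            start_response("410 Gone", [("Content-Type", "text/plain")])
            return [b"Gone"]
        return app(environ, start_response)   # everything else passes through
    return wrapper
```

The same effect is usually available without code via a redirect/headers rule in the web server or CMS; the middleware just makes the behavior explicit.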
Suspect Links from Yeusaigon.net Causing Server Errors
Good morning, Webmaster Tools is reporting an increase in server errors on our site due to some very suspect links from Yeusaigon.net. After taking a quick look, it appears they are some form of search engine attempting to link to our images using incomplete URLs. For example: http://yeusaigon.net/search/images.php?q=htc%20one%20max%20phone%20cases&page=1044 is linking to: http://www.mobilemadhouse.co.uk/caseflex-htc-one-max-real-leather-flip... As this URL is incomplete, it's throwing up a server error. There are currently 139 instances of these errors from the same domain, increasing by around 5-10 per day. The domain is, however, linking to some of our pages/images correctly, but I fear Google may look at these as spammy links - they certainly look that way! So, what can we do? I can't find any contact details on the Yeusaigon website, so I have disavowed the entire domain. Is this the right thing to do? How do I stop the ever-increasing number of server errors due to incorrect URLs? Cheers, Lewis
Reporting & Analytics | PeaSoupDigital
-
Moz is showing different "errors" than Webmaster tools
I have set up my Moz campaign and the crawl errors are showing multiple duplicate content and page titles however when I check my webmaster tools data, these errors are not showing up. Is this normal and who should I listen to?
Reporting & Analytics | LabelMedia
-
SEO Moz Errors
We have SEO Moz errors and warnings showing up, yet we have cleaned them up. The same errors were showing up in Google's Webmaster Tools, but after we corrected them they no longer show up as crawl errors there. Why is SEO Moz different, and why does it continue to show corrections already made?
Reporting & Analytics | RNK
-
Solving link and duplicate content errors created by Wordpress blog and tags?
SEOmoz tells me my site's blog (a WordPress site) has two big problems: a few pages with too many links, and duplicate content. The problem is that these pages seem legit the way they are, but obviously I need to fix the problem, sooooo...
Duplicate content error: this error is a result of being able to search the blog by tags. Each blog post has multiple tags, so the url.com/blog/tag pages occasionally show the same articles. Anyone know of a way to not get penalized for this? Should I exclude these pages from being crawled/sitemapped?
Too many links error: SEOmoz tells me my main blog page has too many links (both url.com/blog/ and url.com/blog-2/) - these pages have excerpts of the 6 most recent blog posts. I feel like this should not be an error... anyone know of a solution that will keep the site from being penalized for these pages? Thanks!
Reporting & Analytics | RUNNERagency
-
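A common fix for the tag-archive duplication described above is to serve a noindex,follow robots meta tag on tag and paginated archive pages, so they drop out of the index while their links stay crawlable (in WordPress with Yoast this is a settings toggle rather than code). This sketch only illustrates the decision rule; the /tag/ and /page/ path segments are assumptions about a typical WordPress permalink structure.

```python
from urllib.parse import urlparse

def robots_meta(url_path):
    """Pick robots meta content: noindex tag and paginated archive pages."""
    segments = [s for s in urlparse(url_path).path.split("/") if s]
    if "tag" in segments or "page" in segments:
        return "noindex,follow"  # stay crawlable, but out of the index
    return "index,follow"
```

Note this is usually preferable to blocking the pages in robots.txt, since a blocked page can still be indexed from external links, while noindex removes it cleanly.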
500 errors and impact on google rankings
Since the launch of our newly designed website about 6 months ago, we have been experiencing a high number of 500 server errors (>2,000). Attempts to resolve these errors have been unsuccessful to date. We have just started to notice a consistent and sustained drop in rankings despite our efforts to correct them. Two questions: can very high levels of 500 errors adversely affect our Google rankings? And if so, what type of specialist (and what are they called?) has the expertise to investigate and fix this issue? I should also mention that the sitemap also goes down regularly, which some have said is due to the size of the site (>500 pages). I don't know if that's part of the same problem. Thanks.
Reporting & Analytics | ahw
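When 500s pile up like this, the server's access log usually names the worst offenders, which is where a server-side specialist would start. This is a minimal sketch that tallies 5xx responses per URL from a log file; the regex assumes the common Apache/nginx "combined" log format and may need adjusting for other layouts.

```python
import re
from collections import Counter

# Matches the request line and status code in a combined-format log entry,
# e.g. ... "GET /products HTTP/1.1" 500 1234
LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def count_5xx(log_lines):
    """Tally 5xx responses per request path so the worst URLs surface first."""
    hits = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m and m.group("status").startswith("5"):
            hits[m.group("path")] += 1
    return hits
```

Feeding the file in with `count_5xx(open("access.log"))` and printing `hits.most_common(20)` turns thousands of opaque errors into a short, prioritized fix list.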