Can an incorrect 301 redirect or .htaccess code cause 500 errors?
-
Google Webmaster Tools is showing the following message:
_Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request._
Before I contact the person who manages the server and hosting (essentially asking if the error is on his end), is there a chance I could have created an issue with an incorrect 301 redirect or other code added incorrectly to .htaccess?
Here is the 301 redirect code I am using in .htaccess:
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/.]+/)*(index.html|default.asp)\ HTTP/
RewriteRule ^(([^/.]+/)*)(index|default) http://www.example.com/$1 [R=301,L]
RewriteCond %{HTTP_HOST} !^(www.example.com)?$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
Could adding the following code after that in the .htaccess potentially cause any issues?
# BEGIN EXPIRES
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 10 days"
ExpiresByType text/css "access plus 1 week"
ExpiresByType text/plain "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 week"
ExpiresByType application/x-icon "access plus 1 year"
</IfModule>
# END EXPIRES
(Edit) I'd like to add that there is a WordPress blog on the site too, at www.example.com/blog, with the following code in its .htaccess:
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
# END WordPress
Thanks
-
Just to follow up on your last question about 404s, Kim...
No, having a bunch of 404s like that will be no more work for the server than if those requests were landing on actual blog pages - in fact somewhat less work, as the 404 page generally has less content and far fewer database calls.
Also, a page timing out due to server load (the server working too hard) doesn't generally result in a 500 error; it just returns a timeout error. 500 errors are returned when something actually breaks the server's ability to deliver the correct page content.
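If it helps to see that distinction concretely, here's a rough Python sketch (standard library only; the URL below is just a placeholder) showing how a 500 response differs from a timeout when you check a page - a 500 means the server answered but broke, a timeout means it never answered at all:
import socket
import urllib.error
import urllib.request

def check(url, timeout=10):
    try:
        # A normal response comes back with a status code, e.g. 200.
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            print(url, "returned", resp.status)
    except urllib.error.HTTPError as e:
        # The server answered, but with an error status such as 500.
        print(url, "returned a server error:", e.code)
    except (urllib.error.URLError, socket.timeout) as e:
        # No usable answer at all - overload, DNS failure, or a timeout.
        print(url, "did not respond:", e)

check("http://www.example.com/")  # placeholder URL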
Paul
-
Wow, you are very quickly and easily making me much better at what I do. :) Thanks for that.
I actually just updated the code a couple of days ago by adding the Expires code and fixing the redirect. Maybe the previous double 301 redirect could be the culprit? Or - something I mentioned in another question - there were a ton of 404s because a blog wasn't redirected to the /blog subdirectory correctly, which I fixed recently. Could something like that cause the server to work too hard and return a 500 server error?
I'll definitely check out the logs and Pingdom.
Great information and advice.
-
Sorry - and to be clear about your .htaccess testing question - no, there's no "tool" I've ever heard of. You test it by doing exactly what you've done: ensuring that pages respond correctly and with correct headers. Then you implement a monitoring system so you know every time that correct behaviour fails. That way you can get the site back up quickly, and have a record of when and how often it happened so you can properly troubleshoot if there's an issue.
Three troubleshooting steps:
- become aware as soon as there is a problem
- fix the problem asap to minimize impact on users
- investigate and fix the root cause so it doesn't happen again.
All of these steps depend on a monitoring/alerting system, otherwise you'll always be behind the curve and/or working in the dark.
Hope that helps?
Paul
-
Great answer, Paul.
-
As far as I understand, Kimberly, you've only changed the .htaccess in the last day or two, in which case the server error would have been from before your updates.
As far as monitoring goes - you can check the server error logs (via FTP, or in cPanel if that's what the hosting account uses) for frequent 500-level server errors.
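If you'd rather not eyeball the raw logs, a rough Python sketch like this (assuming a standard Apache combined-format access log; the file path below is a placeholder) can count how often 5xx statuses show up and for which URLs:
import re
from collections import Counter

# A typical combined-format line contains: "GET /some/page HTTP/1.1" 500 1234
LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

counts = Counter()
with open("access.log") as log:  # placeholder path - point this at your real log file
    for line in log:
        m = LINE.search(line)
        if m and m.group("status").startswith("5"):
            counts[(m.group("status"), m.group("path"))] += 1

for (status, path), hits in counts.most_common(20):
    print(f"{hits:6d}  {status}  {path}")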
In addition, I strongly recommend that all commercial sites have uptime monitoring in place. I like to use Pingdom's paid basic plan, which allows monitoring of up to 10 pages. I select a number of relevant pages and set the tool to test each one, and to check for an actual text snippet on each page (using their custom settings). I monitor things like the home page, the blog home page, a blog post, a blog category page, and critical call-to-action pages - basically the different types of page templates that might respond differently to server issues, plus critical money-making pages.
This way, Pingdom will alert you immediately any time those pages don't respond normally (like when the server gives back a 500 error, or goes unresponsive due to overload, etc.). Monitoring these pages every minute is the ONLY way to really know whether your server and website software are performing properly and consistently. This is a critical component of any professionally run website, in my opinion.
Often Pingdom confirms that things are running fine, but I literally can't count the number of times I've instituted uptime monitoring for new clients, only to find the site has huge downtime no one was really aware of, because they just aren't on their own site often enough to know when it's down. (And you certainly shouldn't be relying on customers to tell you the site has issues - by then it's FAR too late.)
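For what it's worth, the kind of check Pingdom runs boils down to something like this Python sketch (standard library; the URL and snippet below are placeholders) - fetch the page, confirm the status code, and confirm a piece of text you expect is actually in the HTML:
import urllib.request

def page_is_healthy(url, expected_snippet, timeout=10):
    # True only if the page answers 200 AND contains the expected text, which
    # catches "blank page but 200" failures as well as 5xx errors and timeouts.
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            if resp.status != 200:
                return False
            html = resp.read().decode("utf-8", errors="replace")
            return expected_snippet in html
    except Exception:
        return False

# Placeholder page and snippet - substitute the pages and text you actually care about.
print(page_is_healthy("http://www.example.com/blog/", "Recent Posts"))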
Paul
P.S. There are certainly other uptime monitoring systems out there; some are even free. I recommend Pingdom because I've used it for years and it's been consistently excellent. It allows per-minute checks instead of every 5 minutes, and can check for actual page content, not just a server response. In addition, when it finds an outage, it runs a root-cause analysis, so it would actually tell you that a 500 error caused the check failure (as opposed to the server timing out, which is a different problem). No other affiliation.
-
Paul - Thanks for a new way to check and understand all this.
So, if I was able to visit the page just fine normally, and also after setting the user agent to Googlebot, then I should be good? I never saw a 500 server error while visiting the page, just in Webmaster Tools. It was dated 2 days ago, but there have been other server error warnings over the past month or two in GWT, so maybe it's a resolved issue.
Can you suggest a method to confirm the overall proper functioning of the .htaccess code? Is there a tool you use to validate .htaccess code? I checked response headers in Firebug and found all 200 OKs, and 304s for images (from the Expires header, I assume), so to my amateur eye it looks good. I just don't want to tank the site unwittingly, obviously.
-
To note, Kimberly - Webmaster Tools keeps a historical record of issues. It may be showing you a server error that occurred in the past but is no longer a problem. The easiest way to check is to test the URL it's reporting as having problems.
First, test by visiting the URL in a regular browser. Then revisit with the user agent set to imitate the Googlebot crawler, since it's Googlebot that's reporting the error. (You can do this using the Set User Agent tool built into the Moz Firefox toolbar, among others - it's a critical capability to have for many purposes.) It's possible for Googlebot to have issues even if a regular visitor sees none, so you want to test both.
Assuming these tests return the 500 server error, briefly rename the pertinent .htaccess file, then go back and rerun the tests. If the error goes away with the .htaccess disabled, you know the source of the problem lies in the .htaccess rules. If the problem persists, you can be pretty certain the .htaccess isn't causing it.
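If you'd rather script that comparison than switch user agents in the browser each time, a quick Python sketch along these lines (standard library; the Googlebot string is the commonly published one and the URL is a placeholder) fetches the same URL with both user agents and reports the status code each one gets:
import urllib.error
import urllib.request

AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def status_for(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status  # redirects are followed, so this is the final status
    except urllib.error.HTTPError as e:
        return e.code  # e.g. 500

url = "http://www.example.com/"  # placeholder - use the URL Webmaster Tools is flagging
for name, agent in AGENTS.items():
    print(name, "->", status_for(url, agent))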
Make sense?
Paul
-
Kimberly,
It can, but without knowing which 5xx error it is, it's harder to diagnose (is it an endless redirect loop, or something else?).
Since you appear to be trying to redirect what looks like the homepage, whether the request is for .asp or .html, I would suggest this help page from Apache. It's a bit deep, but you appear to want to do it yourself, and this is a resource I would suggest.
If you look about a third of the way down the page, there's a content box that covers tons of variables.
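And if you want to quickly rule the endless-loop case in or out, a sketch like this (Python standard library; the URL is a placeholder) follows each redirect hop by hand and flags any URL it has already visited:
from urllib.parse import urljoin
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Stop urllib from following redirects automatically so each hop can be inspected.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

def trace_redirects(url, max_hops=10):
    seen = set()
    for _ in range(max_hops):
        if url in seen:
            print("Redirect loop detected at", url)
            return
        seen.add(url)
        try:
            resp = opener.open(url, timeout=10)
            print(resp.status, url)  # a non-redirect response ends the chain
            return
        except urllib.error.HTTPError as e:
            location = e.headers.get("Location")
            if e.code in (301, 302, 303, 307, 308) and location:
                print(e.code, url, "->", location)
                url = urljoin(url, location)  # Location may be relative
            else:
                print(e.code, url)
                return
    print("Gave up after", max_hops, "hops - possibly a loop")

trace_redirects("http://example.com/index.html")  # placeholder URL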
Best,