Can an incorrect 301 redirect or .htaccess code cause 500 errors?
-
Google Webmaster Tools is showing the following message:
_Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request._
Before I contact the person who manages the server and hosting (essentially asking if the error is on his end), is there a chance I created the issue myself with an incorrect 301 redirect or other code added incorrectly to .htaccess?
Here is the 301 redirect code I am using in .htaccess:
RewriteEngine On

# Redirect direct requests for index.html or default.asp to the bare directory URL
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/.]+/)*(index\.html|default\.asp)\ HTTP/
RewriteRule ^(([^/.]+/)*)(index|default) http://www.example.com/$1 [R=301,L]

# Redirect any other hostname (e.g. the bare, non-www domain) to www.example.com
RewriteCond %{HTTP_HOST} !^(www\.example\.com)?$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
Could adding the following code after that in the .htaccess potentially cause any issues?
# BEGIN EXPIRES
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 10 days"
ExpiresByType text/css "access plus 1 week"
ExpiresByType text/plain "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 week"
ExpiresByType application/x-icon "access plus 1 year"
</IfModule>
# END EXPIRES
(Edit) I'd like to add that there is a WordPress blog on the site too, at www.example.com/blog, with the following code in its .htaccess:
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
# END WordPress
Thanks
-
Just to follow up on your last question about 404s, Kim...
No, having a bunch of 404s like that will be no more work for the server than if those requests were landing on actual blog pages - in fact somewhat less work, as the 404 page generally has less content and far fewer database calls.
Also, a page timing out due to server load (the server working too hard) doesn't generally result in a 500 error - it simply returns a timeout. 500 errors are delivered when something actually breaks the server's ability to deliver the correct page content.
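If it helps to see that distinction concretely, here's a minimal sketch (Python, standard library only; the URL is a placeholder) that separates a genuine 500 response from a server that never answers at all:

import socket
import urllib.error
import urllib.request

def check(url, timeout=10):
    """Report whether a URL responds normally, responds with an error code, or times out."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            print(f"{url}: HTTP {resp.status} (server responded normally)")
    except urllib.error.HTTPError as e:
        # The server answered, but with an error status such as 500
        print(f"{url}: HTTP {e.code} (server responded with an error)")
    except (urllib.error.URLError, socket.timeout) as e:
        # No usable HTTP response at all - e.g. an overloaded server timing out
        print(f"{url}: no response ({e})")

check("http://www.example.com/")  # placeholder URL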
Paul
-
Wow, you are very quickly and easily making me much better at what I do :) Thanks for that.
I actually just updated the code a couple of days ago by adding the Expires code and fixing the redirect. Maybe the previous double 301 redirect could be the culprit? Or - something I mentioned in another question - there were a ton of 404s because of a blog that wasn't redirected to the /blog subdirectory correctly, which I fixed recently. Could something like that cause the server to work too hard and return a 500 server error?
I'll definitely check out the logs and Pingdom.
Great information and advice.
-
Sorry - and to be clear about your .htaccess testing question - no, there's no "tool" I've ever heard of. You test it by doing exactly what you've done: ensuring that pages respond correctly and with the correct headers. Then you implement a monitoring system so that you know every time that correct behaviour fails. That way you can get the site back up quickly, and you have a record of when and how often it happened so you can properly troubleshoot if there is an issue.
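For what it's worth, that header check doesn't have to happen in a browser - a short script can do it too. Here's a minimal sketch (Python, standard library only; the URL is a placeholder) that prints the status code and the headers that matter for redirects and caching, without following redirects:

import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None means redirects are not followed, so we see the 301 itself
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def show_headers(url):
    """Print the status code and selected response headers for a URL."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        code, headers = resp.status, resp.headers
    except urllib.error.HTTPError as e:
        code, headers = e.code, e.headers
    print(url, "->", code)
    for name in ("Location", "Expires", "Cache-Control"):
        if name in headers:
            print(" ", name + ":", headers[name])

show_headers("http://example.com/index.html")  # placeholder: should answer 301 to the www homepage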
Three troubleshooting steps:
- become aware as soon as there is a problem
- fix the problem asap to minimize impact on users
- investigate and fix the root cause so it doesn't happen again.
All of these steps depend on a monitoring/alerting system, otherwise you'll always be behind the curve and/or working in the dark.
Hope that helps?
Paul
-
Great answer, Paul.
-
As far as I understand, Kimberly, you've only changed the .htaccess in the last day or two? In which case, the server error would have been from before your updates.
As far as monitoring goes - you can check the server error logs (via FTP, or in cPanel if that's what the hosting account uses) to look for frequent 500-level server errors.
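If you can pull down the raw logs, even a tiny script can summarize how often 5xx errors occur and on which URLs. A sketch, assuming a combined-format access log (the file name and location are hypothetical and vary by host):

import collections
import re

def count_5xx(log_path):
    """Tally requests that returned 5xx status codes in a combined-format access log."""
    # Matches the request/status portion of a log line: "GET /some/path HTTP/1.1" 500
    pattern = re.compile(r'"[A-Z]+ (\S+) [^"]*" (5\d\d) ')
    counts = collections.Counter()
    with open(log_path, errors="replace") as log:
        for line in log:
            m = pattern.search(line)
            if m:
                counts[(m.group(2), m.group(1))] += 1
    for (status, url), n in counts.most_common(10):
        print(f"{n:6d}  {status}  {url}")

count_5xx("access.log")  # hypothetical path - adjust to where your host keeps logs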
In addition, I strongly recommend that all commercial sites have uptime monitoring in place. I like to use Pingdom's paid basic plan, which allows monitoring of up to 10 pages. I then select a number of relevant pages and set the tool to test each one and to check for an actual text snippet on each page (using their custom settings). I monitor things like the home page, the blog home page, a blog post, a blog category page, and critical call-to-action pages - basically, the different types of page templates that might respond differently to server issues, plus the critical money-making pages.
This way, Pingdom will alert you immediately any time those pages don't respond normally (like when the server gives back a 500 error, or goes unresponsive due to overload). Monitoring these pages every minute is the ONLY way to really know whether your server and website software are performing properly and consistently. This is a critical component of any professionally run website, in my opinion.
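If you're curious what such a monitor does under the hood, here's a minimal sketch (Python, standard library only; the URLs and snippets are placeholders) of the status-plus-text-snippet check described above - a real service adds scheduling, alerting, and checks from multiple locations:

import urllib.error
import urllib.request

# Placeholder pages and a text snippet each one must contain
PAGES = {
    "http://www.example.com/": "Welcome",
    "http://www.example.com/blog/": "Recent Posts",
}

def check_page(url, snippet, timeout=15):
    """Return None if the page looks healthy, otherwise a description of the failure."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except urllib.error.HTTPError as e:
        return f"HTTP {e.code}"      # e.g. a 500 from broken .htaccess rules
    except Exception as e:
        return f"no response ({e})"  # e.g. an overloaded server timing out
    if snippet not in body:
        return f"missing expected text {snippet!r}"
    return None

for url, snippet in PAGES.items():
    problem = check_page(url, snippet)
    print(url, "->", problem or "OK")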
Often Pingdom confirms that things are running fine, but I literally can't count the number of times I've instituted uptime monitoring for new clients, only to find the site has huge downtime no one was really aware of, because they just aren't on their own site often enough to know when it's down. (And you certainly shouldn't be relying on customers to inform you the site has issues - by then it's FAR too late.)
Paul
P.S. There are certainly other uptime monitoring systems out there; some are even free. I recommend Pingdom because I've used it for years and it's been consistently excellent. Also, it allows for per-minute checks instead of every 5 minutes, and it can check for actual page content, not just server response. In addition, when it finds an outage, it runs a root cause analysis, so it would actually tell you that a 500 error caused the check failure (as opposed to the server timing out, which is a different problem). No other affiliation.
-
Paul - Thanks for a new way to check and understand all this.
So if I was able to visit the page just fine both normally and after setting the user agent to Googlebot, then I should be good? I never saw a 500 server error while visiting the page, just in Webmaster Tools. It was dated two days ago, but there have been other server error warnings in GWT over the past month or two, so maybe it is a resolved issue.
Can you suggest a method to confirm that the .htaccess code is functioning properly overall? Is there a tool you use to validate your .htaccess code? I checked response headers in Firebug and found all 200 OKs, plus 304s for images (from the Expires headers, I assume), so to my amateur eye it looks good. I just don't want to tank the site unwittingly, obviously.
-
To note, Kimberly - Webmaster Tools keeps a historical record of issues, so it may be showing you a server error that occurred in the past but is no longer a problem. The easiest way to tell is to test the URL it is reporting as having problems.
First, test by visiting the URL in a regular browser. Then revisit in a regular browser, but with the user agent set to imitate the Googlebot crawler, since it's Googlebot that's reporting the error. (You can do this using the Set User Agent tool built into the Moz Firefox toolbar, among others - it's a critical capability to have for many purposes.) It's possible for Googlebot to have issues even when a regular visitor sees none, so you want to test both.
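If you'd rather script that comparison than do it in a browser, here's a minimal sketch (Python, standard library only; the URL is a placeholder, and the browser UA string is just illustrative):

import urllib.error
import urllib.request

UAS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; rv:109.0) Gecko/20100101 Firefox/115.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def status_for(url, user_agent):
    """Fetch a URL with a given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

url = "http://www.example.com/"  # placeholder: use the URL GWT is flagging
for name, ua in UAS.items():
    print(f"as {name}: HTTP {status_for(url, ua)}")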
Assuming these tests return the 500 server error, just briefly rename the pertinent .htaccess file, rerun the tests, then rename it back. If the error goes away with the .htaccess disabled, you know the source of the problem lies in the .htaccess rules. If the problem persists, you can be pretty certain the .htaccess isn't causing it.
Make sense?
Paul
-
Kimberly,
It can, but without knowing which 5xx it is, it's harder to diagnose. (Is it an endless redirect loop, or something else?)
I would suggest (given that you're trying to redirect what appears to be the homepage, whether the request is for .asp or .html) this help from Apache. It's a bit deep, but you appear to want to do it yourself, and it's the resource I would suggest.
If you look about a third of the way down the page, there is a content box that covers tons of variables.
Best,