Can an incorrect 301 redirect or .htaccess code cause 500 errors?
-
Google Webmaster Tools is showing the following message:
_Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request._
Before I contact the person who manages the server and hosting (essentially asking if the error is on his end), is there a chance I could have created the issue myself with an incorrect 301 redirect or other code added incorrectly to .htaccess?
Here is the 301 redirect code I am using in .htaccess:
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/.]+/)*(index.html|default.asp)\ HTTP/
RewriteRule ^(([^/.]+/)*)(index|default) http://www.example.com/$1 [R=301,L]
RewriteCond %{HTTP_HOST} !^(www.example.com)?$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
Could adding the following code after that in the .htaccess potentially cause any issues?
# BEGIN EXPIRES
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 10 days"
ExpiresByType text/css "access plus 1 week"
ExpiresByType text/plain "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 week"
ExpiresByType application/x-icon "access plus 1 year"
</IfModule>
# END EXPIRES
(Edit) I'd like to add that there is a WordPress blog on the site too at www.example.com/blog with the following code in its .htaccess:
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
# END WordPress
Thanks
-
Just to follow up on your last question about 404s, Kim...
No, having a bunch of 404s like that will be no more work for the server than if those requests were landing on actual blog pages - in fact it's somewhat less work, as the 404 page generally has less content and far fewer database calls.
Also, a page timing out due to server load (the server working too hard) doesn't generally result in a 500 error; it just returns a timed-out error. 500 errors are delivered when something actually breaks the server's ability to deliver the correct page content.
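For what it's worth, and purely as an illustrative sketch (the /404.html path is hypothetical and would need to exist as a real static file on your server), pointing Apache at a small static error page is one way to keep those 404s cheap rather than letting them fall through to a heavier dynamically generated page:
# Serve a lightweight static page for 404s instead of a heavier dynamic one
ErrorDocument 404 /404.html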
Paul
-
Wow, you are very quickly and easily making me much better at what I do. :) Thanks for that.
I actually just updated the code a couple of days ago by adding the Expires code and fixing the redirect. Maybe the previous double 301 redirect could be the culprit? Or - something I mentioned in another question - there were a ton of 404s because of a blog that wasn't redirected to the /blog subdirectory correctly, which I fixed recently. Could something like that cause the server to work too hard and return a 500 server error?
I'll definitely check out the logs and Pingdom.
Great information and advice.
-
Sorry - and to be clear about your htaccess testing question - no, there's no "tool" I've ever heard of. You test it by doing exactly what you've done: ensuring that pages respond correctly and with correct headers. Then you implement a monitoring system to ensure that you know every time that correct behaviour fails. That way you can get the site back up quickly, and have a record of when and how often it happened so you can properly troubleshoot if you have an issue.
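One simple trick, assuming your host has mod_headers enabled (that's an assumption worth checking first, and the header name below is just an example), is to have the .htaccess file stamp a marker header on its responses. If you see that header in Firebug's response headers, you know the file is actually being read and your latest rules are live:
<IfModule mod_headers.c>
# Emits a marker header so you can confirm this .htaccess is being parsed
Header set X-Htaccess-Check "redirects-v2"
</IfModule>
Remove the line again once you're satisfied the rules are behaving.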
Three troubleshooting steps:
- become aware as soon as there is a problem
- fix the problem asap to minimize impact on users
- investigate and fix the root cause so it doesn't happen again.
All of these steps depend on a monitoring/alerting system, otherwise you'll always be behind the curve and/or working in the dark.
Hope that helps?
Paul
-
Great answer, Paul.
-
As far as I understand, Kimberly, you've only changed the htaccess in the last day or two, in which case the server error would have been from before your updates.
As far as monitoring - you can check the server error logs (via FTP, or in cPanel if that's what the hosting account uses) to look for frequent 500-level server errors.
In addition, I strongly recommend that all commercial sites have uptime monitoring in place. I like to use Pingdom's paid basic plan, which allows monitoring of up to 10 pages. I then select a number of relevant pages and set the tool to test each page, and to check for an actual text snippet on each page (using their custom settings). I monitor things like the home page, the blog home page, a blog post, a blog category page, and critical call-to-action pages - basically different types of page templates that might respond differently to server issues, plus critical money-making pages.
This way, Pingdom will alert you immediately any time those pages don't respond normally (like when the server gives back a 500 error, or goes unresponsive due to overload, etc.). Monitoring these pages every minute is the ONLY way to really know whether your server and website software are performing properly and consistently. This is a critical component of any professionally run website, in my opinion.
Often Pingdom confirms that things are running fine, but I literally can't count the number of times I've instituted uptime monitoring for new clients, only to find the site has huge downtime no one was really aware of, because they just aren't on their own site often enough to know when it's down. (And you certainly shouldn't be relying on customers to inform you the site has issues. By then it's FAR too late.)
Paul
P.S. There are certainly other uptime monitoring systems out there; some are even free. I recommend Pingdom because I've used it for years and it's been consistently excellent. Also, it allows for per-minute checks instead of every 5 minutes, and can check for actual page content, not just server response. In addition, when it finds an outage, it runs a root cause analysis, so it would actually tell you that a 500 error caused the check failure (as opposed to the server timing out, which is a different problem). No other affiliation.
-
Paul - Thanks for a new way to check and understand all this.
So, if I was able to visit the page just fine both normally and with the user agent set to Googlebot, then I should be good? I never saw a 500 server error while visiting the page, just in Webmaster Tools. It was dated 2 days ago, but there have been other server error warnings over the past month or two in GWT, so maybe it is a resolved issue.
Can you suggest a method to confirm the overall proper functioning of the .htaccess code? Is there a tool you use to validate your .htaccess code? I checked response headers in Firebug and found all 200 OKs and 304s for images (from the Expires header, I assume), so to my amateur viewpoint it looks good. I just don't want to tank the site unwittingly, obviously.
-
To note, Kimberly - Webmaster Tools keeps a historical record of issues. It may be showing you a server error that occurred in the past but is no longer a problem. The easiest way to check is to test the URL it is reporting as having problems.
First test by visiting the URL using a regular browser. Then revisit using a regular browser, but with the user-agent set to imitate the Googlebot crawler since it's Googlebot that's reporting the error. (You can do this using the Set User Agent tool built into the Moz Firefox toolbar, or others. It's a critical capability to have for many purposes.) It's possible for the Googlebot to have issues even if a regular visitor sees none, so you want to test for both.
Assuming these tests return the 500 server error, just briefly rename the pertinent htaccess file for a minute, then go back and rerun the tests. If the error goes away with the htaccess disabled, you know the source of the problem lies in the htaccess rules. If the problem persists, you can be pretty certain it's not the htaccess causing it.
Make sense?
Paul
-
Kimberly,
It can, but without knowing which 5xx error it is, it's harder to diagnose (is it an endless loop, or something else?).
Since you appear to be trying to redirect what looks like the homepage whether the request is for .asp or .html, I would suggest this help from Apache. It is a bit deep, but you appear to want to do it yourself, and it's a resource I would recommend.
If you look about a third of the way down the page, there is a content box that covers tons of variables.
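As a rough sketch only (keeping example.com as the stand-in domain, and assuming the goal is simply to send index.html / default.asp requests back to the folder root), the homepage rule could be written with the dots escaped so the patterns only match those literal filenames:
RewriteEngine On
# Only act when the original client request named index.html or default.asp
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/.]+/)*(index\.html|default\.asp)\ HTTP/
# Send /folder/index.html (or default.asp) back to /folder/
RewriteRule ^(([^/.]+/)*)(index\.html|default\.asp)$ http://www.example.com/$1 [R=301,L]
That said, a 500 from .htaccess most often means Apache couldn't parse a directive at all - a typo in a directive name, or a directive belonging to a module that isn't enabled - which is also why wrapping optional modules in <IfModule> blocks, as your expires code does, is a sensible precaution.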
Best,