Thousands of 301 redirections - .htaccess alternatives?
-
Hi guys,
I just want to ask whether there are other possible issues or problems (besides server load) once we implement 301 redirects for 10,000+ URLs using .htaccess. Are there any alternatives?
-
Thank you for your answer! I will share it with our IT team.
-
Why don't you just set up a VPS running NGINX as a stream handler / reverse proxy in front of your IIS web server?
- https://www.digitalocean.com/community/tutorials/how-to-set-up-nginx-load-balancing
- http://www.iborgelt.com/windows-home-server-behind-nginx-reverse-proxy/
You're just using the VPS as a front end to handle your redirects, and at around $5 a month you can't beat it. I'm sure if your IT department googles "nginx reverse proxy iis" they'll get the idea; a rough sketch is below.
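For illustration, here's roughly what that setup could look like (a minimal sketch; the hostnames, file path, and backend address are all hypothetical):

```nginx
# /etc/nginx/conf.d/redirects.conf

# redirects.map holds one "old-path  new-url;" entry per line, e.g.:
#   /old-page/  https://www.example.com/new-page/;
map $uri $redirect_target {
    default "";
    include /etc/nginx/conf.d/redirects.map;
}

server {
    listen 80;
    server_name www.example.com;

    # Issue the 301 when the requested path has a map entry...
    if ($redirect_target) {
        return 301 $redirect_target;
    }

    # ...otherwise proxy the request through to the IIS box.
    location / {
        proxy_pass http://10.0.0.5;  # internal address of the IIS server (hypothetical)
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

The nice part is that nginx's map is a hash lookup, so 10,000+ entries cost almost nothing per request.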
-
Hi guys, I have a similar problem, but on IIS7. Our IT department says our 301 redirections file is at its maximum size in the web.config. They could increase the limit, but they say it will impact page load speed negatively. What's the impact on page speed of having 5,000 to 10,000 URLs in the rewrite map?
Also, they're looking at a solution that consults the redirections only when the site returns a 404, so a request would hit a 404, then a 301, then a 200. I am a little scared of this SEO-wise. Would it be a problem?
Thanks!
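For reference, a rewrite map in IIS7's URL Rewrite module looks roughly like this in web.config (a minimal sketch; the map entries are hypothetical):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rewriteMaps>
        <!-- One entry per old URL; lookups are dictionary-based. -->
        <rewriteMap name="Redirects">
          <add key="/old-page" value="/new-page" />
          <add key="/old-category/old-page-2" value="/new-category/new-page-2" />
        </rewriteMap>
      </rewriteMaps>
      <rules>
        <rule name="Redirect rule for Redirects" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <!-- Only fires when the requested URI has a map entry. -->
            <add input="{Redirects:{REQUEST_URI}}" pattern="(.+)" />
          </conditions>
          <action type="Redirect" url="{C:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Because the map lookup is effectively a dictionary lookup, per-request cost should grow very slowly with map size; the bigger hit is the one-time parse of a large web.config when the app pool recycles.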
-
Putting aside server load / config issues, and speaking from a pure SEO point of view:
No, you shouldn't have any major issues with that many 301s. However, depending on the size of your site and the frequency of Googlebot's visits, you might find that some of these pages take a long time (months) to drop out of the index and be replaced by their newer alternatives. This normally isn't cause for alarm.
In some instances you might end up with pages that now have no links pointing to them (as their parent categories were all redirected too) and so seem to get stuck and never get recrawled by Google. In a couple of instances I have had success using XML sitemap files that include just these 'stuck' pages (the old URLs still in the index) to prompt Google to recrawl them.
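For example, a minimal sitemap file containing only the old, already-redirected URLs (hypothetical URLs shown) could look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Old URLs that Google still has indexed but hasn't recrawled -->
  <url><loc>http://www.example.com/old-page-1</loc></url>
  <url><loc>http://www.example.com/old-category/old-page-2</loc></url>
</urlset>
```

Submit it in Webmaster Tools alongside your normal sitemap; once Google has recrawled the URLs and seen the 301s, you can remove it again.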
There is also the Google Webmaster Tools 'Fetch as Googlebot' feature, which then prompts you to 'Submit to index'; you can use it to prompt recrawls on a per-page basis (but you have a limited number of credits here, so save it for the more important pages).
Best of luck!
-
The main benefit of this would be in reducing server load / response time, and potentially in making the server config easier to maintain.
The most important factor on this side of things is how many separate rules you have in your .htaccess file for those 10,000 redirects: Apache re-reads .htaccess and evaluates its rules top to bottom on every request, so 10,000 individual rules can mean up to 10,000 pattern checks per hit, whereas a single map lookup in the main server config avoids that.
-
Hi Kevin,
What's the difference between this method and standard 301 redirection using .htaccess?
-
Do you guys have a step-by-step guide for implementing 301 redirects using this httpd main server config file?
-
Well, if you're on a VPS / dedicated machine, I would take a look at RewriteMap: http://httpd.apache.org/docs/current/rewrite/rewritemap.html
A RewriteMap has practically zero effect on load time, whereas putting the same redirects in .htaccess means Apache has to chew through those rules on every request. Remember that 301s are cached by the browser, so while you're testing keep them all as 302s until you're happy, then watch your rewrite log when you launch. If you need help, let us know.
This does take some know-how and learning, but you should be able to get it done in a few days (testing, reading documentation). A minimal sketch follows.
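Something along these lines, assuming an Apache vhost (the file paths and URLs are hypothetical):

```apache
# Main server config or vhost only -- RewriteMap cannot be declared in .htaccess.
RewriteEngine On

# redirects.txt holds one "old-path new-path" pair per line, e.g.:
#   /old-page-1          /new-page-1
#   /old-dir/old-page-2  /new-page-2
RewriteMap redirects "txt:/etc/apache2/redirects.txt"

# Redirect only when the requested path has an entry in the map.
RewriteCond ${redirects:$1|NOT_FOUND} !=NOT_FOUND
RewriteRule ^(.*)$ ${redirects:$1} [R=301,L]
```

For 10,000+ entries, convert the text map to a hashed dbm map with `httxt2dbm -i redirects.txt -o redirects.map` and reference it as `dbm:/etc/apache2/redirects.map`; lookup time then stays flat regardless of map size. And per the advice above, use `R=302` instead of `R=301` while testing.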
-
Do you have access to the httpd main server config file? If so, please read Apache HTTP Server Tutorial: .htaccess files.
".htaccess files should be used in a case where the content providers need to make configuration changes to the server on a per-directory basis, but do not have root access on the server system. In the event that the server administrator is not willing to make frequent configuration changes, it might be desirable to permit individual users to make these changes in .htaccess files for themselves. This is particularly true, for example, in cases where ISPs are hosting multiple user sites on a single machine, and want their users to be able to alter their configuration.
However, in general, use of .htaccess files should be avoided when possible. Any configuration that you would consider putting in a .htaccess file, can just as effectively be made in a <Directory> section in your main server configuration file."
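In practice, that means redirects that would otherwise live in .htaccess can be declared once in the virtual host, for example (hypothetical paths):

```apache
<VirtualHost *:80>
    ServerName www.example.com

    # mod_alias redirects, parsed once at server start
    # instead of being re-read from .htaccess on every request.
    Redirect permanent /old-page-1 /new-page-1
    Redirect permanent /old-dir/old-page-2 /new-page-2
</VirtualHost>
```

After editing the main config you need to reload Apache (e.g. `apachectl graceful`) for changes to take effect, which is the trade-off versus .htaccess.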
Related Questions
-
Spam signals from old company site are hurting new company site, but we can't undo the redirect.
My client was forced to change its domain name last year (long story). We were largely able to regain our organic rankings via 301-redirects. Recently, the rankings for the new domain have begun to plummet. Nothing specific took place that could have caused any ranking declines on the new site. However, when we analyze links to the OLD site, we are seeing a lot of link spam being built to that old domain over recent weeks and months. We have no idea where these are coming from but they appear to be negatively impacting our new site. We cannot dismantle the redirects as the old site has hundreds, if not thousands, of quality links pointing to it, and many customers are accustomed to going to that home page. So those redirects need to stay in place. We have already disavowed all the spam we have found on the old Search Console. We are continuing to do so as we find new spam links. But what are we supposed to do about this spam negatively impacting our new site? FYI we have not received any messages in the search console.
White Hat / Black Hat SEO | FPD_NYC
-
Question about "sneaky" vs. non-sneaky redirects?
One of my client's biggest keyword competitors is using what I believe to be sneaky redirects. The company is a large, international corporation that has a local office. They use a totally unrelated domain name for local press and advertising, but there is no website there; the URLs behind the anchor text in the backlinks automatically redirect to the corporate website. Is this sneaky or not?
White Hat / Black Hat SEO | JCon711
-
Magento Temporary Redirects?
Just checked my crawl insights: I have 1,981 302 (temporary) redirects flagged in Moz. I'm not too familiar with Magento; however, the site domain has moved from .com to .co, and although I have set a 301 redirect on the base domain through .htaccess, I am assuming it is also temporarily redirecting things in the CMS itself? The temporary redirects that the site is creating are still on the new domain, but are really odd! E.g. wishlist and product-compare links such as .co/wishlist/index/add/product/498/form_key/5e7CQkZ54tMSsJtw. Anyone have ideas in regards to this?
White Hat / Black Hat SEO | Kelly3330
-
Acquire domains to boost yours, how to redirect an acquired domain
What is the best way to redirect for the best SEO benefit? Examples: glaspunt.nl -> glas.nl, fietstassen.eu -> loodgieter.nl. Any technical information on how to (root) redirect for best SEO practice?
White Hat / Black Hat SEO | remkoallertz
-
Can I 301 redirect old URLs to staging URLs (ex. staging.newdomain.com) for testing?
I will temporarily remove a few pages from my old website and redirect them to a new domain, but on a staging subdomain. Once the redirection is successful, I will remove the redirection rules in my .htaccess and bring the removed pages back live. Thanks in advance!
White Hat / Black Hat SEO | esiow2013
-
What is the difference between the two rewrite rules in htaccess?
Force www. prefix in URLs and redirect non-www to www:

RewriteCond %{HTTP_HOST} !^www.domain.com.ph
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]

Force www. prefix in URLs and redirect non-www to www - 2nd option:

RewriteCond %{HTTP_HOST} ^domain.com.ph [NC]
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]

White Hat / Black Hat SEO | esiow2013
-
All pages going through 302 redirect - bad?
So, our web development company did something I don't agree with and I need a second opinion. Most of our pages are statically cached (the CMS creates .html files), which is required because of our traffic volume. To get geotargeting to work, they've set up every page to 302 redirect to a geodetection script, and back to the geotargeted version of the page. E.g.: www.example.com/category 302 redirects to www.example.com/geodetect.php?ip=ip_address. Then that page 302 redirects back to either www.example.com/category, or www.example.com/geo/category for the geo-targeted version. **So all of our pages - thousands - go through a double 302 redirect. It's fairly invisible to the user, and 302 is more appropriate than 301 in this case, but it really worries me. I've done lots of research and can't find anything specifically saying this is bad, but I can't imagine Google being happy with this.** Thoughts? Is this bad for SEO? Is there a better way (keeping in mind all of our files are statically generated)? Is this perfectly fine?
White Hat / Black Hat SEO | dholowiski
-
301 Redirect ASP code
Hi, I have a script, detailed below, that 301 redirects based upon different queries:

<% if (Request("offset") = "") Then %>
<% if Request("keywords") = "" AND Request("s") <> "" AND Request("j") <> "" then 'Sector and Location NOT NULL %>
<% if (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
  Response.Status = "301 Moved Permanently"
  Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
  Response.End
End If %>
<% End if %>
<% if Request("keywords") = "" AND Request("s") <> "" AND Request("j") = "" then 'Sector NOT NULL and Location NULL %>
<% if (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion"))) Then
  Response.Status = "301 Moved Permanently"
  Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion")))
  Response.End
End If %>
<% End if %>
<% if Request("keywords") = "" AND Request("s") = "" AND Request("j") <> "" then 'Sector NULL and Location NOT NULL %>
<% if (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
  Response.Status = "301 Moved Permanently"
  Response.AddHeader "Location", "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
  Response.End
End If %>
<% End if %>
<% End if %>

But this still allows both the www and non-www versions of these pages to render in the browser, which is resulting in duplicate content. On my home page I use:

<% If InStr(Request.ServerVariables("SERVER_NAME"), "www") = 0 Then
  Response.Status = "301 Moved Permanently"
  Response.AddHeader "Location", "http://www." & Request.ServerVariables("HTTP_HOST") & "/"
  Response.End
End if %>

Is there a good way to combine these, so that I still get all of the rules of the first script whilst also redirecting any non-www versions to the www version? In other words, domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) would redirect to www.domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))). Thanks in advance!

White Hat / Black Hat SEO | TwoPints