Thousands of 301 redirections - .htaccess alternatives?
-
Hi guys,
I just want to ask if there are other possible issues/problems (other than server load) once we implement 301 redirections for 10,000+ URLs using .htaccess. Are there other alternatives?
-
Thank you for your answer ! I will share it with our IT team.
-
Why don't you just put a cheap VPS running NGINX in front of your IIS web server as a reverse proxy to handle the redirects?
- https://www.digitalocean.com/community/tutorials/how-to-set-up-nginx-load-balancing
- http://www.iborgelt.com/windows-home-server-behind-nginx-reverse-proxy/
You're just using the VPS as a front end to handle your redirects, and for $5 a month you can't beat it. I'm sure if your IT department googles "nginx reverse proxy iis" they will get the idea.
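For illustration, a `map` block is a compact way to hold thousands of redirects in nginx; a minimal sketch (hostnames, paths, and the backend address are placeholders you'd swap for your own):

```nginx
# /etc/nginx/conf.d/redirects.conf -- illustrative only
map $request_uri $redirect_target {
    default              "";
    /old-page-1          /new-page-1;
    /old-category/item   /new-category/item;
    # ...thousands more pairs, or pull them in with an include file
}

server {
    listen 80;
    server_name example.com;

    # Issue the 301 before anything is proxied to the IIS backend
    if ($redirect_target != "") {
        return 301 $redirect_target;
    }

    location / {
        proxy_pass http://your-iis-backend;
        proxy_set_header Host $host;
    }
}
```

Because `map` lookups are hash-based, the size of the list has very little effect on per-request speed, which is the whole point versus a long chain of rewrite rules.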
-
Hi guys, I have a similar problem, but on IIS7. Our IT department says our 301 redirect list has hit the maximum size allowed in the web.config. They could increase the limit, but say it will hurt page load speed. What's the actual impact on page speed of having 5,000 to 10,000 URLs in the rewrite map?
They're also looking at a solution that consults the redirect list only when the site returns a 404, so a request would hit a 404, then a 301, then a 200. I am a little scared of this SEO-wise. Would it be a problem?
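For context, the setup is an IIS URL Rewrite map in web.config, roughly along these lines (names and paths are illustrative, not our real config):

```xml
<!-- web.config sketch of a rewrite-map-based 301 setup (illustrative) -->
<system.webServer>
  <rewrite>
    <rewriteMaps>
      <rewriteMap name="Redirects">
        <add key="/old-page" value="/new-page" />
        <!-- ...thousands more entries -->
      </rewriteMap>
    </rewriteMaps>
    <rules>
      <rule name="Redirect from map" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <!-- Look up the requested URI in the map; match only if found -->
          <add input="{Redirects:{REQUEST_URI}}" pattern="(.+)" />
        </conditions>
        <action type="Redirect" url="{C:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

As I understand it, rewrite map lookups are keyed rather than scanned, so the per-request cost of a large map should be small; the limit our IT team hit is the maximum web.config file size, which is a separate (and configurable) setting.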
Thanks !
-
Putting aside server load / config issues, and from the pure SEO point of view.
No, you shouldn't have any major issues with that many 301s. However, depending on the size of your site and the frequency of Googlebot's visits, you might find that some of these pages take a long time (months) to drop out of the index and be replaced by their newer alternatives. This normally isn't cause for alarm.
In some instances you might end up with pages that now have no links pointing to them (because their parent categories were all redirected too), and so they seem to get stuck and never get recrawled by Google. In a couple of cases I have had success using XML sitemap files that include just these 'stranded' pages (the old URLs still in the index) to prompt Google to recrawl them.
Also, Google Webmaster Tools has a 'Fetch as Googlebot' feature, which then offers a 'Submit to index' option; you can use this to prompt recrawls on a per-page basis (but you have limited credits here, so save it for the more important pages).
Best of luck!
-
The main benefit of this would be reduced server load / response time, and potentially a more maintainable server config.
The most important factor on this side of things is how many separate rules your .htaccess file needs to cover those 10,000 redirects.
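To illustrate the difference rule count makes: if many of the old URLs share a pattern, hundreds of one-off rules can often collapse into a single regex rule (paths here are made up for the example):

```apache
# .htaccess -- one pattern-based rule instead of thousands of lines
RedirectMatch 301 ^/products/old/(.*)$ /products/new/$1

# ...versus thousands of individual rules that Apache must scan
# top-to-bottom on every single request:
# Redirect 301 /products/old/widget-1 /products/new/widget-1
# Redirect 301 /products/old/widget-2 /products/new/widget-2
```

If the old and new URLs share no pattern at all, that's when a keyed lookup (RewriteMap in the main config) beats a long list of per-URL rules.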
-
Hi Kevin,
What's the difference of this method to the standard 301 redirection using .htaccess?
-
Do you guys have a step-by-step guide in implementing 301 redirection using this httpd main server config file?
-
Well, if you're on a VPS/dedicated machine, I would take a look at http://httpd.apache.org/docs/current/rewrite/rewritemap.html
RewriteMap has essentially zero effect on load time, unlike putting the same rules in .htaccess, where Apache has to re-read and evaluate them on every request. Remember that 301s are cached by the browser, so while you're testing, make them all 302s until you're happy, then watch your rewrite log when you launch. If you need help, let us know.
This takes some know-how and learning, but you should be able to get it done in a few days (testing, reading documentation).
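A rough sketch of what that looks like (file paths are placeholders) — note that RewriteMap must live in the main server config or a vhost, not in .htaccess:

```apache
# httpd.conf / vhost config -- RewriteMap is not allowed in .htaccess
RewriteEngine On

# Plain-text map: one "old-path new-path" pair per line
RewriteMap redirects txt:/etc/apache2/redirect-map.txt

# For very large maps, a hashed dbm file gives constant-time lookups:
#   httxt2dbm -i redirect-map.txt -o redirect-map.map
# RewriteMap redirects dbm:/etc/apache2/redirect-map.map

# Look up the requested URI; redirect only when the map has an entry
RewriteCond ${redirects:%{REQUEST_URI}|NOT_FOUND} !NOT_FOUND
RewriteRule ^ ${redirects:%{REQUEST_URI}} [R=301,L]
```

Per the advice above, swap `R=301` for `R=302` while testing so your browser doesn't cache mistakes, then flip it to 301 for launch.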
-
Do you have access to the httpd main server config file? If so, please read Apache HTTP Server Tutorial: .htaccess files.
".htaccess files should be used in a case where the content providers need to make configuration changes to the server on a per-directory basis, but do not have root access on the server system. In the event that the server administrator is not willing to make frequent configuration changes, it might be desirable to permit individual users to make these changes in .htaccess files for themselves. This is particularly true, for example, in cases where ISPs are hosting multiple user sites on a single machine, and want their users to be able to alter their configuration.
However, in general, use of .htaccess files should be avoided when possible. Any configuration that you would consider putting in a .htaccess file, can just as effectively be made in a <Directory> section in your main server configuration file."
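Concretely, moving redirects from .htaccess into the main config looks something like this (domain and paths are illustrative):

```apache
# In the vhost / main server config instead of .htaccess --
# parsed once at startup rather than on every request
<VirtualHost *:80>
    ServerName example.com

    # One-off redirects
    Redirect 301 /old-page /new-page

    # Pattern-based redirects for whole sections
    RedirectMatch 301 ^/old-section/(.*)$ /new-section/$1
</VirtualHost>
```

The trade-off is that changes to the main config require a server reload, whereas .htaccess edits take effect immediately — which is exactly why the Apache docs frame .htaccess as a tool for users without root access.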