How to handle potentially thousands (50k+) of 301 redirects following a major site replacement
-
We are looking for the very best way of handling potentially thousands (50k+) of 301 redirects following a major site replacement - and I mean total replacement.
Things you should know:
The existing domain has 17 years' history with Google, but rankings have suffered over the past year - and yes, we know why (the kicker is that we paid a good-sized SEO company for that ineffective and destructive work).
The URL structure of the new site is completely different, and SEO-friendly URLs rule. This means that many thousands of historical URLs (mainly dynamic ones) will attract 404 errors because they will not exist anymore. Most are product profile pages, and the God Google has indexed them all. There are also many external links pointing to them.
The new site is fully SEO optimised and is passing all tests so far - however, there is a way to go yet.
So here are my thoughts on the possible ways of meeting our need:
1: Create a 301 redirect for each and every page in the .htaccess file. That would be one huge .htaccess file, 50,000+ lines - I am worried about the effect on site speed.
2: Create a 301 redirect for each and every unused folder and wildcard the file names. This would be a single redirect for every file in each folder to a single redirect page, so the 404 issue is overcome, but the user doesn't land on the precise page they were after.
3: Write some code to create a hard copy 301 index.php file for each and every folder that is to be replaced.
4: Write code to create a hard copy 301 .php file for each and every page that is to be replaced.
5: We could just let the pages all die and list them with Google to advise of their death.
6: We could have the redirects managed by a database rather than .htaccess or individual redirect files. Probably the most challenging thing will be loading the data in the first place, but I assume this could be done programmatically - especially if the new URL can be inferred from the old (a rough sketch of how this might work follows below).
Maybe I am missing another, simpler approach - please discuss.
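To make option 6 a little more concrete, here is a minimal sketch of the per-request lookup in PHP. It assumes Apache routes unmatched requests to a single handler (e.g. via ErrorDocument 404) and a hypothetical legacy_redirects table with old_path and new_url columns - the table, column names and credentials are illustrative assumptions, not part of any existing setup.

```php
<?php
// Minimal sketch of a database-driven 301 handler (option 6).
// Assumes Apache sends unmatched requests here, e.g. with:
//   ErrorDocument 404 /redirect-handler.php
// Table, column and credential names are placeholders.

$requestPath = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

$pdo = new PDO(
    'mysql:host=localhost;dbname=site',
    'db_user',
    'db_pass',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

// Look the old path up in the mapping table. With an index on old_path the
// lookup stays fast even with 50k+ rows. Note: old dynamic URLs that differ
// only by query string would need the query string in the lookup key too.
$stmt = $pdo->prepare(
    'SELECT new_url FROM legacy_redirects WHERE old_path = :old_path LIMIT 1'
);
$stmt->execute([':old_path' => $requestPath]);
$newUrl = $stmt->fetchColumn();

if ($newUrl !== false) {
    // Permanent redirect, so search engines pass the old URL's equity on.
    header('Location: ' . $newUrl, true, 301);
    exit;
}

// No mapping found - serve a genuine 404.
http_response_code(404);
echo 'Page not found';
```

However the mapping table is populated (by hand, or generated programmatically from the old URL list), the per-request cost is then a single indexed lookup rather than a parse of a 50,000-line .htaccess file.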
-
Sorry to hear of your woes.
Depending on the structure of the URLs, you could create some simple pattern-matching rules within .htaccess. If you can, a few dozen rules could handle many thousands of redirects. If there isn't an easily identifiable pattern to match, then a database will, indeed, be your best option.
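For illustration only (the real patterns depend entirely on the old and new URL structures, so treat every path below as a made-up example), a couple of mod_rewrite rules like these in .htaccess can collapse whole classes of old URLs into 301s:

```apache
# Hypothetical examples - adjust the patterns to the actual URL structures.
RewriteEngine On

# Old dynamic product pages, e.g. /product.php?id=123 -> /products/123
RewriteCond %{QUERY_STRING} (?:^|&)id=(\d+)(?:&|$)
RewriteRule ^product\.php$ /products/%1? [R=301,L]

# An entire retired folder, e.g. /old-catalogue/anything -> /products/
RewriteRule ^old-catalogue/ /products/ [R=301,L]
```

Anything that can't be expressed as a pattern can then fall back to a database lookup.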
One of the web devs I used to work with (who was considerably smarter than me) faced a similar issue (with a 'mere' 10k+ redirects) and used some Ruby on Rails middleware as a redirector. This may have been the solution he used:
https://github.com/vigetlabs/redirector
I hope that helps.
I hope you're able to get this sorted without too much pain. Good Luck!
-
Thanks for the very quick response - you have picked my favourite solution. It will be interesting to hear other views and comments.
-
Hi,
Option 1 usually won't work well - with 50k extra rules in your .htaccess file it will definitely slow the site down, because every request to your server has to be run through the .htaccess file.
For now, with the information you've provided, I would recommend going with option 6. That way you can do a very quick lookup against your database and, within the same request, send the user to the right page.
Martijn.
Related Questions
-
Google Is Indexing my 301 Redirects to Other sites
Long story, but I now have a few links from my site 301 redirecting to YouTube videos or eCommerce stores. They carry a considerable amount of traffic that I benefit from, so I can't take them down, and that traffic comes from other websites - so basically I have backlinks from places that I don't own pointing to my redirect URLs (e.g. http://example.com/redirect). My problem is that Google is indexing them and won't let them go. I have tried blocking that URL in robots.txt, but Google is still indexing it uncrawled; I have also tried allowing Google to crawl it and adding noindex via robots.txt; and I have tried removing it from GWT, but it pops back again after a few days. Any ideas? Thanks!
Intermediate & Advanced SEO | cuarto7150
-
All URLs on the site are 302 redirected to themselves
Hi everyone, I have a problem with a website wherein all URLs (homepage and inner pages) are 302 redirected. This is based on a Screaming Frog crawl. The weird thing is that they are 302 redirected to themselves, which doesn't make any sense. Example:
https://www.example.com.au/ is 302 redirected to https://www.example.com.au/
https://www.example.com.au/shop is 302 redirected to https://www.example.com.au/shop
https://www.example.com.au/shop/dresses is 302 redirected to https://www.example.com.au/shop/dresses
Have you encountered this issue? What did you do to fix it? Would be very glad to hear your responses. Cheers!
Intermediate & Advanced SEO | alex_goldman
-
Splitting and moving site to two domains - How to redirect
I have a client who is going to split their retail and wholesale business and rebrand the retail biz. So let's say they are going to move everything from currentdomain.com to either retaildomain.com or wholesaledomain.com. The most important business for them is the retail site, so they want to pass on as much ranking power as they can from currentdomain.com to retaildomain.com. I see two choices here:
1. We can 301 redirect all of currentdomain.com to retaildomain.com, and then redirect any wholesale pages on to wholesaledomain.com. The advantage is that we can use GSC's change of address tool to report the change to Google. The downside is that there is a redirect chain (2 hops) to wholesaledomain.com. Would this confuse Google?
2. We can 301 redirect page by page from currentdomain.com to the appropriate page on either new site. This means no redirect chains, but it also means that we can't use GSC's change of address tool.
Which would you do, and why? And is there another option that I'm missing? I appreciate any insights you can share.
Intermediate & Advanced SEO | rich.owings1
-
Several 301 Redirects to Same Page
Hi, I have 3 pages we won't use anymore on our website. Let's call them URL A, URL B and URL C. To keep their SEO strength on our domain, I've thought about redirecting all of them to URL D. From what I understand, when 301 redirecting, about 85-90% of the link SEO juice is passed. So if I redirect 3 URLs to the same page, does URL D receive all of their link SEO juice added up (approximately)? e.g. future URL D juice = 100% current URL D juice + 85% URL A juice + 85% URL B juice + 85% URL C juice. Is this the best practice, or is there a better way? Cheers,
Intermediate & Advanced SEO | viatrading1
-
Proper 301 in Place but Old Site Still Indexed In Google
So I have stumbled across an interesting issue with a new SEO client. They recently launched a new website and implemented a proper 301 redirect strategy at the page level for the new domain. What is interesting is that the new website is now indexed in Google BUT the old website domain is also still indexed in Google. I even checked the Google cache date, and it shows the new website with a cache date of today. The redirect strategy has been in place for about 30 days. Any thoughts or suggestions on how to get the old domain de-indexed in Google and get all authority passed to the new website?
Intermediate & Advanced SEO | kchandler0
-
Geoip redirection, 301 or 302?
Hello all, let me first try to explain what our company does and what it is trying to achieve. Our company has an online store and sells products in 3 different countries, with two languages for each country. Currently we have one site, which is open to all countries; what we are trying to achieve is 3 different stores for these 3 different countries, so we can have better control over the prices in each country. We are going to use GeoIP to redirect each user to the local store for their country. The suggested new structure is to add sub-folders as follows:
www.example.com/ca-en
www.example.com/ca-fr
www.example.com/us-en
...
If a visitor is located outside these 3 countries, then she'll be redirected to the root directory www.example.com/en. We can't afford to expand our SEO team to optimize new pages for the local markets; it's not the priority for now, as the main objective is to be able to control the prices for each market. So, to eliminate the duplicate-content issue, we'll use canonical tags. Now, knowing our objective for the new URL structure, I have two questions:
1. Which redirect should we use, 301 or 302? If we choose 301, which version of the site will get the link juice (i.e. /ca-en or /us-en)? If we choose 302, will the link juice remain with the original links, and is it healthy to use 302 for long-term redirections?
2. Knowing that Google's bots come from US IPs, does that mean that the other versions of the site (e.g. www.example.com/ca-fr) won't be crawled? This is especially important for us as we are using AdWords, and unindexed pages will affect our Quality Score badly.
I'd like to know if you have another structure in mind that would be better than this proposed one. Your help is highly appreciated. Thanks in advance.
Intermediate & Advanced SEO | ajarad
-
How to stop Google crawling after 301 redirect?
I have removed all pages from my old website and set up 301 redirects to the new website. I have also verified the old website with Google Webmaster Tools' HTML verification file, which enables me to track all data and the existence of pages in Google search for my old website. I assumed that Google would stop crawling and de-index all pages after the 301 redirects, because I set them up 3 months ago. Yet I can still see Googlebot activity on my old website via Google Webmaster Tools. How is this possible, and how can Google crawl removed pages? You can see the attached images ("First" and "Second") to know more about it.
Intermediate & Advanced SEO | CommercePundit0
-
301 Redirects After Company Acquisition
We recently acquired a company, and now we are going to redirect all of the pages on their site to their respective pages on our site. Do we need to keep the original pages on their site active? For how long? Ideally, we would like to redirect everything and remove the old site entirely so we don't have to pay to keep hosting it. Is this possible? Thanks!
Intermediate & Advanced SEO | pbhatt1