What are the effects of having multiple redirects for pages under the same domain?
-
Dear Mozers,
First of all, let me wish you all a Very Happy, Prosperous, Healthy, Joyous & Successful New Year!
I'm trying to analyze a website, Web Hosting UK Com Ltd., and during this process I've had this question running through my mind. This project has been live since 2003, and since then there have (obviously) been changes made to the website. New pages have been added, and some existing pages have been overwritten, with changes to the URL structures as well.
Now, coming back to the question: if the site debuted with a particular URL structure and that structure has since been changed, say, three times, with a 301 redirect covering every outdated structure, WOULD it impact the site's performance SEO-wise? And if there are hundreds of such redirections under the same domain, don't you think that after a period of time we should remove the past pages/URLs from the server? That would certainly increase the 404 (page not found) errors, but that can be taken care of.
How sensible is it to keep redirecting the bots from one URL to another when they only visit a site for a short, limited time?
To make it simple, let me explain with a real-life scenario. Say I was staying at place A, then moved to a different location in another country, say B, then to C and so on, and finally settled at place G. Each time I move from one place to another, I leave a note of the next destination I'm moving to so that any courier/mail can be forwarded to my current whereabouts. In such a case, there's less chance that the courier would actually travel through all of those destinations to deliver the package. Similarly, when a bot visits a domain and finds multiple redirects, don't you think it would lose efficiency in crawling the site?
Of course, in my opinion the redirects are important, BUT they should only be there (in .htaccess) for a period of, say, 3-6 months. Once the search engine bots know about the latest pages, the past pages/redirects should be removed.
What are your opinions about this?
-
Both answers so far get to one of the points I was going to make: always update redirects so that there is no chain. But I wanted to add something else. You only need redirects as long as someone is linking to those pages. You should take the time to fix any internal references to changed URLs, and contact websites that link to the old URLs to ask them to update their links. That should be a part of any site URL change.
If you have only revised your URLs once, you only need redirects for 3-6 months while the search engines reindex everything. In that time, you should have changed all links to the old URLs.
In your case, I'd drop all old redirects except for the last one and see what 404s you get. Find the referring sites and contact them to change their links to your site. Once that is all done, you can work on this latest revision and change those links too.
Hope that helps!
-
It is always best to do a one-to-one redirect instead of a chain. As Federico said, there is some PageRank loss when doing a redirect (though the exact amount is debatable and may be negligible), and redirecting A to B to C compounds the problem. On top of that, too many redirects in a chain will lead Googlebot to stop crawling the chain. One or two is fine; three or more is not. In this older video http://youtu.be/r1lVPrYoBkA Matt Cutts starts talking about redirect chains at around 2:48 and mentions that one, two and maybe three in a chain is fine. This Whiteboard Interview from 2010 with Matt Cutts http://moz.com/blog/whiteboard-interview-googles-matt-cutts-on-redirects-trust-more also cites the figure of 1 or 2 301s in a chain.
So if you're redirecting A -> B -> C -> D -> E -> F... you're possibly hurting yourself. Where possible you should change the redirects so it's A to F, B to F, C to F, D to F and E to F.
As for removing the redirects after a certain number of months, I'd check to see how many people are still linking in with the older URL. You'd want to ask the sites linking in to update to the newest URL before you 404 it and lose those links. And if you're still getting tons of direct traffic coming in on an old 301, then you might want to do some digging and research before you cut off that traffic. Odds are that after a few months you won't be getting as much traffic coming through on the older URL, but there is always the possibility.
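To picture the fix for a chain like that: a minimal .htaccess sketch, assuming an Apache server, with /page-a through /page-f as purely hypothetical stand-ins for the old and current URL structures:

    # Hypothetical example: point every retired URL straight at the current page
    # instead of chaining A -> B -> C -> D -> E -> F
    Redirect 301 /page-a /page-f
    Redirect 301 /page-b /page-f
    Redirect 301 /page-c /page-f
    Redirect 301 /page-d /page-f
    Redirect 301 /page-e /page-f

Each old URL is then a single hop from the live page, and a future URL change only means updating the right-hand side of each existing rule plus adding one new rule.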
-
Every time you use a 301 redirect, some of the PageRank is diluted. So, following your example, when going from A to C you should redirect both A and B to C, not A -> B -> C, as that doubles the loss.
Redirects are just fine and, in my opinion, they should stay for as long as the pages being redirected still get organic traffic (backlinks, search, etc.). The moment you see no more traffic, and the links pointing to the redirected page have been fixed (pointed to the new page), you can safely remove the redirection. As for the amount of redirects, it won't be a problem if you have lots of them, unless you do multiple redirects from A to G, going from one page to the next until reaching the final, working version.
If that's not your scenario and A redirects directly to G, then you are fine. Monitor traffic on A and see if at some point you can remove the redirection; otherwise just leave it there. (I personally have redirects that have been in place for over 3 years, as the pages are still getting organic traffic, mainly from links.)
Hope that helps! And a happy new year to you too!
Related Questions
-
Paginated Pages Page Depth
Hi Everyone, I was wondering how Google counts page depth on paginated pages. DeepCrawl is showing our primary pages as being 6+ levels deep, but without the blog, or with an infinite scroll on the /blog/ page, I believe it would be only 2 or 3 levels deep. Using Moz's blog as an example, is https://moz.com/blog?page=2 treated as being on the same level, in terms of page depth, as https://moz.com/blog? If so, is it the rel="prev" (https://site.com/blog) and rel="next" (https://site.com/blog?page=3) markup that helps Google recognize this? Or does Google treat page depth the same way DeepCrawl shows it, with the blog posts on page 2 being +1 in page depth compared to the ones on page 1, for example? Thanks, Andy
Intermediate & Advanced SEO
-
Changing Domain Host and Potential SEO Effects
Hi Community, We are currently thinking of changing our domain hosting package and I am wondering if that could have any effect on rankings, SEO, etc. The thing here is that we aren't changing the hosting provider, we are only changing the package from the same provider. Most of the info I have found on the internet has been about changing the actual hosting provider rather than changing packages with the same provider (if that makes sense). If anyone knows whether this might affect our SEO and what steps we can take to safeguard against any potential pitfalls, please let us know. Thanks
Intermediate & Advanced SEO
-
Redirecting just the homepage of a site to another domain- good/bad idea?
TL;DR: As part of a corporate rebranding/restructuring, my parent company is asking me to redirect just the homepage of our website to another page on their website. How will this affect the rankings of all of the other pages on our site? I work for an organization (XYZ Corp) that is owned by another company (Big Conglomerate). XYZ Corp's main function is building custom-skinned microsites for marketing purposes that live on our domain in a traditional directory structure (no subdomains). This morning, I got a request to redirect XYZ Corp's homepage to live at bigconglomerate.com/xyzcorp, but all of our original microsites are to remain as is. Technically, I know how to accomplish this redirection. My question is: should I? Or should I fight this? I searched previous Q&As, but wasn't able to find anyone else who was concerned about losing search rankings for sub-pages due to losing their website's homepage. A few more details: the microsite pages are not linked to from the homepage; the microsites do not link back to the homepage; and we cannot move the microsites to bigconglomerate.com because everything that lives there is a cookie-cutter CMS page. This is my first question ever, please go easy on me! Thanks, --Mark
Intermediate & Advanced SEO
-
Is it worth redirecting an old domain name which was hacked to my new website?
I had a website which got hacked and had malware added to it. I have since closed that website down, but I still have the domain name. Prior to the malware, that domain name ranked incredibly well for its niche, had a good range of high-quality links pointing to it, and had a domain age of 6 years. I'm now creating a new website which is similar to the old one (the same but with a different platform and layout). Is it a good or bad idea to redirect the old domain name to the new website?
Intermediate & Advanced SEO
-
How to do a site migration followed by a domain migration and avoid 301 redirect chains?
Hi all, The current roadmap for our Eng team has us performing a site migration (redirecting one subfolder to another subfolder) and then a domain migration shortly after. The way I see it, I have 2 scenarios (the 1st involves the site migration THEN the domain migration, and the 2nd is the site migration and domain migration being done simultaneously):
1. olddomain.com/subfolder-old to olddomain.com/subfolder-new, THEN olddomain.com/subfolder-new to newdomain.com/subfolder-new AND olddomain.com/subfolder-old to newdomain.com/subfolder-new
2. olddomain.com/subfolder-old to newdomain.com/subfolder-new
I also understand that there are two best practices for a domain migration, and they are 1) keep everything the same that you can to help Google understand it is the same page, just on a different domain, and 2) avoid chain redirects. As you can imagine, scenario 1 requires more Eng costs than scenario 2. So, my question is, is scenario 2 a perfectly viable option or should I make the push to go for scenario 1? Any advice is greatly appreciated!
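Purely as an illustration of the single-hop option (scenario 2), assuming an Apache server with mod_alias and using the hypothetical subfolder names from the question, the rule on olddomain.com could look roughly like this:

    # Hypothetical sketch: send the retired subfolder straight to its new home
    # on the new domain, preserving the rest of the path, in one 301 hop
    RedirectMatch 301 ^/subfolder-old(/.*)?$ https://newdomain.com/subfolder-new$1

Scenario 1 would create two separate rules at different times, which is exactly where a redirect chain can creep in unless the first rule is later rewritten to point at the final destination.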
Intermediate & Advanced SEO
-
Redirecting non-www pages to www ones
Hello:
I'm trying to consolidate all the link juice and see that some of my pages are linked to using both www.mysite.com/whatever.html and mysite.com/whatever.html.
Is there a safe rewrite rule that not only redirects non-www URLs to their www versions, but also designates the redirect as a 301, so link juice is transferred as well? If not a RewriteRule, are there any other ways to accomplish this? And the last question: can this be solved by simply setting the preferred domain in Google Webmaster Tools to display the www URL? Any help will be appreciated.
Intermediate & Advanced SEO
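(A commonly used mod_rewrite pattern for this, assuming an Apache server and borrowing mysite.com from the question as the placeholder domain; this is an illustrative sketch, not the thread's own answer:)

    RewriteEngine On
    # Send any request on the bare (non-www) hostname to the www hostname
    # with a permanent (301) redirect, preserving the requested path
    RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
    RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
-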
Hyphen domain effect SEO?
Hi Guys, I am looking to buy some domains that have the keyword I want in them, but my question is: does using hyphens in a domain affect your SEO? Thanks Gareth
Intermediate & Advanced SEO
-
redirect 404 pages to homepage
Hello, I'm putting a new website on an existing domain. In order not to lose the links that point to the various old URLs, I would like to redirect them to the homepage. The old website was a mess, as there was no SEO and the pages didn't target any keywords. That's why I would like to redirect all links to home. What do you think is the best way to do this? I tried to add this in the .htaccess but it's not working: ErrorDocument 404 /index.php. Can you tell me how exactly it should look? Now the whole file is like this:
    # @package Joomla
    # @copyright Copyright (C) 2005 - 2012 Open Source Matters. All rights reserved.
    # @license GNU General Public License version 2 or later; see LICENSE.txt
    #
    # READ THIS COMPLETELY IF YOU CHOOSE TO USE THIS FILE!
    #
    # The line just below this section: 'Options +FollowSymLinks' may cause problems
    # with some server configurations. It is required for use of mod_rewrite, but may
    # already be set by your server administrator in a way that dissallows changing it
    # in your .htaccess file. If using it causes your server to error out, comment it
    # out (add # to beginning of line), reload your site in your browser and test
    # your sef url's. If they work, it has been set by your server administrator and
    # you do not need it set here.

    # Can be commented out if causes errors, see notes above.
    Options +FollowSymLinks

    # Mod_rewrite in use.
    RewriteEngine On

    # Begin - Rewrite rules to block out some common exploits.
    # If you experience problems on your site block out the operations listed below
    # This attempts to block the most common type of exploit attempts to Joomla!
    # Block out any script trying to base64_encode data within the URL.
    RewriteCond %{QUERY_STRING} base64_encode[^(]([^)]) [OR]
    # Block out any script that includes a
Intermediate & Advanced SEO
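(For reference on that last question, and this is an illustrative sketch rather than the answer given in the thread: ErrorDocument 404 /index.php asks Apache to serve the content of /index.php with a 404 status; it does not redirect anywhere. If the goal is to 301 specific retired URLs to the homepage, one hedged approach, assuming Apache mod_alias and purely hypothetical old paths, is a set of explicit rules such as:

    # Hypothetical examples: send specific retired URLs to the homepage
    # with a permanent (301) redirect
    Redirect 301 /old-page.html /
    Redirect 301 /old-section/old-article.html /

A blanket catch-all is riskier on a Joomla site, since its search-engine-friendly URLs are not real files on disk and can easily be swept up by a generic rule.)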