Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
What are the effects of having multiple redirects for pages under the same domain?
-
Dear Mozers,
First of all let me wish you all a Very Happy, Prosperous, Healthy, Joyous & Successful New Year!
I'm trying to analyze one of the websites, Web Hosting UK Com Ltd, and during this process this question has been running through my mind. The project has been live since 2003, and since then there have obviously been changes made to the website. New pages have been added, and some existing pages have even been overwritten, with changes to their URL structures too.
Now, coming back to the question: if the site launched with a particular URL structure and to date that structure has been changed, say, three times, with a 301 redirect set up for each outdated structure, would it impact the site's performance SEO-wise? And let's say there are hundreds of such redirects under the same domain; don't you think that after a period of time we should remove the old pages/URLs from the server? That would certainly increase the 404 (page not found) errors, but those can be taken care of.
How sensible would it be to keep redirecting the bots from one URL to another when they only visit a site for a limited amount of time?
To make it simple, let me explain it with a real-life scenario. Say I was staying at place A, then switched to a different location in another county, say B, then to C and so on, and finally settled at place G. Each time I move from one place to another, I leave a note of the next destination I'm moving to so that any courier or mail can be delivered to my current whereabouts. In such a case there is little chance that the courier would travel through all of those destinations to deliver the package. Similarly, when a bot visits a domain and finds multiple redirects, don't you think it would lose efficiency in crawling the site?
Of course, in my opinion the redirects are important, BUT they should be there (in the .htaccess) only for a period of, say, 3-6 months. Once the search engine bots know about the latest pages, the old pages/redirects should be removed.
What are your opinions about this?
-
Both answers so far get to one of the points that I was going to make (always update redirects so that there is no chain), but I wanted to add something else. You only need redirects for as long as someone is linking to those pages. You should take the time to fix any internal references to the changed URLs and contact the websites that link to the old URLs to ask them to update their links. That should be a part of any site URL change.
If you have only revised your URLs once, you only need the redirects for 3-6 months while the search engines reindex everything. In that time, you should have changed all links to the old URLs.
In your case, I'd drop all the old redirects except for the last one and see what 404s you get. Find the referring sites and contact them to change their links to your site. Once that is all done, you can work on this latest revision and change those links.
Hope that helps!
-
It is always best to do a one-to-one redirect instead of a chain. As Federico said, there is some PageRank loss with each redirect (though the exact amount is debatable and may be negligible), and redirecting A to B to C compounds the problem. On top of that, too many redirects in a chain will lead Googlebot to stop crawling the chain: one or two is fine, three or more is not. In this older video http://youtu.be/r1lVPrYoBkA Matt Cutts starts talking about redirect chains at around 2:48 and mentions that one, two and maybe three hops in a chain are fine. This Whiteboard Interview from 2010 with Matt Cutts http://moz.com/blog/whiteboard-interview-googles-matt-cutts-on-redirects-trust-more also mentions keeping chains to one or two 301s.
So if you're redirecting A -> B -> C -> D -> E -> F, you're possibly hurting yourself. Where possible you should change the redirects so it's A to F, B to F, C to F, D to F and E to F.
As for removing the redirects after a certain number of months, I'd check how many people are still linking in with the older URLs. You'd want to ask the sites linking in to update to the newest URL before you 404 it and lose those links. And if you're still getting tons of direct traffic coming in on an old 301, you might want to do some digging and research before you cut off that traffic. Odds are that after a few months you won't be getting as much traffic through the older URL, but there is always the possibility.
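To make the flattening concrete, here is a minimal .htaccess sketch (hypothetical paths; mod_alias Redirect directives assumed) where every legacy URL points straight at the current page instead of chaining through the intermediate versions:

# Avoid: a chain where each old URL only knows about the next revision.
# Redirect 301 /page-a /page-b
# Redirect 301 /page-b /page-c
# Redirect 301 /page-c /page-f

# Prefer: every legacy URL 301s directly to the current URL,
# so any visitor or bot makes at most one hop.
Redirect 301 /page-a /page-f
Redirect 301 /page-b /page-f
Redirect 301 /page-c /page-f
Redirect 301 /page-d /page-f
Redirect 301 /page-e /page-f

The chained version still works for users, but each extra hop is another chance for equity loss and for a crawler to give up partway.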
-
Every time you do a 301 redirect, some of the PageRank is diluted. So following your example, when going from A to C you should redirect both A and B to C, not A -> B -> C, as that doubles the loss.
Redirects are just fine, and in my opinion they should stay for as long as the pages being redirected still get organic traffic (backlinks, search, etc.). The moment you see no more traffic, and the links pointing to the redirected page have been fixed (pointing to the new page), you can safely remove the redirection. As for the number of redirects, it won't be a problem if you have lots of them, unless you do multiple redirects from A to G, hopping from one page to the next until reaching the final, working version.
If that's not your scenario and A redirects directly to G, then you are fine. Monitor traffic on A and see if at some point you can remove the redirection; otherwise just leave it there. (I personally have redirects that have been in place for over 3 years because the pages are still getting organic traffic, mainly from links.)
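If it is hard to tell from your normal analytics whether an old URL is still getting hits, one way to monitor it (a sketch only, assuming Apache with mod_setenvif and mod_log_config; CustomLog has to go in the server or virtual host config rather than .htaccess) is to tag requests to the legacy paths and write them to their own log:

# Flag any request that still arrives on a legacy URL pattern
# (hypothetical paths; adjust to your old structures).
SetEnvIf Request_URI "^/old-structure/" legacy_hit
SetEnvIf Request_URI "^/2009-catalogue/" legacy_hit

# Log only those flagged requests, so you can see at a glance
# whether the 301s are still being used before retiring them.
CustomLog logs/legacy-redirects.log combined env=legacy_hit

Once that log stays quiet for a while, removing the corresponding redirects is a much safer call.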
Hope that helps! And a happy new year to you too!
Related Questions
-
Multiple Markups on The Same Page - Best Solution?
Hi there! I have a website that is built in React (JavaScript), and I'm trying to use markup on my pages. They are mostly articles about general topics with common questions (about the topic), and for most articles I would like to use two markups: article markup + FAQ markup (for the questions in the article), or article markup + how-to markup. Can I do this, or will Google get confused since I have two @type values at the same time, for example "@type": "FAQPage" and "@type": "Article"? How should I think about it? I'm using https://schema.dev/ right now. Thanks!
Intermediate & Advanced SEO | Leowa
-
Htaccess - Redirecting TAG or Category pages
Hello fellow Mozzers, We have an issue redirecting some /TAG and /Category pages to inner pages. As an example we use: RedirectMatch 301 /category/Sample-Category(.*) https://OurDomain.com.au/New-Page//$1 That works well. The issue is we have other categories and tags that are named similarly to /Sample-Category. For example, if we try to redirect /Sample-Category-1 to /New-Page-1, it will not work and instead redirects to /New-Page. I assume this is because /Sample-Category is already being redirected, so anything after /Sample-Category, like -1 or -2 or -3, is not recognized. Does anyone know of a workaround?
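One way to work around this (a sketch only, reusing the hypothetical paths from the question): mod_alias applies RedirectMatch rules in the order they appear, so list the more specific slugs first, and anchor the general pattern so it no longer swallows /Sample-Category-1, /Sample-Category-2 and so on:

# Specific slugs first, so they match before the general rule.
RedirectMatch 301 ^/category/Sample-Category-1(.*)$ https://OurDomain.com.au/New-Page-1$1
RedirectMatch 301 ^/category/Sample-Category-2(.*)$ https://OurDomain.com.au/New-Page-2$1

# General rule last, anchored so it only matches the bare slug
# or the slug followed by a slash, not the -1 / -2 variants.
RedirectMatch 301 ^/category/Sample-Category(/.*)?$ https://OurDomain.com.au/New-Page$1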
Intermediate & Advanced SEO | Jes-Extender-Australia
-
Is it a problem to use a 301 redirect to a 404 error page, instead of serving directly a 404 page?
We are building URLs dynamically with Apache rewrite. When we detect that a URL matches one of the valid patterns, we serve a script which may then detect that the combination of parameters in the URL does not exist. If this happens, we produce a 301 redirect to another URL which serves a 404 error page. So my doubt is the following: do I have to worry about not serving a 404 directly, but redirecting (301) to a 404 page? Will this lead to the erroneous original URL staying longer in the Google index than if I served a 404 directly? Some context: it is a site with about 200,000 web pages, and we currently have 90,000 404 errors reported in Webmaster Tools (even though only 600 were detected last month).
Intermediate & Advanced SEO | lcourse
-
How to combine 2 pages (same domain) that rank for same keyword?
Hi Mozzers, A quick question. In the last few months I have noticed that for a number of keywords I am having 2 different pages on my domain show up in the SERP, always right next to each other (for example, position #7 and #8, or #3 and #4). So in the SERP it looks something like:
1) www.mycompetition1.com
2) www.mycompetition2.com
3) www.mywebsite.com/page1.html
4) www.mywebsite.com/page2.html
5) www.mycompetition3.com
Now, I actually need both pages since the content on both pages is different, but on the same topic. Both pages have links to them, but page1.html always tends to have more. So, what is the best practice to tell Google that I only want 1 page to rank? Of course, the idea is that by combining the SEO juice of both pages, I can push my way up to position 2 or 1. Does anybody have any experience in this? Any advice is much appreciated.
Intermediate & Advanced SEO | rayvensoft
-
301 redirection pointing to noindexed pages
I have rather an unusual situation where a recently launched affiliate site does not have any unique content, as it's all syndicated content. For that reason we are currently using the noindex,nofollow meta tags to keep the pages out of the search engines' index until we create unique content for the pages. The problem is that, due to a very tight timeframe with rebranding, we are looking at 301 redirecting (on a page-to-page basis) another high-authority legacy domain to this new site before we have had a chance to add unique content to it and remove the noindex,nofollow tags. I would assume that any link authority normally passed through the 301 would be lost in this scenario, but I'm uncertain of what the broader impact might be. Has anyone dealt with a similar scenario? I know this scenario is not ideal and I would rather wait until the unique content is up and the noindex tags are removed before launching the 301 redirect of the legacy domain, but there are a number of competing priorities at play outside of SEO.
Intermediate & Advanced SEO | LosNomads
-
When should you redirect a domain completely?
We moved a website over to a new domain name. We used 301 redirects to redirect all the pages individually (around 150 redirects). So my question is, when should we just kill the old site completely and just redirect (forward/point) the old domain over to the new one?
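When you do retire the old site entirely, the old domain's configuration can usually be reduced to something like the following sketch (hypothetical domains and paths; mod_rewrite in a .htaccess at the old domain's web root assumed): the page-to-page mappings stay at the top and a catch-all picks up everything else:

RewriteEngine On

# Specific page-to-page mappings first; [L] stops after the first match.
RewriteRule ^old-about\.html$ https://www.new-domain.com/about/ [R=301,L]
RewriteRule ^old-services\.html$ https://www.new-domain.com/services/ [R=301,L]

# Catch-all: anything not explicitly mapped goes to the new homepage.
RewriteRule ^ https://www.new-domain.com/ [R=301,L]

Keeping the specific rules in place means deep links continue to land on their exact equivalents for as long as the old domain stays registered.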
Intermediate & Advanced SEO | co.mc
-
What's the best way to redirect categories & paginated pages on a blog?
I'm currently re-doing my blog and have a few categories that I'm getting rid of for housecleaning purposes and crawl efficiency. Each of these categories has many pages (some have hundreds). The new blog will also not have new relevant categories to redirect them to (1 or 2 may work). So what is the best place to properly redirect these pages to? And how do I handle the paginated URLs? The only logical place I can think of would be to redirect them to the homepage of the blog, but since there are so many pages, I don't know if that's the best idea. Does anybody have any thoughts?
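For what it's worth, a pattern-based rule can handle a removed category and all of its paginated pages in one line. A sketch (hypothetical slugs and URLs; RedirectMatch from Apache's mod_alias assumed):

# Send a removed category, including /page/2/, /page/3/ and so on,
# to the closest surviving category...
RedirectMatch 301 ^/blog/category/old-category(/page/\d+)?/?$ https://example.com/blog/category/related-category/

# ...and categories with no good equivalent to the blog homepage.
RedirectMatch 301 ^/blog/category/retired-category(/page/\d+)?/?$ https://example.com/blog/

Redirecting large numbers of loosely related pages to the homepage can end up being treated much like a soft 404, so a relevant category (or simply letting the pages 404) is often the cleaner choice.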
Intermediate & Advanced SEO | kking4120
-
Cookies and redirects - what are the negative effects?
I am advising a client who wants to streamline their online customers' experience through the use of cookies. The first time someone visits mysite.com, they will visit the normal index page, be asked to identify themselves as a Personal or Business customer, and be taken through to the relevant page. This will result in a cookie being added. The next time they come back to mysite.com, the cookie will automatically direct them from the index page to mysite.com/personal/ or mysite.com/business/. My question is, what are the SEO implications of this, especially given the fact that the index page is their primary landing page for almost all organic traffic? Bots: I realise that Googlebot etc. do not store cookies, so this should result in no change from the bots' perspective (i.e. no redirect), but is it that simple? In effect we'll be showing the bot one thing and second-time-plus visitors something else. Is this not effectively cloaking? All advice gratefully received!
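For reference, a cookie-driven redirect of that kind is often set up along these lines (a sketch only, with a hypothetical cookie name and paths; mod_rewrite assumed, and a 302 rather than a 301 so the personalised hop is not cached as permanent):

RewriteEngine On

# Returning visitors who previously chose "Personal" skip the chooser page.
RewriteCond %{HTTP_COOKIE} (^|;\s*)customer_type=personal($|;) [NC]
RewriteRule ^$ /personal/ [R=302,L]

# Likewise for visitors who chose "Business".
RewriteCond %{HTTP_COOKIE} (^|;\s*)customer_type=business($|;) [NC]
RewriteRule ^$ /business/ [R=302,L]

# No cookie (first visits, and crawlers, which do not send cookies):
# the index page is served as normal.

The important property of this setup is that a crawler, which sends no cookies, takes exactly the same path as a cookie-less first-time visitor.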
Intermediate & Advanced SEO | seomasters