How do you explain the problem with several redirects to a client?
-
I have a client who has done a lot of link building and just migrated his site from an old platform to a more SEO-friendly one, but now he keeps moving pages around on the new site.
Old Site --> (301 redirect) --> New Site --> (301 redirect) --> Changed Page --> (301 redirect) --> Changed Page Again, etc.
All his changes are making a lot of extra work for me every month, and I feel he is wasting a lot of link juice.
How would you explain to the client why they shouldn't be using several redirects?
What can I do to make sure that they keep as much link juice as possible?
-
I have never worked for Google or any other search engine, so I want to make it clear that what follows is my best understanding of how the process works, and it is what I base my actions on. I feel my understanding is valid, but the examples could probably use a bit of work. I am always willing to entertain other ideas.
Crawlers find and explore links. They capture data and record it in a database. That data is then processed by the search engine. If Page A is indexed, the URL will show in SERPs as Page A. If you later 301 redirect Page A to Page B, then when the crawler discovers the 301 redirect, the search engine will update the URL in SERPs to Page B. With me so far?
Later you decide to 301 redirect Page B to Page C. When the search engine recognizes the redirect (i.e. the crawler discovers it), the URL will once again be updated in SERPs, this time to Page C. Any instances of the Page A or Page B URLs in the search engine's database will be displayed as Page C in SERPs.
Even though the search engine's database holds the correct URL to display in SERPs, crawlers are not given that information. As long as a link exists and a crawler can find it, the crawler will attempt to follow it, subject to the usual factors such as nofollow, crawl budget, etc. If you modify the initial redirect so it points from Page A straight to Page C, the crawler will detect the new header and the search engine will update its records accordingly.
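If you want to see exactly what the crawler sees, you can trace the chain yourself. Here is a minimal Python sketch using the requests library; the URL is just a placeholder for one of your client's old pages:

```python
import requests  # third-party: pip install requests

def trace_redirects(url):
    """Fetch a URL and print every redirect hop a crawler would follow."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:  # one entry per intermediate 3xx response
        print(f"{hop.status_code}  {hop.url}  ->  {hop.headers.get('Location')}")
    print(f"{response.status_code}  {response.url}  (final destination)")
    return len(response.history)

# Placeholder URL -- substitute one of the client's old page URLs.
hops = trace_redirects("http://www.example.com/page-a")
if hops > 1:
    print(f"{hops} chained redirects found; consider pointing the first "
          "URL directly at the final destination.")
```

Running this against a handful of the old URLs every month is also a quick way to catch new chains before they pile up.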
The above was written with respect to how the URL appears in SERPs, but the same process should apply to the backlinks as well. Rather than forwarding the backlinks from Page A to Page B and then on to Page C, those links would be forwarded directly to Page C.
So instead of it redirecting from A to B then C, we write a new redirect for A to C. Is this better? If so, why?
If you modify the existing redirect to go from Page A directly to Page C, that is better because it is a single redirect. It is better for your servers (fewer redirects to process), better for users (faster page loads), better for you (fewer redirects to manage and fewer opportunities for something to go wrong), and therefore better for search engines. You are rewarded for this improvement with a stronger flow of link juice.
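Purely as an illustration of what "go back and change the old redirects" means in practice, here is one way to flatten a chain programmatically. The URLs below are made up; a real redirect map would come from your server config, CMS export, or .htaccess rules:

```python
# Hypothetical redirect map for illustration only.
redirects = {
    "/page-a": "/page-b",      # redirect written during the migration
    "/page-b": "/page-c",      # page moved again later
    "/old-home": "/new-home",
}

def flatten(redirect_map):
    """Return a new map where every source points straight at its
    final destination, collapsing chains like A -> B -> C into A -> C."""
    flat = {}
    for source, target in redirect_map.items():
        seen = {source}
        while target in redirect_map:   # keep following the chain
            if target in seen:          # guard against redirect loops
                raise ValueError(f"Redirect loop involving {target!r}")
            seen.add(target)
            target = redirect_map[target]
        flat[source] = target
    return flat

print(flatten(redirects))
# {'/page-a': '/page-c', '/page-b': '/page-c', '/old-home': '/new-home'}
```

A side benefit of flattening the whole map at once is that it surfaces any accidental redirect loops before they bite.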
-
Thanks, Ryan!
Great answer and illustration!
A follow-up question: what happens if you go back and change the old 301 redirects?
So instead of it redirecting from A to B then C, we write a new redirect for A to C.
Is this better? If so, why?
-
Multiple redirects are a really bad idea and should be corrected whenever possible. The point I ask clients to understand is how multiple redirects amplify the loss of link juice. The numbers in the example below are simply how I explain it when asked; I don't have any solid math to back them up. As we all know, the exact process is kept secret.
Redirect #1 = lose 10% link juice
Redirect #2 = 1st hop loses 10%, 2nd hop loses 10% x 2 = 20%, total 30% loss
Redirect #3 = 1st hop loses 10%, 2nd hop loses 20%, 3rd hop loses 30%, total 60% loss
Redirect #4 = 100% loss (10% + 20% + 30% + 40%)
Again, the numbers are likely not that dramatic, but they help get site owners out of the mindset of "well, a 301 loses just a drop of link juice, so 3 or 4 redirects doesn't lose much." We know a site's trust factors diminish rapidly, in an amplified manner, a few links away from the source. We know PageRank on a site evaporates almost completely four links deep. Even top-PR sites like DMOZ and the Yahoo Directory have deep pages that are not indexed because not enough PR is passed through their links to reach them. It is logical to think the same concept applies to redirects; a redirect is just another form of following links.
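Just to visualize those illustrative numbers (again, this is my mental model, not actual search engine math), here is the compounding loss as a tiny Python calculation:

```python
def cumulative_loss(hops, base_penalty=0.10):
    """Total link juice lost over a chain, where hop N loses N x the
    base penalty -- the illustrative model above, not real search math."""
    return min(sum(base_penalty * n for n in range(1, hops + 1)), 1.0)

for hops in range(1, 5):
    remaining = (1 - cumulative_loss(hops)) * 100
    print(f"{hops} redirect(s) in the chain: {remaining:.0f}% link juice left")

# Output:
# 1 redirect(s) in the chain: 90% link juice left
# 2 redirect(s) in the chain: 70% link juice left
# 3 redirect(s) in the chain: 40% link juice left
# 4 redirect(s) in the chain: 0% link juice left
```

Whatever the real per-hop penalty is, the takeaway is the same: the loss compounds, so collapse every chain to a single redirect.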