Redirect between domains: any real number on how much link juice is lost?
-
Hi,
I'm thinking of rebranding my website and moving it to a new domain.
Of course I would implement page-to-page 301 redirects from old-domain.com to new-domain.com. I wonder if you have any real figures, based on your experiments, on how much link juice I could lose in the process, and whether it will take time for Google to re-crawl the new pages correctly.
I could get some of the backlinks changed as well, so they would point to the new domain. Cutts says at least the most important ones should get changed, but how many? And which are the most important?
Also, what if I move just a part of the website that has no backlinks? Supposedly it won't have any link juice to pass along, but of course all of those pages will be hosted on a brand-new domain that won't pass domain authority to them, so will I lose rankings for those pages?
Thanks for any help,
Best regards
-
From how you phrased your question, I thought you were considering moving only pages that didn't have any backlinks, not the entire subfolder. You want to keep your URLs consistent, and it's pretty easy to build 301 redirects around an entire subfolder.
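For example, a subfolder-wide 301 in Apache's .htaccess might look like this, with the domain and folder names standing in as placeholders:

```apache
# In .htaccess on old-domain.com: permanently redirect everything
# under /blog/ to the same path on the new domain.
RedirectMatch 301 ^/blog/(.*)$ https://new-domain.com/blog/$1
```

One rule covers every current and future URL in the folder, so you don't need a line per page.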
-
Thanks Erica, sounds reasonable.
I had seen that article, but a single experiment doesn't make it absolutely true.
I wonder why you said "you want to move all of it". Why not move just an entire subfolder? Do you think it could harm rankings for the entire site, or that the part that moved won't inherit link juice?
Thanks again,
-
A 301 is the best possible solution for moving your site while passing along your link juice. And you want to move all of it, not just the pages with backlinks.
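As a sketch, a whole-domain, page-to-page 301 in Apache's .htaccess could look like this (domain names are the placeholders from the question):

```apache
RewriteEngine On
# Redirect every request on old-domain.com to the same path
# on new-domain.com, preserving the path and query string, with a 301.
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://new-domain.com/$1 [R=301,L]
```

Because the rule maps each path to its twin on the new domain, every page moves, whether or not it has backlinks.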
Many people have asserted that you can see up to a 10% dip initially. (Most people only see dips lasting from a week to a month at most.) However, over the long term, with a better site and URL structure, you should see a rise in traffic. I have not seen anyone do predictive modeling on this.
Seer Interactive ran a test on this, "301 Redirect Test: How Much Link Juice are YOU Losing?", which saw no rankings lost, and favorable long-term outcomes for the improved site.
While I'd never put words in Cutts' mouth, I believe he was saying that if you can, getting your backlinks changed to your new site's URL is optimal. But obviously, this is not always possible. Instead, I'd concentrate on building new links to your website at its new URL. (One should always be working on getting backlinks as part of ongoing SEO anyway.)
Related Questions
-
Domain Level Redirects - HTTP and HTTPS
About 2 years ago (well before I started with the company), we did an http=>https migration. It was not done correctly. The http=>https redirect was never inserted into the .htaccess file. In essence, we have 2 websites. According to Google search console, we have 19,000 HTTP URLs indexed and 9,500 HTTPS URLs indexed. I've done a larger scale http=>https migration (60,000 SKUs), and our rankings dropped significantly for 6-8 weeks. We did this the right way, using sitemaps, and http and https GSC properties. Google came out recently and said that this type of rankings drop is normal for large sites. I need to set the appropriate expectations for management. Questions: How badly is the domain split affecting our rankings, if at all? Our rankings aren't bad, but I believe we are underperforming our backlink profile. Can we expect a net rankings gain when the smoke clears? There are a number of other technical SEO issues going on as well. How badly will our rankings drop (temporarily) and for how long when we add the redirect to the .htaccess file? Is there a way to mitigate the rankings impact? For example, only submitting partial sitemaps to our GSC http property? Has anyone gone through this before?
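For reference, the rule that was never added is typically something like this in .htaccess (a sketch, to be adapted to the server's existing rewrite rules):

```apache
RewriteEngine On
# Force HTTPS: permanently redirect any plain-HTTP request
# to the same host and path over HTTPS.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```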
Intermediate & Advanced SEO | Satans_Apprentice
-
Links: Links come from bizzare pages
Hi all, My question is related to links that I saw in Google Search Console. While looking at who is linking to my site, I saw that GSC lists some links coming from third-party websites, but these third-party webpages are not indexed and were not even put up by their owners. It looks like the owner never created these pages; they are not indexed (when you do a site: search in Google), but their URLs load content in the browser. Example: www.samplesite1.com/fakefolder/fakeurl. What exactly is this thing? To mention more details, the third-party website in question is a Wordpress website, which I guess has probably been hijacked. But how does one even get these types of pages/URLs up and running on someone else's website and then link out to other websites? I am concerned, as the content that I am getting a link from is adult content, and I will have to do some link cleansing soon.
Intermediate & Advanced SEO | Malika1
-
Site wide links - should they be nofollow or followed links
Hi We have a retail site and a blog that goes along with the site. The blog is very popular and the MD wanted a link from the blog back to the main retail site. However, as this is a site-wide link on the blog, am I right in thinking it really should be a nofollow link? The link is at the top of every page. Thanks in advance for any help
Intermediate & Advanced SEO | Andy-Halliday
-
Do 404 Pages from Broken Links Still Pass Link Equity?
Hi everyone, I've searched the Q&A section, and also Google, for about the past hour and couldn't find a clear answer on this. When inbound links point to a page that no longer exists, thus producing a 404 Error Page, is link equity/domain authority lost? We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the pages with little traffic...I'm concerned about overall domain authority of the site since that certainly plays a role in how the site ranks overall in Google (especially pages with no links pointing to them...perfect example is Amazon...thousands of pages with no external links that rank #1 in Google for their product name). Anyone have a clear answer? Thanks!
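For scale, keeping those hundreds of legacy redirects alive can be scripted from a simple old-to-new mapping rather than maintained by hand; a minimal sketch, with all URLs as hypothetical placeholders:

```python
# Generate Apache "Redirect 301" rules from a legacy-URL mapping.
# All URLs below are hypothetical placeholders.
legacy_map = {
    "/old-product-1": "/products/new-product-1",
    "/old-product-2": "/products/new-product-2",
}

def to_htaccess(mapping):
    """Render each old -> new pair as one Apache Redirect 301 line."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(to_htaccess(legacy_map))
```

Pasting the output into .htaccess keeps the legacy URLs answering 301 instead of 404, which is the scenario the question is weighing.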
Intermediate & Advanced SEO | M_D_Golden_Peak
-
Should 301 Redirects be used only in cross domains or also internally?
In the following video with Cutts: http://youtu.be/r1lVPrYoBkA he explains a bit more about 301 redirects but he only talks about cross sites. What about redirecting internally from a non-existing product in a store to a new similar existing product?
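By the internal case I mean a single rule like this in .htaccess, with the product paths as hypothetical examples:

```apache
# Discontinued product -> closest equivalent existing product,
# on the same domain (paths are hypothetical placeholders).
Redirect 301 /store/old-widget /store/new-widget
```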
Intermediate & Advanced SEO | BeytzNet
-
Where do I redirect a domain to strengthen another domain?
I've got a UK domain that I need to redirect to a US domain. Should I point it to the root domain or a landing page off the root, and what is the benefit of doing one over the other?
Intermediate & Advanced SEO | JCorp
-
Link Request Email on Site`s Link Pages
Hello I have assembled a list of web-sites that have a "Links" section listing people's favorite tools. Those pages have a link to my competitor. I know my tool is just as good, if not better, and I want to request a link. I'm thinking of sending an email asking for a link and offering a small amount of money for it. Questions: A) How much should I offer? Should I offer anything at all? B) Is there an email style that someone can suggest that has been tested and proven to work for this type of situation?
Intermediate & Advanced SEO | hellopotap
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions Specifically, I'm concerned that a) we're blocking the flow of link juice and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus