Redirecting One Page of Content on Domain A to Domain B
-
Let's say I have a nice page of content on Domain A, which is a strong domain. That page has a nice number of links from other websites and ranks on the first page of the SERPs for some good keywords.
However, I would like to move that single page of content to Domain B using a 301 redirect. Domain B is a slightly weaker domain, but it has better assets to monetize the traffic that visits this page of content.
I expect that the rankings might slip down a few places but I am hoping that I will at least keep some of the credit for the inbound links from other websites.
Has anyone ever done this? Did it work as you expected? Did the content hold its rankings after being moved?
Any advice or philosophical opinions on this?
Thank you!
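For context, the mechanics of the move itself are small. Here's a minimal sketch of building the kind of one-page rule involved, assuming an Apache server with mod_alias; the path and destination URL below are placeholders, not real domains:

```python
def single_page_301(old_path: str, new_url: str) -> str:
    """Build an Apache mod_alias rule that permanently moves one path.

    Both arguments are placeholders here -- substitute the real page
    path on Domain A and its new URL on Domain B.
    """
    return f"Redirect 301 {old_path} {new_url}"

# A rule for moving a hypothetical /great-guide page to the new domain:
rule = single_page_301("/great-guide", "https://www.domain-b.example/great-guide")
# -> "Redirect 301 /great-guide https://www.domain-b.example/great-guide"
```

The rule goes in Domain A's server config (or .htaccess); everything else in the question is about what Google does with it afterwards.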
-
Thanks bookworm! Interesting ideas.
-
Would it not be easier to get the lead gen / advertising / monetization stuff to transfer over to the other domain as well? Presumably, if it's a choice-of-where-to-advertise thing, you can influence them to see the light; if it's that B has a stronger brand or a greater ability to convert, then perhaps a moderate rebranding to tie the two together (and thus let you use testimonials from A on B) could work? It's easier to control logos and colors than Google's algo...
-
Jordan, Thank you for sharing your experience on this. Glad to hear you are getting good results.
E
-
I look forward to seeing more input on this issue too. I am having the same issue as we speak. One of my smaller (but older) network sites is outranking the bigger site for a particular (big) keyword that I want to rank for. I keep thinking about doing something with it, but I am hesitant. I am also a bit greedy, as I think I could get both sites on the first page with some more work.
For now I am going to go back and optimize the smaller site so that it will be more suited for monetization.
-
I have done numerous single-page 301s and had pretty promising results. Much of the link juice seemed to flow uninhibited, and after a few weeks SERP rankings started to see pretty "explosive" (used very contextually here) growth. Similar results to a general content migration, but you can get the Google juice flowing a bit better by using a 301 "seed" that's already established.
-
I thought about leaving it there and launching brand-new content on Domain B... but I need to get the action on the more appropriate domain before summer traffic arrives.
-
Ha! I was in the same situation recently and eventually decided to keep the page where it is and earn links toward the new resource rather than fiddling with 301 redirects. The reason was that both pages could convert (on both domains). I could have added a canonical pointing to the page on B, but that didn't seem fair either, so I just left it as it is.
-
Thanks Steve,
I agree.
I have redirected a few domains with good to great results... just never tried it with a single page.
-
I suppose it's not really much different from moving a website from one domain to another; it's just a single page instead of a whole site. I've done that before under instruction from clients. I cautioned against it because the older domain had a bit of trust through age and the new domain was brand new, but thankfully for them the older domain didn't rank that well anyway, so it wasn't too much of a loss. It happens any time a company changes its name, and therefore its domain, for branding purposes. In that sense, whether it's a good idea depends on just how strong the new domain is compared to the older one.
Related Questions
-
Ranking 2 pages on the same domain in the same SERP
I thought it was generally said that Google will favour one page per domain for a particular SERP, but I have seen examples where that is not the case (i.e. the same domain ranking two different pages on the first page of the SERPs). Are there any "tricks" to taking up two first-page SERP positions, or am I mistaken in thinking this doesn't normally happen?
Intermediate & Advanced SEO | Ullamalm
-
Duplicate Page Content - Shopify
Moz reports that there are 1,600+ pages on my site (Sportiqe.com) that qualify as Duplicate Page Content. The website sells licensed apparel, causing shirts to go into multiple categories (i.e. LA Lakers shirts would be categorized in three areas: Men's Shirts, LA Lakers Shirts and NBA Shirts). It looks like "tags" are the primary cause behind the duplicate content issues.

Collection tags, for example:
http://www.sportiqe.com/collections/la-clippers-shirts (preferred URL)
http://www.sportiqe.com/collections/la-clippers-shirts/la-clippers (URL w/ tag)
http://sportiqe.com/collections/la-clippers-shirts/la-clippers (URL w/ tag, w/o the www.)
http://sportiqe.com/collections/all-products/clippers (different collection, w/ tag and same content)

Blog tags, for example:
http://www.sportiqe.com/blogs/sportiqe/7902801-dispatch-is-back
http://www.sportiqe.com/blogs/sportiqe/tagged/elias-fund

Would it make sense to do 301 redirects for the collection tags and use the Parameter Tool in Webmaster Tools to exclude blog post tags from their crawl? Or is there a possible solution with the rel=canonical tag? Appreciate any insight from fellow Shopify users and the Moz community.
Intermediate & Advanced SEO | farmiloe
-
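If the fix for the question above is canonicalization, the target-URL rule can be sketched in Python. This is purely illustrative (it is not Shopify's actual behavior) and assumes tagged URLs always take the /collections/&lt;collection&gt;/&lt;tag&gt; shape shown in the examples:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_collection_url(url: str) -> str:
    """Map a tagged or non-www collection URL to its preferred form.

    Assumed policy: force the www. host, and strip the trailing tag
    segment from /collections/<collection>/<tag> paths.
    """
    parts = urlsplit(url)
    host = parts.netloc if parts.netloc.startswith("www.") else "www." + parts.netloc
    segs = [s for s in parts.path.split("/") if s]
    if len(segs) == 3 and segs[0] == "collections":
        segs = segs[:2]  # drop the tag segment
    return urlunsplit((parts.scheme, host, "/" + "/".join(segs), "", ""))

# Tagged, non-www variant collapses to the preferred URL:
canonical_collection_url("http://sportiqe.com/collections/la-clippers-shirts/la-clippers")
# -> "http://www.sportiqe.com/collections/la-clippers-shirts"
```

Whether that mapping is implemented as 301s or as rel=canonical tags, the point is that every variant should resolve to one preferred URL per collection.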
Redirecting a redirect - thoughts?
Hi! A client has just had 14k 404s pop up in his WMT. I think this is because a page that they redirected to has since moved. My question is: can I clean these up by redirecting the page the original redirect pointed to? If so, will it have any negative impact?
Intermediate & Advanced SEO | neooptic
-
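Cleaning up a redirect chain like the one above generally means re-pointing every old source straight at the final destination rather than stacking redirects. A rough sketch of collapsing a rule set, with hypothetical paths:

```python
def flatten_redirects(rules: dict) -> dict:
    """Collapse redirect chains so every source maps to its final target.

    `rules` maps old path -> new path. Cycles are guarded against by
    tracking visited paths.
    """
    flat = {}
    for src in rules:
        dst, seen = rules[src], {src}
        while dst in rules and dst not in seen:
            seen.add(dst)
            dst = rules[dst]
        flat[src] = dst
    return flat

# /old redirected to /mid, which later moved to /new:
flatten_redirects({"/old": "/mid", "/mid": "/new"})
# -> {"/old": "/new", "/mid": "/new"}
```

With the flattened map in place, the 404s disappear and no visitor or crawler has to hop through the intermediate URL.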
How to Fix Duplicate Page Content?
Our latest SEOmoz crawl reports 1,138 instances of "duplicate page content." I have long been aware that our duplicate page content is likely a major reason Google has devalued our Web store. It is the result of the following:

1. We sell audio books and use the publisher's description (narrative) of the title. Google is likely recognizing the publisher as the owner/author of the description and ours as duplicate content.
2. Many audio book titles are published in more than one format (abridged, unabridged CD, and/or unabridged MP3) by the same publisher, so the basic description on our site would be the same for each format = more duplicate content at our Web store.

Here are two examples (one abridged, one unabridged) of one title at our Web store: Kill Shot - abridged, Kill Shot - unabridged. How much would the body content of one of the above pages have to change so that a SEOmoz crawl does NOT say the content is duplicate?
Intermediate & Advanced SEO | lbohen
-
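One rough way to gauge how far apart two descriptions are before and after a rewrite is a plain text-similarity ratio. The threshold any given crawler uses is not public, so treat this only as a relative gauge, not a pass/fail test:

```python
from difflib import SequenceMatcher

def body_similarity(a: str, b: str) -> float:
    """Return a 0.0-1.0 similarity score between two page bodies.

    Illustrative only: crawlers use their own (undisclosed) duplicate
    detection, but a falling ratio shows a rewrite is diverging.
    """
    return SequenceMatcher(None, a, b).ratio()
```

Re-running this on the abridged vs. unabridged descriptions as you rewrite them gives a cheap way to confirm the two bodies are actually drifting apart.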
Multiple 301 Redirects for the Same Page
Hi Mozzers, What happens if I have a trail of 301 redirects for the same page? For example:

SiteA.com/10 --> SiteA.com/11 --> SiteA.com/13 --> SiteA.com/14

I know I lose a little bit of link juice by 301 redirecting. The question is, would the link juice for the example above look like this: 100% --> 90% --> 81% --> 72.9%? Or just: 100% --> 90%? Does this link juice refer to juice from inbound links or links between internal pages on my site? Thanks!
Intermediate & Advanced SEO | Travis-W
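The per-hop-loss model the question describes can be written out directly. Note the ~10%-per-hop figure is the questioner's assumption, not a published number:

```python
def equity_after_hops(hops: int, retained_per_hop: float = 0.9) -> float:
    """Equity left after a chain of 301s, if each hop passes a fixed share.

    Under this (assumed) model, loss compounds per hop rather than being
    applied once for the whole chain.
    """
    return retained_per_hop ** hops

# Three chained hops under the assumed 90%-per-hop model:
equity_after_hops(3)  # -> 0.729, i.e. the 72.9% figure in the question
```

So the two scenarios in the question are exactly `retained ** hops` versus `retained ** 1`; which one Google actually applies is the open question being asked.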
Sites with dynamic content - GWT redirects and deletions
We have a site that has extremely dynamic content. Every day they publish around 15 news flashes, each of which is set up as a distinct page with around 500 words. File structure is bluewidget.com/news/long-news-article-name, with no timestamp in the URL. After a year, that's a lot of news flashes.

The database was getting inefficient (it's managed by a ColdFusion CMS), so we started automatically deleting news flashes from the database, which sped things up. The problem is that Google Webmaster Tools is detecting the freshly deleted pages and reporting large numbers of 404 pages. There are so many 404s that it's hard to see the non-news 404s, and I understand that having that many missing pages would be a negative quality indicator to Google.

We were toying with setting up redirects, but the volume would be so large that it would slow the site down again to load a large htaccess file for each page. Because there isn't a datestamp in the URL, we couldn't create a mask in the htaccess file automatically redirecting all bluewidget.com/news/yymm* to bluewidget.com/news.

These long-tail pages do send traffic, but for speed we only want to keep the last month of news flashes at most. What would you do to avoid Google thinking it's a poorly maintained site?
Intermediate & Advanced SEO | ozgeekmum
-
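One commonly suggested alternative to mass redirects for deliberately removed pages is returning 410 Gone instead of 404, which signals the removal is intentional. A sketch of that routing policy, using a hypothetical 30-day retention window:

```python
from datetime import date, timedelta

def news_response(published: date, today: date, keep_days: int = 30) -> int:
    """Pick an HTTP status for a news flash under an assumed retention policy.

    Recent flashes are served normally (200); flashes past the retention
    window return 410 Gone to mark deliberate removal.
    """
    if today - published <= timedelta(days=keep_days):
        return 200
    return 410
```

This keeps the decision in application code rather than a giant htaccess file, which sidesteps the file-size concern described above.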
How many pages to 301 Redirect
Hi Mozzers, My site has 11,200 pages indexed in Google, and I'm looking to remove some of the lesser content, which should probably have been picked up by Panda. These pages work out to about 1,100 in total, and I'm not sure whether to remove them bit by bit or in one fell swoop. Does Google not like a site's indexed page count fluctuating too quickly? Are there any other considerations I should be aware of? Thanks!
Intermediate & Advanced SEO | panini
-
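If the gradual route is chosen, splitting the removal list into staggered batches is trivial to script. A small illustrative helper (the batch size is a judgment call, not a known Google threshold):

```python
def batches(urls: list, size: int) -> list:
    """Split a removal list into fixed-size batches to stagger deindexing."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

# 1,100 URLs in batches of 200 gives six removal rounds:
rounds = batches(list(range(1100)), 200)
# -> len(rounds) == 6, last batch holds the remaining 100
```

Each round can then be removed and the next started once crawl reports look stable.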
Google consolidating link juice on duplicate content pages
I've observed some strange findings on a website I am diagnosing, and they have led me to a theory that seems to fly in the face of a lot of thinking. My theory is: when Google sees several duplicate content pages on a website and decides to show just one version, it at the same time aggregates the link juice pointing to all the duplicate pages, and ranks the one version it shows as if all of that juice pointed at it. E.g.:

Link X -> Duplicate Page A
Link Y -> Duplicate Page B

Google decides Duplicate Page A is the one that is most important and applies the following formula to decide its rank: Link X + Link Y (minus some dampening factor) -> Page A.

I came up with the idea after I seem to have reverse engineered this: the website I was sorting out for a client had this duplicate content issue, so we decided to put unique content on Page A and Page B (not just one pair like this, but many). Bizarrely, after about a week all the Page A's dropped in rankings, indicating that the old consolidated link value may have been correctly re-associated with the two separate pages, so now Page A only gets the value of Link X. Has anyone got any test/analysis to support or refute this?
Intermediate & Advanced SEO | James77
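The conjecture above can be stated as a formula. Purely illustrative: the dampening factor is unknown, and 0.85 below is a placeholder, not a known Google value:

```python
def consolidated_score(link_values: list, damping: float = 0.85) -> float:
    """The poster's conjecture: the single shown duplicate receives the
    damped sum of link equity pointing at ALL duplicate versions.

    After de-duplication (unique content on each page), Page A would
    instead get only its own links' value: damping * link_values[0].
    """
    return damping * sum(link_values)
```

Under this model, the observed ranking drop follows directly: Page A fell from `damping * (X + Y)` back to `damping * X` once the pages stopped being treated as duplicates.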