How far can I push rel=canonical?
-
My plan: three sites with identical content, but--wait for it--for every article on topic A, the pages on all three sites carrying that article will have a rel=canonical tag pointing to the copy on Site A. For every article on topic B, the pages on all three sites carrying that article will have a rel=canonical tag pointing to the copy on Site B.
So Site A will have articles on topics A, B, and C. For pages with articles on topic A, the rel=canonical will point to the page it's on (a self-referencing canonical). For pages with articles on topic B, the rel=canonical will point to the version of that article on Site B. And so on.
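As a concrete sketch of the setup (the domain names and paths below are placeholders, not from the actual sites), each page's `<head>` would carry tags like these:

```html
<!-- Topic-B article as published on Site A: the canonical points
     cross-domain at Site B's copy, which should receive the credit -->
<link rel="canonical" href="https://site-b.example/articles/some-topic-b-article">

<!-- The same article as published on Site B: a self-referencing canonical -->
<link rel="canonical" href="https://site-b.example/articles/some-topic-b-article">
```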
I have my reasons for planning this, but you can see more or less that I want each site to rank for its niche, yet I want the users at each site to have access to the full spectrum of articles in the shared articles database without having to leave a given site.
These would be distinct brands with distinct Whois, directory listings, etc. etc.
The content is quality and unique to our company.
-
I think I'd start slowly in that case. Keep the relationship aspect in mind, too. Even if all three companies know the writer/client and are aware of the relationship, sooner or later one of these articles is going to take off. If one site gets the SEO credit and the other two sites aren't ranking, there may be friction. Even if the work is spread out evenly and all high-quality, you don't control (ultimately) what content finally sticks and is successful. I just think things could get weird all-around if you send every article three places and only one gets credit.
-
These are technically different companies with different products, all of which are in the securities industry. They were each founded by different groups of individuals; however, my client is common to all of them and happens to be a fantastic writer. Many of the articles would add value to the readers of some of the other sites. I am hoping to develop a common command center so that, in the editor for a given article, he can just check off which of his sites the article will be published on, and which is to be considered canonical. So the sites will have different aesthetics and navigation, product pages, and other company-specific content, and not every article will show up on every site, but many will show up on multiple sites.
The idea of phasing in common articles with the cross-domain canonical strikes me as wise, and then just noindexing the non-canonical versions if I run into trouble.
-
Ah, understood. So, yes, in theory cross-domain canonical does handle this. I know major newspapers that use it for true syndication. There is risk, though, depending on the sites and content, and there is a chance Google will ignore it (more so than an in-domain canonical). So, I mostly wanted you to be aware of those risks.
META NOINDEX is safer, in some respects (Google is more likely to honor it), but if people start linking to multiple versions of the content, then you may lose the value of those inbound links on the NOINDEX'ed content. Since it's not showing up in search results, that's less likely (in other words, people are going to be most inclined to link to the canonical version), but it's a consideration.
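For reference, the META NOINDEX alternative described here is a one-line robots meta tag in the `<head>` of each non-canonical copy (a minimal sketch):

```html
<!-- Non-canonical copy: kept out of search results entirely.
     The trade-off noted above: inbound links pointing at this URL
     may pass less value than they would under a rel=canonical setup. -->
<meta name="robots" content="noindex">
```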
It's really tough to give a recommendation without understanding the business model, but if you absolutely have to have separate sites and you feel that this content is valuable to the visitors of all three sites, then cross-domain canonical is an option. It's just not risk-free. Personally, I'd probably start with unique content across the three domains, then phase in the most useful pieces as duplicates with canonical. Measure and see how it goes. Don't launch 1,000 duplicates on three sites in one day.
-
Budget not an issue, although skilled labor is.
-
Very helpful, thank you!
There is in fact a legal reason why the sites must be distinct from each other and strong marketing reasons why we do need more than one site.
I should mention that although the pages hosting the shared articles will be 99% identical, each site will have other content distinct from the others.
I am open to dropping my idea to share an article database between the sites and just having unique content on each, although I have to wonder what the use of cross-domain canonical is, if not to support this kind of article syndication.
-
Completely agree with Dr. Peter. If you really need to separate those domains, there should be a really good reason.
In the past I had many EMD (exact-match domain) sites to get easy traffic from the domain-name boost in the SERPs, so those sites ranked without much effort. But since Google moved toward favoring brands, this kind of strategy has become really time- and money-consuming.
It really depends on how much budget you can spend on those sites, but these days consolidating the value into one bigger site is usually the best way to build a brand and earn links and rankings.
-
I tend to agree - you always run the risk with cross-domain canonical that Google might not honor it, and then you've got a major duplicate content problem on your hands.
I think there's a simpler reason, in most cases, though. Three unique sites/brands take 3X (or more, in practice) the time and energy to promote, build links to, build social accounts for, etc. That split effort, especially on the SEO side, can far outweigh the brand benefits, unless you have solid resources to invest (read that "$$$").
To be fair, I don't know your strategy/niche, but I've just found that to be true 95% of the time in these cases. Most of the time, I think building sub-brands on sub-folders within the main site and only having one of each product page is a better bet. The other advantage is that users can see the larger brand (it lends credibility) and can move between brands if one isn't a good match.
The exception would be if there's some clear legal or competitive reason the brands can't be publicly associated. In most cases, though, that's going to come with a lot of headaches.
-
Hi all, I think your alternatives would be:
- One big site covering all the topics. That way all users can access all content without leaving the site, and there's no need for noindex or canonicals since you won't have duplicate content.
- Three sites with specialized articles on each one. You could slightly vary the design to give the user the feeling that the sites are different but part of the same network, then interlink the sites as useful resources. Not optimal, since they'll be heavily interlinked.
- As you said, noindex the non-canonical articles. Remember that the noindex tag prevents indexing, not crawling: Google still needs to crawl the page to discover that it should not index it. So you could add a meta "noindex,nocache,follow" tag in the header and be sure that the juice is still flowing through your site.
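A minimal sketch of the tag suggested in the last bullet (shown here with just the two directives I'm certain Google documents, noindex and follow):

```html
<!-- Non-canonical article copy: not indexed, but its links are still followed.
     Google has to be able to crawl this page to see the tag at all,
     so don't also block the URL in robots.txt. -->
<meta name="robots" content="noindex, follow">
```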
-
Hmm, ok that's helpful.
The content would be identical with the possible exceptions of a very slightly different meta title and site footer.
What's my alternative to a setup like this? One site, one brand? Noindex the non-canonical article versions?
What I dislike about noindex is that it means inbound links to the non-canonical article versions bring me no benefit.
-
I believe you are playing with fire here... to me this looks like you are trying to manipulate search engines.
If you read the "About rel=canonical" article in Google Webmaster Support, you will see they say the rel="canonical" link element is treated as a hint, not an absolute directive.
Also, the same article specifies that rel="canonical" should be used on pages with identical content. Are you sure that in your case the pages are identical as a whole, or just the articles on them?