Secretly back-linking from whitelabel product
-
Let's say a company (provider.com) offers a whitelabel solution that lets each client host all of the content on their own domain (product.client.com), with no branding from the content provider.
Now let's say that client.com is a site with a lot of authority, and to promote the launch of product.client.com, it places a lot of links from its main site to the subdomain. That can be very valuable link juice, and provider.com would like to take advantage of it. The problem is that client.com wouldn't like it if provider.com put links on the whitelabel site.
Suppose the following:
All pages on product.client.com start to have a rel="canonical" link pointing to themselves with an extra GET parameter (e.g. product.client.com/page.html canonicalizes to product.client.com/page.html?show_extra_link=true)
When the page is visited with the extra GET parameter "show_extra_link", a link to provider.com appears in the footer
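To make the proposed setup concrete, here is a minimal sketch of the server-side logic (the function and parameter names are hypothetical; the thread doesn't show the whitelabel platform's actual code):

```python
def render_page(host, path, show_extra_link=False):
    """Render a whitelabel page whose canonical URL is the
    ?show_extra_link=true variant of itself (hypothetical sketch)."""
    canonical = f"https://{host}{path}?show_extra_link=true"
    head = f'<link rel="canonical" href="{canonical}">'
    footer = ""
    if show_extra_link:
        # The extra footer link only appears on the canonical variant.
        footer = '<a href="https://provider.com">provider.com</a>'
    return f"<html><head>{head}</head><body>...<footer>{footer}</footer></body></html>"

# The plain URL carries no provider link but canonicalizes to the variant that does:
plain = render_page("product.client.com", "/page.html")
variant = render_page("product.client.com", "/page.html", show_extra_link=True)
```

The point of the sketch is just that both variants emit the same canonical tag, while only the canonical target contains the extra link.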
My question is: would this have the same effect for provider.com as placing the link directly on the non-canonical version of the pages on the whitelabel site?
-
I'm with Alan - in theory, the canonical would pass the link-juice to the version with the link, but you're not only misleading the client - you're one step away from cloaking the link. You could actually get your own clients penalized for this, and that seems very short-sighted.
Add the NOINDEX on top of this, and I'd be willing to bet that the value of these links would be very low. Even client-approved, followed white-label pages with footer links are the types of links we're seeing get devalued - they're just too easy to get. Now, you add these links all at once, NOINDEX the page, and canonical to a weird variant, and you've painted a very suspicious picture for Google. It might work for a while, but you're taking a significant risk for potentially a very small gain.
-
I would say the canonical.
If the pages are not indexed but followed, then they would have no value themselves unless they had incoming links. If they do have incoming links, then yes, they will pass link juice, but only from the canonical, I would think, based on what I said above about a canonical being much like a 301.
-
Hi Alan,
All of the pages on the subdomain have a robots meta tag with noindex, follow on them. The pages are only used for data collection (forms), and the clients do not want their pages showing up in Google, which is why extracting link juice shouldn't be a problem. As such, the canonical URL need not be indexed.
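For reference, the head of each form page in this setup would combine the robots directive with the canonical, along these lines (a sketch of the markup described above, not quoted from the actual site):

```python
def head_tags(canonical_url):
    # noindex keeps the page out of Google's index; follow still lets
    # its links pass value, which is what the scheme relies on.
    return (
        '<meta name="robots" content="noindex, follow">\n'
        f'<link rel="canonical" href="{canonical_url}">'
    )

tags = head_tags("https://product.client.com/page.html?show_extra_link=true")
```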
From what I understand, if a page has duplicate content and specifies a rel=canonical URL, the inbound link juice effectively gets siphoned into the original content page. What I'm wondering is: which page does Google use for the purpose of propagating outbound link juice?
-
With rel=prev/next, the content of every page is attributed to page 1, so in that case the link would be part of the content. But with a canonical I am not sure.
If you go by comments from Matt Cutts and Bing's Duane Forrester, canonicals are the same as a 301, except they do not physically move the visitor to the canonical page. So in the case of a canonical, the content would not be merged: only the content on the canonical page would be indexed, and the links from the other versions would be redirected. So the link on the show_extra_link version of the page would not be indexed.
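The "canonical works like a 301" reading can be sketched as a toy model (this is only a conceptual illustration of the claim above, not how Google actually computes anything):

```python
def consolidate(pages, canonical_of):
    """Toy model: inbound value of every variant is credited to its
    canonical target, and only the outbound links that appear on the
    canonical page itself are kept."""
    credit = {}
    for page, info in pages.items():
        target = canonical_of.get(page, page)  # pages without a canonical map to themselves
        credit[target] = credit.get(target, 0) + info["value"]
    # Only the canonical pages' own outbound links survive consolidation.
    kept_links = {p: pages[p]["outbound"] for p in credit}
    return credit, kept_links

pages = {
    "/page.html": {"value": 10, "outbound": []},  # links from client.com land here
    "/page.html?show_extra_link=true": {"value": 0, "outbound": ["provider.com"]},
}
canonical_of = {"/page.html": "/page.html?show_extra_link=true"}
credit, kept = consolidate(pages, canonical_of)
```

Under this reading, the variant inherits the inbound value and its footer link is the one that counts, which is exactly what makes the scheme attractive to provider.com and suspicious to Google.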
As for the morality of this: I would not do it. You are not being honest with the client, and you would be caught out sooner or later when the URL was seen in the index (if it was indexed).