Duplicate content, website authority and affiliates
-
We've got a dilemma at the moment with the content we supply to an affiliate. We currently supply the affiliate with our product database, which includes everything about a product: the price, title, description, and images. The affiliate then lists the products on their website and provides a Commission Junction link back to our ecommerce store; purchases are tracked via a cookie, and the affiliate gets a commission on any sales.
This has been very successful for us in terms of sales, but we've noticed a significant dip in rankings over the past year, whilst the affiliate has hit a peak... all signs point towards the Panda update.
Whenever I type one of our 'uniquely written' product descriptions into Google, the affiliate website appears higher than ours, suggesting Google has ranked them as the authority.
My question is: without writing unique content for the affiliate or changing the Commission Junction link, what would be the best way to be recognised as the authority for the content, which we wrote in the first place? It always appears on our website first, but Google seems to position the affiliate higher than us in the SERPs after a few weeks. The Commission Junction link is written like this:
-
It seems like maybe we're getting off topic. Why does the affiliate have to suffer for the merchant to succeed, or vice versa? The real problem here seems to be that you are giving your affiliates the same content you use on your own site. Either make them write their own content, or change what's on your site and feed them the old content. It is more work, but you could start slowly by writing fresh (exclusive) content on your site for the most important products. This would give you the ability to test it out, unless there is some site-wide (e.g. Panda) issue going on.
As both an affiliate and a merchant, I've always found it best if each has their own content. For one thing, the affiliate site sits earlier in the funnel, so it makes sense that they wouldn't use the same message as the merchant site's product detail page, which is about as far into the funnel as you can get without being inside a shopping cart.
If you are unwilling to do this I think EGOL said it best:
"However, if your rankings are falling it could be competitors (and your good affiliates) are working harder than you."
If you really want to be seen as the authoritative version when there are multiple sites with the same content, the biggest factor in my experience is simply links. Domain authority plays a role too, but a couple of deep links into your product page will make all the difference. I presented some ways to get links into product and category pages at SMX West 2012 (link to presentation) and also wrote a blog post about it here on SEOmoz. Hopefully that will help you get started, but there's no easy, scalable way to do this that isn't a little bit on the gray side. To do it "right" in Google's eyes just takes a lot of elbow grease.
One last thing. As an affiliate, there is no way I would agree to putting a cross-domain rel=canonical or rel=author tag on my site that points to the merchant's site. You would lose any affiliate worth their salt that way.
To sum things up for Gavin, here are your two options as I see them, though they aren't mutually exclusive:
1. Rewrite your descriptions and either give the old descriptions to affiliates (e.g. keep a database with two different descriptions for every product) or stop giving descriptions to affiliates and make them write their own.
2. Build more external links into your product pages.
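As a rough sketch of option 1, the product database could hold two descriptions per product, with the affiliate feed exposing only the legacy copy. The field names and feed shape below are illustrative assumptions, not anything from the original setup:

```python
# Sketch of option 1: keep two descriptions per product and feed affiliates
# only the legacy copy. Field names here are hypothetical.
def build_affiliate_feed(products):
    """Return feed rows that expose only the legacy description."""
    return [
        {
            "sku": p["sku"],
            "title": p["title"],
            "price": p["price"],
            # Affiliates receive the old copy; the fresh copy stays on-site.
            "description": p["legacy_description"],
        }
        for p in products
    ]

products = [
    {
        "sku": "SKU-001",
        "title": "Example Widget",
        "price": "19.99",
        "legacy_description": "Original copy now syndicated to affiliates.",
        "site_description": "Fresh, exclusive copy shown only on our own store.",
    },
]

feed = build_affiliate_feed(products)
```

The same idea works directly at the database level (two description columns and a feed query that selects only the legacy one); the key point is that the copy on your own product pages never leaves your site.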
-
I wouldn't do that, because it would take traffic away from the affiliate, and the main site may not generate enough extra income to make up the shortfall.
There is no point in ranking higher than the affiliate if you destroy the affiliate's business but don't recapture all of that revenue yourself.
This is the age-old sales problem. Many companies, when they have a great salesman who always exceeds his targets, change the quotas to make it harder for him to hit his number, so that he works harder, the company makes more, and he makes less. Eventually, the salesman stops working so hard and starts looking for another job.
Good salesmen and good affiliates are definitely not a dime-a-dozen. Look after them.
-
"Whenever I type one of our 'uniquely written' product descriptions into Google, the affiliate website appears higher than ours suggesting Google has ranked them the authority."
You are lucky to have a powerful affiliate selling your merchandise instead of your competitor's merchandise.
I am an affiliate of a couple of programs and my site always ranks above the program's site. This is good for me and good for them, because I can defeat competitors that they cannot.
There are a few issues to think about related to the duplicate content.....
-
The affiliate might rank above you for quotes from the descriptions, but how do they rank for important keywords with high search volumes and conversion rates? I would guess that is where the money is being made. If you are above them there, then there's not much to worry about. But if they are outranking you there, then they are an important rainmaker for your business.
-
This duplicate content could be causing Panda problems for your site, especially if many other affiliates are using it. Some of the sites publishing it are likely to be demoted in the Google rankings. However, if your rankings are falling it could be competitors (and your good affiliates) are working harder than you. Consider how much you have invested in earning good rankings... if not a lot, then your affiliates are fighting that battle for you.
-
The affiliate programs that I sell for have enjoyed the sales that I have produced for over ten years. We have a great relationship, and they are fairly confident that I am not going to leave them for a competitor program. They have the best product in the niche and they pay me well. I am one of many affiliates who have been with their company for a long time. So, their attitude is... let the affiliates do the SEO and the PPC... that is what they are good at. We are good at making a great product and servicing customers... this is win-win.
They know something else that is very important. They know that they are THE BRAND and lots of the customers that I refer may make purchases in the future that no commissions are paid on. That is where they win big time.
So, keep your affiliates happy. Over the long term they will be responsible for a LOT of your best repeat customers (if you have a repeat type of business and treat the customer right).
Good affiliates are also really smart about SEO and converting customers. They might know more than you. So, if you ask them to add a rel=canonical or some other trick that works to your advantage, they might jump to another program or simply become retailers instead of affiliates. I have done that a couple of times.
-
-
You can add a rel=author link from the affiliate site to the original content. This recent post has some information that can help: http://www.seomoz.org/blog/authorship-google-plus-link-building
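For concreteness, markup along those lines on the affiliate's product page might look something like this; the URLs are placeholders, not real addresses from the thread:

```html
<!-- Hypothetical markup on the affiliate's product page; URLs are placeholders. -->
<!-- rel=author pointing back at the merchant's original page: -->
<link rel="author" href="https://merchant.example.com/products/example-widget">

<!-- A cross-domain rel=canonical would look like this, but as noted earlier in
     the thread, most affiliates will refuse it because it hands their ranking
     signals to the merchant: -->
<link rel="canonical" href="https://merchant.example.com/products/example-widget">
```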