Duplicate product description ranking problems (off-site duplicate content)
-
We do business in a niche category, outside the English-language market. We have 2-3 main competitors who use the same product information as us, so they all have the same duplicate product descriptions that we do. We and one competitor have the highest-authority domains in this market; they may have a 10-20% better link profile (counting linking domains and total links).
The problem is that they rank much better for product names than we do (with the same duplicate product descriptions and almost the same level of internal optimisation), and they haven't done any extra link building for products. The manufacturers' websites aren't the problem, because those don't rank well for product-name keywords. Most of our new products and some old ones go to the Supplemental Results and are shown under "In order to show you the most relevant results, we have omitted some entries very similar to the ... already displayed. If you like, you can repeat the search with the omitted results included.".
Unique text for products isn't an option. When we have written unique content for a product, it seems to rank much better. So our question is: what can we do externally to help our duplicate-description products rank better than our main competitor's, without writing unique text?
How important is indexation time? Will getting indexed first give a big advantage? We have thought about using more RSS/ping services to get indexed faster (both sites get the product information at almost the same time). It seems our competitor gets into the index quicker than we do.
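As a rough illustration of what "more RSS/ping services" usually means in practice (a minimal sketch with hypothetical URLs; the Google and Bing sitemap ping endpoints shown here existed historically but have since been deprecated, so treat them as examples of the pattern rather than current advice):

```python
# Sketch: build "ping" URLs that notify search engines a sitemap has changed.
# The endpoints below are the historical Google/Bing ping URLs and are
# assumptions for illustration, not a guarantee they still work.
from urllib.parse import quote

PING_ENDPOINTS = [
    "https://www.google.com/ping?sitemap={}",
    "https://www.bing.com/ping?sitemap={}",
]

def build_ping_urls(sitemap_url):
    """Return the ping URLs to request after the sitemap is updated."""
    # The sitemap URL is percent-encoded so it survives as a query parameter.
    return [endpoint.format(quote(sitemap_url, safe="")) for endpoint in PING_ENDPOINTS]

# To actually send the pings, request each URL after publishing new products:
# import urllib.request
# for url in build_ping_urls("https://www.example.com/sitemap.xml"):
#     urllib.request.urlopen(url)
```

The same idea applies to any indexing-notification service: regenerate the sitemap when products launch, then notify the engines immediately rather than waiting for a recrawl.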
Also, are farm pages helpful for getting some quick, low-value links for new products? We have planned to make 2-3 domains with a few links pointing to these new products, to gain a little advantage right after products are launched, before they have any external links.
The sitemap works, and our new products are shown on the front pages (products that still mostly don't rank well and go to the Supplemental Results). Some new products have #1 or top-3 rankings, but these are only maybe 1/3 of those that should have top-3 rankings.
We have also noticed that when we get products indexed quickly (for example via Fetch as Google), they get good top-3 results at first, and then some drop out of the rankings (into the Supplemental Results).
-
There's no easy answer, I'm afraid, and if an answer looks too easy, I'd stay away from it. Building low-quality links might help in the short term, but it's too high-risk in the long term. Plus, if you're combining it with duplicate content, you've got multiple quality issues in play (at least, in Google's eyes - I'm not making a judgment call about using stock product descriptions, which is very common).
You say that unique text is proven to have worked, and yet it isn't an option. Why? If it's a matter of time/cost, I'd strongly consider not only the long-term ROI but the possibility of investing selectively. For example, you don't have to write unique text for every product you sell (or re-sell) - you could pick the top 10% of products (which may account for 90% of sales) and start with those. Even the top 1% would be a start. Small investments in the right places could yield large returns here.
The other option, which people don't like to hear but which really is powerful, is to focus your link equity more carefully on a smaller number of products. The more products you list, the more duplicates you have, and some of those products probably sell very poorly or have very poor profit margins. What if you focused your site architecture on 25% of the total products? Your authority would be more concentrated, and each page would be stronger relative to your competitors.
One easy win is to make sure you're not dealing with any internal duplicate content (product options pages, search filters, etc.). If you're compounding external duplication with internal duplication, it's only going to make all of your problems worse. The internal duplication is much easier to solve.
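For the internal side, the usual consolidation tool is a canonical link element on each variant URL pointing at the main product page. The URLs below are hypothetical, just to show the shape:

```html
<!-- Hypothetical filtered/option URL: /product-123?color=red&sort=price -->
<!-- Each variant page tells search engines which URL is the canonical one -->
<head>
  <link rel="canonical" href="https://www.example.com/product-123" />
</head>
```

With this in place, option pages and filter combinations stop competing with the main product page and their signals are consolidated onto one URL.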
-
Thank you for your answer. Comparing DA and PA, ours are a little bit better, 48 vs 49 (DA), and our front page PA is also better. But Open Site Explorer data (DA and PA) isn't really good for international markets like ours; Ahrefs captures the link profiles better here. Since there is such a small difference in backlinks, it's a little strange that they can get so much better results.
It's a small international market, so customer reviews aren't an option - nobody gives them here. We already have a reviews feature, but nobody submits anything.
So my main question is: what factors does Google look at when ranking the same duplicate products? We know they count DA, PA, etc., and as I understand it, also who gets indexed first. Does anybody know what else?
-
The reason your competitor is ranking better could be the value of their DA and PA. Without looking specifically, it would be hard to say. Google isn't going to show two pages that are exactly the same, which is why it says similar pages have been omitted.
I would not suggest using a link farm. This can only bring you disaster in the long run.
Have you thought about getting customer reviews on the page? Using a program that puts customer reviews on the page, so that they appear in the source code, is a good way to start taking the duplicate content out of the equation. You should also put some focus into building quality links. It isn't the quantity of links that you have, but the quality.
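To make reviews count for this, they need to be in the HTML the crawler receives, not injected client-side after the fact. A minimal sketch with made-up data, using schema.org microdata (the markup style is an illustration, not a specific product recommendation):

```html
<!-- Reviews rendered server-side so they appear in the page source -->
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Product</span>
  <div itemprop="review" itemscope itemtype="https://schema.org/Review">
    <span itemprop="author">A. Customer</span>
    <p itemprop="reviewBody">Works exactly as described.</p>
  </div>
</div>
```

Each review then adds unique on-page text to an otherwise duplicated product description, which is exactly the differentiation the thread is missing.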