How are they avoiding duplicate content?
-
One of the largest soccer stores in the USA runs a number of white-label sites for major partners such as Fox and ESPN. The effect of this is that they are creating duplicate content for their products (and even the overall site structure is very similar). Take a look at:
http://www.worldsoccershop.com/23147.html
http://www.foxsoccershop.com/23147.html
http://www.soccernetstore.com/23147.html
You can see that practically everything is the same, including:
-
product URL
-
product title
-
product description
My question is: why is Google not classing this as duplicate content? Have they coded for it in a certain way, or is there something I'm missing that is helping them achieve rankings for all three sites?
-
-
The answer is right in your question: "runs a number of whitelabel sites". It is largely down to the original publisher publishing the content first and getting indexed. From there, any time Googlebot stumbles across the same content, it will recognise that it has seen the content before and attribute the ranking to the original. This is something Google themselves covered last year (although more specifically for news at the time).
Unfortunately, duplicate content isn't simply "not shown" by the search engines (imagine how clean the SERPs would be if that were the case!); it is just ranked lower than the original publisher that Google is aware of. Occasionally you will get the odd page that ranks from a different domain, but that is usually because it is fresh content. I have seen this myself with my own content being aggregated by a large news site: they might outrank me for a day on one or two pieces, but my original URL comes out on top in the end.
-
They rank #1 for the relevant terms. It is very clear Google considers them the original source of the content and the other sites duplicates.
I don't have a crystal ball to see the future, but based on current information, the original source site is not suffering in any manner.
-
Interesting feedback - is worldsoccershop (the original source) likely to suffer any penalties as a result of the whitelabel sites carrying the duplicate content?
-
Hey
I just did a search for some phrases I found on one of their product pages, wrapping this long query in double quotes:
"Large graffiti print on front that illustrates the club's famous players and history. The traditional blue jersey has gold details including team badge, adidas logo and sponsor design"
The results returned show worldsoccershop.com in first and second place, so they seem to be the authority on this product description.
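Before relying on manual quoted-phrase searches like the one above, you can get a rough duplication score between two product descriptions with word shingles. A minimal sketch (the shingle size of 5 and the example rewrite are arbitrary choices of mine, not anything these sites do):

```python
def shingles(text, k=5):
    """Set of lowercase k-word shingles of a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=5):
    """Jaccard similarity between the shingle sets of two texts (0..1)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

desc = ("Large graffiti print on front that illustrates the club's famous "
        "players and history. The traditional blue jersey has gold details "
        "including team badge, adidas logo and sponsor design")
copy = desc  # a white-label site carrying the identical description
rewrite = ("Classic blue home shirt with gold trim, club badge and sponsor "
           "print, celebrating the team's most famous players")

print(jaccard(desc, copy))     # 1.0 - fully duplicated
print(jaccard(desc, rewrite))  # 0.0 - no 5-word run in common
```

A score near 1.0 suggests the description is effectively identical to the original and likely to end up in the same duplicate filter; a rewrite that drops the score towards zero has a chance of ranking on its own.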
I have a client setting up a store to take on some rather big boys like notonthehighstreet.com, and in this industry, where each product has several established competitors, the big authority stores seem to rank for the generic product descriptions with no real issues.
This is ultimately difficult for the smaller stores: while they have fewer resources, pages on my client's site that use these duplicate descriptions are simply filtered out of the results. We can see this filtering in action with very specific searches like the one above, where we get the 'we have filtered out similar results' message in the search results and, lo and behold, my client's results are among those filtered.
So, to answer your original question:
They have not 'coded' anything in a specific way, and there is nothing you are missing as such. They are just an authority site and as such are 'getting away with it', which, for the smaller players, kind of sucks. That said, only the worldsoccershop pages are returned, so the other sites could well be filtered out.
Still, as I am coaching our client, see this not as a problem but as an opportunity. By creating unique content, we can hopefully leapfrog other, more authoritative sites that are all returning exactly the same product description. While I don't expect us to reach 1st place, we can work towards the first page and out of that filter.
Duplicate content is a massive problem. On the site we are working on, there is one product description that Copyscape tells us appears on 300 other sites. Google wants to return rich result sets (some shops, some information, some pictures, etc.), not just ten copies of the same thing, so dare to be different and give them a reason to display your page.
Hope it helps
Marcus -
My question is, why is Google not classing this as duplicate content?
Why do you feel this content has not been flagged as duplicate content?
The reasonable search for these pages is "Barcelona soccer jersey". Only one of the three sites has results for this term in the top 50, and it holds the #1 and #2 positions. If this were not being treated as duplicate content, you would expect to find the other two sites listed on the first page of Google results as well.
The perfect search for the page (very long-tail and unrealistic) is "Barcelona 11/12 home soccer jersey". For this query, worldsoccershop.com ranks #1 and #3, foxsoccershop.com ranks #8 (a big drop considering the content is the same), and soccernetstore.com is not in the top 50 results.
The other two sites have clearly been identified as duplicate content or are otherwise being penalized quite severely.
Related Questions
-
Backup Server causing duplicate content flag?
Hi, Google is indexing pages from our backup server. Is this a duplicate content issue? There are essentially two versions of our entire domain indexed by Google. How do people typically handle this? Any thoughts are appreciated. Thanks, Yael
-
Can I use duplicate content in different US cities without hurting SEO?
So, I have major concerns with this plan. My company has hundreds of facilities located all over the country. Each facility has its own website. We have a third-party company working to build a content strategy for us. What they came up with is to create a bank of content specific to each service line. If/when any facility offers that service, they upload the content for that service line to that facility's website. So in theory, you might have 10-12 websites, all in different cities, with the same content for a service. They claim: "Google is smart, it knows this content is all from the same company, and because it's in different local markets, it will still rank." My contention is that duplicate content is duplicate content, and unless it is localized, Google is going to prioritize one page of it and the rest will get very little exposure in the rankings, no matter where you are. I could be wrong, but I want to be sure we aren't shooting ourselves in the foot with this strategy, because it is a major undertaking and too important to go off in the wrong direction. SEO experts, your help is genuinely appreciated!
-
Duplicate content on multi-language / regional websites
Hi guys, I know this question has been asked a lot, but I wanted to double-check this since I just read a comment by Gianluca Fiorelli (https://moz.com/community/q/can-we-publish-duplicate-content-on-multi-regional-website-blogs) about this topic which made me doubt my research. The case: a Dutch website (.nl) wants a .be version for conversion reasons. They want to duplicate the Dutch website since Dutch is spoken in large parts of both countries. They are willing to implement the following changes: hreflang tags; possibly a local phone number; possibly a local translation of the menu; a language meta tag (for Bing). Optionally, they are willing to take the following steps: cross-linking every page through a language flag or similar navigation in the header; investing in gaining local .be backlinks; changing the server location of both websites to match their country (this isn't necessary in my opinion, since the ccTLD should make it irrelevant). The content on the websites will be at least 95% duplicated. They would like to rank with the .be version in Belgium and the .nl version in the Netherlands. Are these steps enough to make sure the .be site gets shown for queries from Belgium and the .nl site for queries from the Netherlands? Or would this cause a duplicate content issue, resulting in Google filtering out one version? If that's the case, we should use the canonical tag, and then we can't rank the .be version of the website. Note: this company is looking for a quick conversion rate win. They won't invest in rewriting every page and/or blog post; the less effort they have to put into this, the better (I know that's cursing when talking about SEO). Gaining local backlinks, for example, would bring a lot of costs with it. I would love to hear from you guys. Best regards, Bob van Biezen
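For what it's worth, the hreflang part of the setup described above can be sketched like this. This is a minimal illustration with hypothetical example.nl/example.be URLs; the key detail is that every page variant must list all variants, including itself:

```python
def hreflang_tags(variants):
    """Build the <link rel="alternate" hreflang="..."> tags that each
    page variant should carry in its <head>. Every variant lists ALL
    variants, including itself."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in variants.items()
    )

# Hypothetical URLs for the Dutch/Belgian pair described above.
page_variants = {
    "nl-nl": "https://www.example.nl/schoenen/",
    "nl-be": "https://www.example.be/schoenen/",
}
print(hreflang_tags(page_variants))
```

Both pages carry the same two tags, which tells Google the pages are deliberate regional alternates rather than accidental duplicates.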
-
Pagination causing duplicate content problems
Hi, the pagination on our website www.offonhols.com is causing duplicate content problems. Is the best solution adding rel="prev" / rel="next" to the hrefs? At the moment the pagination links at the bottom of the page are just:
http://offonhols.com/default.aspx?dp=1
http://offonhols.com/default.aspx?dp=2
http://offonhols.com/default.aspx?dp=3
etc.
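As a sketch of what those rel="prev"/rel="next" tags would look like for a ?dp=N series (a hypothetical helper of mine, assuming 1-based page numbers; the tags belong in each page's <head>):

```python
def pagination_links(base, page, last):
    """rel="prev"/"next" <head> tags for page `page` of a ?dp=N series."""
    tags = []
    if page > 1:  # every page except the first points back
        tags.append(f'<link rel="prev" href="{base}?dp={page - 1}" />')
    if page < last:  # every page except the last points forward
        tags.append(f'<link rel="next" href="{base}?dp={page + 1}" />')
    return "\n".join(tags)

# Page 2 of 3 gets both a prev and a next tag:
print(pagination_links("http://offonhols.com/default.aspx", 2, 3))
```

The first and last pages each emit only one tag, which is what marks the boundaries of the series.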
Does Google View "SRC", "HREF", TITLE and Alt tags as Duplicate Content on Home Page Slider?
Greetings Moz community. A keyword matrix was developed by my SEO firm. I am in the process of integrating primary, secondary and tertiary phrases into the text and am also sprinkling in three or four other terms. Using a keyword density tool (http://www.webconfs.com/keyword-density-checker.php), the results were somewhat unexpected after I optimized. So I then looked at the source code and noticed text from href, alt and src attributes that may be affecting how Google interprets the text on the page. Our home page (www.nyc-officespace-leader.com) contains a slider with commercial real estate listings. Would Google index the src, href, title and alt attributes in these slider items? Would this be detrimental to SEO? The code for one listing (and there are 7-8 in the slider) looks roughly like this: <a href="http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf" title="Lease a Prestigious Fifth Avenue Office - Manhattan, New York">Class A Fifth Avenue Offices</a> <a href="http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf" title="Lease a Prestigious Fifth Avenue Office - Manhattan, New York"><img src="http://dr0nu3l9a17ym.cloudfront.net/wp-content/uploads/fsrep/houses/125x100/305.jpg" alt="Lease a Prestigious Fifth Avenue Office - Manhattan, New York" width="125" height="94" /></a> 1,340 Sq. Ft. $5,918 / month Fifth Avenue Midtown / Grand Central. Could the repetition of the title text ("Lease a Prestigious Fifth...") trigger a duplicate content penalty? Should the slider content be blocked or set to no-index by some kind of JavaScript? We have worked very hard to optimize the home page, so it would be a real shame if through some technical oversight we got hit by a Google Panda penalty. Thanks, Alan
-
Partial duplicate content and canonical tags
Hi - I am rebuilding a consumer website, and each product page will contain a unique product image and a sentence or two about the product (we tend to use a lot of the same words in different ways across products). I'd like to have a tabbed area below the product info that talks about the overall product line; this content would be duplicated across all the product pages (a "Why use our products" type of thing). I'd also have this duplicate content living on its own URLs so it can be found on its own in the SERPs. The question is: do I need to add the canonical tag to this page, since there's partial duplicate content on the product pages? And if I did that, would my product pages go un-indexed? I understand how to handle completely duplicated content; it's the partial duplicate that I'm having difficulty figuring out.
-
How to avoid content cannibalization? How do I control which page is the landing page?
Hi all, to clarify my question I will give an example. Let's assume I have a laptop e-commerce site and that one of my main categories is Samsung laptops. The category page shows lots of laptops and a small section of text. On the other hand, in my article section I have a HUGE article about Samsung laptops. If we consider the two-word phrase each page is targeting, the answer is the same: Samsung laptops. In the article I point to the category page using anchors such as "buy samsung laptops" or "samsung laptops", and on the category page (my intended landing page) I point to the article with "learn about samsung laptops" or "samsung laptops pros and cons". Thanks
-
Fixing Duplicate Content Errors
SEOmoz Pro is showing some duplicate content errors, and I wondered the best way to fix them other than rewriting the content. Should I just remove the pages flagged, or should I set up permanent redirects to the home page in case there is any link value or there are visitors on these duplicate pages? Thanks.