Basically duplicate sites that act like they're two different businesses. How do they not get dinged?
-
I bought supplies recently at barcodesinc.com. While searching, I noticed it is clearly the same site as barcodediscount.com. How do they not get hurt by duplicate content?
-
Duplicate content is no longer a penalty in itself - Google will simply decide which of the sites is the original source and list that source in the search results.
Sometimes, however, the site that actually copied the content gets listed first, and the original source is pushed down in the results.
-
A lot of affiliate sites sell the same stuff, but the sites themselves aren't really the same.
Each product has a unique description (as far as I've looked), and realistically that's the only way affiliate sites differentiate themselves.
I agree it's possible that these sites may be run by the same person or company, but a quick look through both tells me they are pretty much legit.
Shopping sites like this seem to do quite well even when using the manufacturer's description. If the rest of their SEO is good, then I'm guessing Google understands the type of sites they are and doesn't penalise them harshly. Remember, duplicate content isn't as big a problem as it was once made out to be.
Related Questions
-
What is the process for allowing someone to publish a blog post on another site? (duplicate content issue?)
I have a client who allowed a related business to use a blog post from my client's site, reposted to the related business's site. The problem is the post was copied word for word. There is an introduction and a link back to the website, but not to the post itself. I now manage the related business as well, so I have creative control over both websites as well as SEO duties. What is the best practice for this type of blog post syndication? Can the content appear on both sites?
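A minimal sketch of the usual fix for syndicated posts: a cross-domain rel=canonical on the republished copy, pointing back at the original post (the URLs here are hypothetical placeholders):
<!-- in the <head> of the republished post on the related business's site -->
<link rel="canonical" href="https://original-clients-site.example.com/blog/original-post/" />
This tells Google which version to treat as the original, so the content can appear on both sites without the copy competing with the source.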
-
Blog page won't get indexed
Hi Guys, I've currently been asked to work on a website, and I noticed that the blog posts won't get indexed in Google. www.domain.com/blog does get indexed, but the blog posts themselves won't. They have been online for over 2 months now. I found this in the robots.txt file:
Allow: /
Disallow: /kitchenhandle/
Disallow: /blog/comments/
Disallow: /blog/author/
Disallow: /blog/homepage/feed/
I'm guessing that the last line causes this issue. Does anyone have an idea if this is the case, and why they would include this in the robots.txt? Cheers!
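For reference, a Disallow rule blocks any URL whose path starts with the given string, so as written none of these lines should block an individual post - the last line only blocks the feed URL itself. A sketch of the difference (the post URL is a hypothetical example):
# blocks only /blog/homepage/feed/ and anything under it:
Disallow: /blog/homepage/feed/
# this, by contrast, would block every post such as /blog/my-post/:
Disallow: /blog/
If robots.txt isn't the culprit, a meta robots noindex tag in the post template is the next thing worth checking.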
-
SEMRush's Site Audit Tool "SEO Ideas"
Recently SEMRush added a feature to its site audit tool called "SEO Ideas." In the case of the specific site I'm looking at, its ideas consist mostly of suggesting words to add to the page for the page/my phrase(s) to perform better. It suggests this even when the term(s) or phrase(s) it's looking at are already #1. Has anybody used this tool, or something similar, and found it to be valuable, and if so, how valuable? The reason I ask is that it would be a fair amount of work to go through these pages and find ways to add the selected words and phrases, and, frankly, it feels kind of 2005 to me. Your thoughts? Thanks... Darcy
-
Why Can't I Get on Google?
I've employed many of the suggestions of SEOMoz and am getting a Grade "A" on a particular keyword. I'm now #4 on Yahoo and Bing. However, my site hasn't cracked the top 50 in Google. Why? I see a similar pattern with other keywords: many rank on Yahoo and Bing, but only a few of my subpages reach #45-48 on Google. Any ideas? http://www.gospelebooks.net
-
Geotargeting duplicate content to different regions - hreflang and canonical tag confusion
Say you duplicate content onto a sub-folder for a new US-geotargeted site (to target keyword spelling differences) and, in addition to the GWT geotargeting settings, implement the canonical and hreflang tags on these new pages to show Google the different region and language version (en-us). Do the similar pages on the original/main site then also need canonical and hreflang tags? I don't really want to target a specific country with the main site's pages (although existing signals such as hosting will be UK, the primary target of the main site), but those pages show up in other countries' searches too, which we want. I'm presuming it's fine to leave the original/main site as it currently is, although the wording in Google blog/Webmaster Central articles is a bit confusing, hence why I'm asking for anyone else's opinion/input on this. Also, is there any benefit (or is it just best practice) to using 'www.example.com/en-us/...' in the subdirectory URL as opposed to just 'www.example.com/us/'? Many thanks in advance to any commentators 🙂
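A minimal sketch of how the annotations are usually set up, with hypothetical URLs: hreflang annotations need to be reciprocal, so the original pages would carry the tags too, and each page keeps a self-referencing canonical so the hreflang isn't ignored:
<!-- on the main/original (UK-targeted) page -->
<link rel="canonical" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/page/" />
<!-- on the US page -->
<link rel="canonical" href="https://www.example.com/en-us/page/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/page/" />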
-
Can't get my head around this duplicate content dilemma!
Hi, let's say you have a cleaning company with a services page covering window cleaning, carpet cleaning, etc., and the content on this page adds up to around 750 words. Now let's say you would like to create new pages targeting location-specific keywords in your area. The easiest way would be to copy the services page and just change all the tags to the location-specific term, but now you have duplicate content. If I wanted to target 10 locations, does this mean I need to generate 750 words of unique content for each page, which is basically the services page rewritten? Cheers
-
What's the best way to eliminate duplicate page content caused by blog archives?
I (obviously) can't delete the archived pages, regardless of how much traffic they do/don't receive. Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck with using a meta robots tag, correct? Any other suggestions to alleviate this pesky duplicate page content issue?
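For reference, a minimal sketch of the meta robots option, which works without root-directory access because it lives in the archive page template rather than in a file at the domain root:
<!-- in the <head> of each archive page -->
<meta name="robots" content="noindex, follow" />
The noindex keeps archive pages out of the index, while the follow still lets crawlers pass through their links to the individual posts.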
-
Purchasing a site for a 301 Redirect
Hi Mozzers, I have a question regarding a tactic I'm considering for a client. My client has a web hosting company, is ranking well for his keywords, and sits in position 3 for his main term. There is a site available on Flippa with a keyword-rich domain, a decent link portfolio, and good domain authority, and the price is attractive. I'm considering buying it to 301 it to his domain, but I've never used this tactic before. Is this grey/black hat? Has anyone done this before, and to what extent did it work? Thanks, Bush
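Setting aside whether the tactic is safe - redirects from domains bought purely for their links can be discounted or worse - a minimal sketch of the redirect itself on an Apache server, with a hypothetical target domain:
# .htaccess on the purchased domain: send every URL to the client's site with a 301
RewriteEngine On
RewriteRule ^(.*)$ https://clients-hosting-site.example.com/$1 [R=301,L]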