Competitors and Duplicate Content
-
I'm curious to get people's opinion on this.
One of our clients (Company A) has a competitor that's using duplicate sites to rank. They're using "www.companyA.com" and "www.CompanyAIndustryTown.com" (actually, several variations). It's basically duplicate content, with maybe a town name inserted or changed somewhere on the page. I was always told that this is not a wise idea. They started doing this in the past month or so, when they had a site redesign, and so far it's working pretty well for them. So, here are my questions:
-Would you address this directly (report to Google, etc.)?
-Would you ignore this?
-Do you think it's going to backfire soon?
There's another company (Company B) that's using a different practice: separate pages on their own domain to address different towns, used as landing pages. It's similar, in that a lot of the content is the same, with just some town names and minor details changed, but it's all on the same domain. Would the same apply to that?
Thanks for your insight!
-
The only long-lasting way to rank for location-specific pages is to offer truly unique content on those pages and build unique links to them.
The two methods you mentioned here, using near-duplicate sites and pages, may work for a short time or in non-competitive niches. They may also work somewhat if a very strong link profile is backing them up... but in general, these sorts of tricks usually result in a drop in rankings, if not now, then during an upcoming algorithm change.
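As a rough way to gauge how "near-duplicate" two such pages really are, you can compare their text programmatically. Here's a minimal sketch in Python using the standard library; the page texts are hypothetical examples, and a real check would first strip the HTML markup:

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0..1 similarity ratio between two page texts."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Two hypothetical "city" pages that differ only in the town name --
# the kind of near-duplicate content discussed above.
page_springfield = "We offer plumbing services in Springfield with 24/7 support."
page_shelbyville = "We offer plumbing services in Shelbyville with 24/7 support."

# A ratio close to 1.0 means the pages are nearly identical.
print(round(similarity(page_springfield, page_shelbyville), 2))
```

If most of your location pages score very high against each other, that's a sign they need genuinely unique content rather than a swapped town name.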
Oftentimes, misguided webmasters think they are doing the right thing in launching these sites and pages, with no ill intent. Unless the pages are obviously spam or doorway pages, in my opinion it's probably not worth reporting them to Google, but that decision is of course best left to each individual.
Read more about doorway pages: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Consider how Yelp has hundreds of pages about dentists, at least one for every major city in America. Although the pages are similar, each is filled with unique content and has unique links pointing to it. Each delivers a similar message, but provides unique value based on that particular location.
Add unique value to each location-specific page, and you're doing great.
-
Unfortunately, this isn't a method likely to work.
Most of the time, if you insert canonical tags on near-identical pages, and Google interprets those canonicals correctly, it tends to index and rank only the page the canonical points to. All of the other pages would then have little or no search engine visibility whatsoever.
Not a good technique if you're trying to rank individual pages.
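For reference, a canonical tag is a single line in the page's `<head>`. A minimal sketch, using a hypothetical domain: the near-duplicate city page declares the main service page as canonical, which tells Google to consolidate indexing signals there (and is why the city page itself then tends not to rank):

```html
<!-- On www.example.com/plumbing-springfield (near-duplicate city page) -->
<link rel="canonical" href="http://www.example.com/plumbing" />
```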
-
So ARE you suggesting that for local city pages you add the canonical tag pointing to the home page?
I guess I'm a little confused on this, as Adam is.
Can you explain your thoughts behind this?
-
So let me clarify then: if they have multiple pages (on the same domain) with near-duplicate content, mostly just changing the names of cities, but use rel="canonical", will they still get the SEO benefit of ranking for different towns without it being seen as duplicate content?
And the multiple-domain situation... that's just wait and see?
-
Pages with city-specific information but otherwise similar content are pretty much the perfect use case for a canonical tag. If you feel they haven't been penalized, then this is probably the method they are using to host the same content.
-
Here is an example of sites that have been using duplicate content with only a few word changes:
http://www.seomoz.org/q/duplicate-exact-match-domains-flagged-by-google-need-help-reinclusion
-
Having multiple sites with duplicate content is a bad idea, as it hurts your search engine rankings. The company is likely using bad SEO practices, and Google's crawlers will soon pick this up and the domains will get penalised.
You can report them to Google, but in most cases Google catches sites using bad SEO techniques on its own.
There is no harm in using separate pages on a domain to show the different towns they operate in, as this helps the site get found in local searches. But if that content is again duplicated with only a few words changed, Google will pick this up too.
Always remember: content is KING!