Competitors and Duplicate Content
-
I'm curious to get people's opinion on this.
One of our clients (Company A) has a competitor that's using duplicate sites to rank. They're using "www.companyA.com" and "www.CompanyAIndustryTown.com" (actually, several such variations). It's basically duplicate content, with maybe a town name inserted or changed somewhere on the page. I was always told that this is not a wise idea. They started doing this in the past month or so, when they had a site redesign. So far, it's working pretty well for them. So, here are my questions:
-Would you address this directly (report to Google, etc.)?
-Would you ignore this?
-Do you think it's going to backfire soon?
There's another company (Company B) that's using another practice: using separate pages on their domain to address different towns, and using those as landing pages. Similar, in that a lot of the content is the same, with just some town names and minor details changed. All on the same domain, though. Would the same apply to that?
Thanks for your insight!
-
The only long lasting way to rank for local specific pages is to offer truly unique content on those pages, and build unique links to those pages.
The two methods you mentioned here, using near-duplicate sites and pages, may work for a short time or in non-competitive niches. They may also work somewhat if a very strong link profile is backing them up... but in general these sorts of tricks usually result in a drop in rankings, if not now, then during an upcoming algorithm change.
Oftentimes, misguided webmasters think they are doing the right thing in launching these sites and pages, with no ill intent. Unless the pages are obviously spam or doorway pages, in my opinion it's probably not worth reporting them to Google, but that decision is of course best left to each individual.
Read more about doorway pages: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Consider how Yelp has hundreds of pages about dentists, at least one page for every major city in America. Although the pages are similar, each is filled with unique content and has unique links pointing to it. Each delivers a similar message, but provides unique value based on that particular location.
Add unique value to each location specific page, and you're doing great.
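To put a rough number on how "near duplicate" these town pages typically are, here is a minimal sketch using Python's standard-library difflib (the page text and town names are hypothetical, not taken from the sites in question):

```python
from difflib import SequenceMatcher

# Hypothetical "city page" text; the only difference between the two
# pages is the town name, mirroring the pattern described above.
page_a = ("We are the leading pest control experts in Springfield. "
          "Our Springfield technicians handle ants, roaches, and more.")
page_b = page_a.replace("Springfield", "Riverton")

# SequenceMatcher.ratio() returns a similarity score between 0.0 and 1.0.
similarity = SequenceMatcher(None, page_a, page_b).ratio()
print(f"Similarity: {similarity:.2f}")
```

Pages scoring that close to 1.0 give a search engine little reason to show more than one of them, which is why genuinely unique content per location is the only durable approach.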
-
Unfortunately, this isn't a method likely to work.
Most of the time, if you insert canonical tags on near-similar pages, and Google interprets those canonicals correctly, it tends to index and rank only the page that the canonical points to. All of the other pages would then have little or no search engine visibility whatsoever.
Not a good technique if you're trying to rank individual pages.
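For reference, a canonical tag is a single line in the `<head>` of the near-duplicate page pointing at the version you want indexed (the URLs below are hypothetical examples, not real pages):

```html
<!-- On the near-duplicate page, e.g. http://www.example.com/pest-control-riverton/ -->
<head>
  <!-- Tells search engines to consolidate indexing and ranking signals
       to the page named in href, instead of this page. -->
  <link rel="canonical" href="http://www.example.com/pest-control/">
</head>
```

That consolidation is exactly the problem here: the canonicalized city pages themselves drop out of the results.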
-
So ARE you suggesting that for local city pages you add a canonical tag pointing to the home page?
I guess I'm a little confused on this, as Adam is.
Can you explain your thoughts behind this?
-
So let me clarify then: if they have (on the same domain) multiple pages with near-duplicate content, mostly changing the names of cities, but use rel=canonical, they will still have the SEO benefit of ranking for different towns, and it won't be seen as duplicate content?
And then the multiple-domain situation... that's just a wait and see.
-
Pages with city-specific information but otherwise similar content are pretty much the perfect place for a canonical tag. If you feel they haven't been penalized, then this is probably the method they are using for hosting the same content.
-
Here is an example of sites that have been using duplicate content with only a few words changed:
http://www.seomoz.org/q/duplicate-exact-match-domains-flagged-by-google-need-help-reinclusion
-
Having multiple sites with duplicate content is a bad idea, as it affects your search engine rankings. The company is likely using bad SEO practices, and Google's bots will soon pick this up and penalise the domains.
You can report it to Google, but in most cases Google catches sites that are using bad SEO techniques on its own.
There is no harm in using separate pages on a domain to address the different towns they operate in, as this helps the site get found for local searches. But if that content is again duplicated with only a few words changed, Google will pick this up.
Always remember: Content is KING!