Duplicate Content, Same Company?
-
Hello Moz Community,
I am doing work for a company and they have multiple locations.
For example, examplenewyork.com, examplesanfrancisco.com, etc.
They also have the same content on certain pages within each website.
For example, examplenewyork.com/page-a has the same content as examplesanfrancisco.com/page-a
Does this duplicate content negatively impact us? Or could we rank each page within its own location (for example, people in New York searching for page-a would see our New York page, and people in San Francisco searching for page-a would see our San Francisco page)?
I hope this is clear.
Thanks,
Cole
-
Thanks all.
-
Sorry, I lost track of the fact that you were talking about dupe content on multiple domains, vs. on the same domain. The same logic basically applies. However, when you're talking about essentially duplicating entire domains registered to the same owner, there can be somewhat more of a risk that the original content gets discounted (or in such cases, penalized) along with the duplicate.
If you have a main site that seems to be doing OK in the search results, you may consider keeping that domain and its content, while eliminating/redirecting the other domains and revising their content for use on the domain you're keeping.
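As a rough sketch of what that redirect could look like - assuming an Apache server with mod_rewrite enabled, and using the example domains from the question (adjust for your actual setup) - the .htaccess on the domain you're retiring might contain:

```apache
# Hypothetical example: permanently (301) redirect every URL on the
# retired domain to the same path on the domain you're keeping.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?examplesanfrancisco\.com$ [NC]
RewriteRule ^(.*)$ https://examplenewyork.com/$1 [R=301,L]
```

A path-preserving 301 like this passes most of the old domain's equity to the equivalent page on the surviving domain, rather than dumping everything on the homepage.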
-
Chris makes a fantastic point here.
You almost need to detach "what's reasonable" from what Google wants sometimes. Chris is right - why shouldn't those two pages have the same content? But we're dealing with algorithms mainly, not reasoning.
-
Cole,
I'm going to say roughly the same thing as the soon-to-be-guru Tom but give you somewhat of a different spin on it.
It's completely understandable that anyone with a website would feel that content applicable to one city would apply to another city as well - so what's the harm in just switching out the city names? There shouldn't be, really, and in most cases there is no actual harm in it.
However, while Google's search engine makes it possible for customers in multiple cities to seek out and find content you've "tailored" to them, it also makes it possible for other marketers to do the same as you've done - so competition for keywords increases dramatically. On a small scale, Google doesn't want to penalize a whole site for such practices, per se, but it does want to differentiate original content from duplicates of the original, so that it can rank the original while discounting the duplicates.
To get around this "hurdle" you have to treat each of your pages as unique entities with unique values to each of your target markets. That way, content for each page ends up being unique and Google's algorithm can prioritize all the competitors' pages uniformly according to how relevant and valuable they are to the target audience.
-
Hey Cole
-
The more you do change, the less risk involved. Some might tell you that if you change the content enough to pass Copyscape or other online plagiarism tools, that would protect you from a penalty. I find that slightly ridiculous - why would Google judge by those external standards? The more you can change, the better, in my opinion (though I can totally sympathise with the work that entails).
-
Google will know you own the websites if you link them together, share GA code, host them together, use the same company details and so on - but my question is: why would you want to do that? I think if you tried to tell Google you owned all the sites, they would come after you even harder, as they could see it as you being manipulative.
To that point, others will recommend that you only use one domain and target different KWs or locations on different pages/subfolders/subdomains, as it'll look less like a link network. Downside of that is getting Google local listings for each page/location can be a bit of a pain if the pages all come from one domain.
It's not really my place to comment on your strategy and what you should/should not be doing, but suffice to say if you go with individual domains for each location, you should aim to make those domains (and their copy) as unique and independent as possible.
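On your hreflang question specifically: for context, the markup looks like the sketch below (using your example domains), but it declares language/region variants of the same page - e.g. a US English versus a UK English version - not common ownership. Since all of your city sites target the same language and country, it doesn't really apply here and won't excuse the duplication:

```html
<!-- Illustrative only: each page lists all its language/region variants,
     including itself. This signals "same content, different locale" -
     it does NOT signal "same owner". -->
<link rel="alternate" hreflang="en-us" href="https://examplenewyork.com/page-a" />
<link rel="alternate" hreflang="en-gb" href="https://examplenewyork.com/uk/page-a" />
```

In other words, hreflang solves a localisation problem, not a duplicate-content-across-sister-sites problem.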
-
Hey Tom,
The keywords we are competing for aren't very competitive.
Two follow up questions:
1.) To what extent should we change the content? For example, is it just a matter of swapping out a few location-based words, or more a matter of rewriting the content on each page? I guess my question is about the scope of the content change.
2.) Is there a way to let Google know we own all the websites? I had hreflang in mind here. This may not be possible; I just wanted to ask.
Tom, thanks so much for your help.
Cole
-
Hi Cole
That kind of duplication will almost certainly negatively impact your ability to rank.
It's the kind of dupe content that Google hates - the kind that's deliberately manipulative and used by sites just trying to rank for as many different KWs or locations as possible, without trying to give people a unique user experience.
Not to say that you couldn't possibly rank like this (I've seen it happen and will probably see it again), but you're leaving yourself wide open to a Panda penalty, so I'd highly recommend tailoring each site and each landing page to its particular audience. By doing that you're not only making the content unique - mentioning local details on a local page will also dramatically improve your chances of ranking.
Give each page unique copy and really tailor it to your local audience.
Hope this helps.