Duplicate Content, Same Company?
-
Hello Moz Community,
I am doing work for a company and they have multiple locations.
For example, examplenewyork.com, examplesanfrancisco.com, etc.
They also have the same content on certain pages within each website.
For example, examplenewyork.com/page-a has the same content as examplesanfrancisco.com/page-a
Does this duplicate content negatively impact us? Or could each page rank for its own location (for example, people in New York searching for page-a would see examplenewyork.com/page-a, and people in San Francisco would see examplesanfrancisco.com/page-a)?
I hope this is clear.
Thanks,
Cole
-
Thanks all.
-
Sorry, I lost track of the fact that you were talking about dupe content on multiple domains, vs. on the same domain. The same logic basically applies. However, when you're talking about essentially duplicating entire domains registered to the same owner, there can be somewhat more of a risk that the original content gets discounted (or in such cases, penalized) along with the duplicate.
If you have a main site that seems to be doing OK in the search results, you may want to consider keeping that domain and its content, while eliminating/redirecting the other domains and revising their content for use on the domain you're keeping.
-
Chris makes a fantastic point here.
You almost need to detach "what's reasonable" from what Google wants sometimes. Chris is right - why shouldn't those two pages have the same content? But we're dealing with algorithms mainly, not reasoning.
-
Cole,
I'm going to say roughly the same thing as the soon-to-be-guru Tom but give you somewhat of a different spin on it.
It's completely understandable that anyone with a website would feel that the content applicable to one city would also apply to another city, so what's the harm in just switching out the city names? There shouldn't be, really, and in most cases there is no actual harm in it.
However, while Google's search engine makes it possible for customers in multiple cities to seek out and find content you've "tailored" to them, it also makes it possible for other marketers to do the same as you've done--so competition for keywords increases dramatically. On a small scale, Google doesn't want to penalize a whole site for such practices, per se, but it does want to differentiate original content from duplicates of it, so that it can rank the original while discounting the duplicates.
To get around this "hurdle" you have to treat each of your pages as unique entities with unique values to each of your target markets. That way, content for each page ends up being unique and Google's algorithm can prioritize all the competitors' pages uniformly according to how relevant and valuable they are to the target audience.
-
Hey Cole
-
The more you do change, the less risk involved. Some might tell you that if you change the content enough to pass "copyscape" or other online plagiarism tools, that would protect you from a penalty. I find that to be slightly ridiculous - why would Google judge by those external standards? The more you can change, the better in my opinion (but I can totally sympathise with the work that entails)
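To put a rough number on "how much is enough," you can compare two pages' copy with a simple similarity ratio. This is purely illustrative — it's nothing Google actually uses, just a quick way to see that swapping only the city name leaves the pages nearly identical:

```python
import difflib

def copy_similarity(a: str, b: str) -> float:
    """Rough 0.0-1.0 ratio of how similar two blocks of page copy are,
    compared word by word (case-insensitive)."""
    return difflib.SequenceMatcher(None, a.lower().split(), b.lower().split()).ratio()

# Hypothetical city-swapped copy, in the spirit of the thread's example:
ny = "We offer award-winning widget repair in New York with same-day service."
sf = "We offer award-winning widget repair in San Francisco with same-day service."
print(round(copy_similarity(ny, sf), 2))  # prints a ratio close to 1.0 for near-duplicate copy
```

If two "different" location pages score anywhere near 1.0, a human (or an algorithm) would have a hard time calling them unique — which is exactly the point above about rewriting rather than find-and-replacing.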
-
Google will know you own the websites if you link them together, share GA code, host them together, include the same company details and so on - but my question is why would you want to do that? I think if you tried to tell Google you owned all the sites, they could come after you even harder, as they could see it as you being manipulative.
To that point, others will recommend that you only use one domain and target different KWs or locations on different pages/subfolders/subdomains, as it'll look less like a link network. Downside of that is getting Google local listings for each page/location can be a bit of a pain if the pages all come from one domain.
It's not really my place to comment on your strategy and what you should/should not be doing, but suffice to say if you go with individual domains for each location, you should aim to make those domains (and their copy) as unique and independent as possible.
-
-
Hey Tom,
The keywords we are competing for aren't very competitive.
Two follow up questions:
1.) To what extent should we change the content? For example, is it a matter of a few words (location-based), or is it more a matter of altering all of the content on the page? I guess my question deals with the scope of the content change.
2.) Is there a way to let Google know we own all the websites? I had hreflang in mind here. This may not be possible; I just wanted to ask.
Tom, thanks so much for your help.
Cole
-
Hi Cole
That kind of duplication will almost certainly negatively impact your ability to rank.
It's the kind of dupe content that Google hates - the kind that's deliberately manipulative and used by sites just trying to rank for as many different KWs or locations as possible, without trying to give people a unique user experience.
Not to say that you couldn't possibly rank like this (I've seen it happen and will probably see it again in the future), but you're leaving yourself wide open to a Panda penalty and, as such, I'd highly recommend tailoring each site and each landing page to its particular audience. By doing that, not only will you make the content unique, but you'll also dramatically improve your chances of ranking by mentioning local things on each local page.
Give each page unique copy and really tailor it to your local audience.
Hope this helps.