Duplicate Content, Same Company?
-
Hello Moz Community,
I am doing work for a company and they have multiple locations.
For example, examplenewyork.com, examplesanfrancisco.com, etc.
They also have the same content on certain pages within each website.
For example, examplenewyork.com/page-a has the same content as examplesanfrancisco.com/page-a
Does this duplicate content negatively impact us? Or could we rank each page for its own location (for example, people in New York searching for page-a would see our New York page, and people in San Francisco searching for page-a would see our San Francisco page)?
I hope this is clear.
Thanks,
Cole
-
Thanks all.
-
Sorry, I lost track of the fact that you were talking about dupe content on multiple domains, vs. on the same domain. The same logic basically applies. However, when you're talking about essentially duplicating entire domains registered to the same owner, there can be somewhat more of a risk that the original content gets discounted (or in such cases, penalized) along with the duplicate.
If you have a main site that seems to be doing OK in the search results, you may consider keeping that domain and its content, while eliminating/redirecting the other domains and revising their content for use on the domain you're keeping.
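If you go the redirect route, a blanket 301 at the server level is the usual way to point each retired domain at the one you keep, preserving paths so deep links don't break. A rough sketch of what that could look like in an .htaccess file, assuming Apache hosting and using the example domains from the question (your setup may differ):

```apache
# .htaccess on examplesanfrancisco.com (the domain being retired)
# Send every URL to the same path on the domain being kept, as a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?examplesanfrancisco\.com$ [NC]
RewriteRule ^(.*)$ https://examplenewyork.com/$1 [R=301,L]
```

Because the rule captures the path, examplesanfrancisco.com/page-a would land on examplenewyork.com/page-a rather than just the homepage.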
-
Chris makes a fantastic point here.
You almost need to detach "what's reasonable" from what Google wants sometimes. Chris is right - why shouldn't those two pages have the same content? But we're dealing with algorithms mainly, not reasoning.
-
Cole,
I'm going to say roughly the same thing as the soon-to-be-guru Tom but give you somewhat of a different spin on it.
It's completely understandable that anyone with a website would feel that the content applicable to one city would also apply to another city, so what's the harm in just switching out the city names? There shouldn't really be, and in most cases there's no actual harm in it.
However, while Google's search engine makes it possible for customers in multiple cities to seek out and find content you've "tailored" to them, it also makes it possible for other marketers to do exactly what you've done, so competition for those keywords increases dramatically. On a small scale, Google doesn't want to penalize a whole site for such practices, per se, but it does want to differentiate original content from duplicates of it, so it can rank the original while discounting the duplicates.
To get around this "hurdle" you have to treat each of your pages as a unique entity with unique value for each of your target markets. That way, the content of each page ends up being unique, and Google's algorithm can rank all the competing pages consistently according to how relevant and valuable they are to the target audience.
-
Hey Cole
-
The more you do change, the less risk involved. Some might tell you that if you change the content enough to pass Copyscape or other online plagiarism tools, that would protect you from a penalty. I find that to be slightly ridiculous - why would Google judge by those external standards? The more you can change, the better, in my opinion (but I can totally sympathise with the work that entails).
-
Google will know you own the websites if you link them together, share GA code, host them together, include the same company details and so on - but my question is why would you want to do that? I think if you tried to tell Google you owned all the sites they would come at you even harder, as they could see it as you being manipulative.
To that point, others will recommend that you use only one domain and target different KWs or locations on different pages/subfolders/subdomains, as it'll look less like a link network. The downside of that is that getting Google local listings for each page/location can be a bit of a pain if the pages all come from one domain.
It's not really my place to comment on your strategy and what you should/should not be doing, but suffice to say if you go with individual domains for each location, you should aim to make those domains (and their copy) as unique and independent as possible.
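On the hreflang idea from your second question: hreflang is built for language/country variants of the same content, so its values are language codes (optionally with a country code, like en-us or en-gb) and there's no way to target individual cities with it. Just for reference, a minimal sketch of what the markup looks like (the UK domain below is made up purely for illustration):

```html
<!-- In the <head> of examplenewyork.com/page-a -->
<!-- Each page lists every language/country variant, including itself -->
<link rel="alternate" hreflang="en-us" href="https://examplenewyork.com/page-a" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/page-a" />
```

So it can tell Google "show this version to US English users and that one to UK English users", but it won't tell Google that two US city sites belong to the same company - which is why it isn't really the mechanism you're after here.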
-
Hey Tom,
The keywords we are competing for aren't very competitive.
Two follow up questions:
1.) To what extent should we change the content? For example, is it a matter of swapping a few words (location-based), or more a matter of altering the content on each page? I guess my question deals with the scope of the content change.
2.) Is there a way to let Google know we own all the websites? I had hreflang in mind here. This may not be possible; I just wanted to ask.
Tom, thanks so much for your help.
Cole
-
Hi Cole
That kind of duplication will almost certainly negatively impact your ability to rank.
It's the kind of dupe content that Google hates - the kind that's deliberately manipulative and used by sites just trying to rank for as many different KWs or locations as possible, without trying to give people a unique user experience.
Not to say that you couldn't possibly rank like this (I've seen it happen and will probably see it again in the future), but you're leaving yourself wide open to a Panda penalty and, as such, I'd highly recommend that you cater each site and each landing page to your particular audience. By doing that, not only will you make the content unique, but you'll also dramatically improve your chances of ranking, because you'll be mentioning local things on a local page.
Give each page unique copy and really tailor it to your local audience.
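To make that concrete, even the basic head elements of each location page should read like they were written for that city rather than copied with the city name swapped. A rough sketch, with made-up titles and descriptions just to illustrate (every name below is hypothetical):

```html
<!-- examplenewyork.com/page-a: written for the New York audience -->
<title>Page A Services in New York, NY | Example Co.</title>
<meta name="description" content="How our Page A service helps New York businesses, with examples from clients in Manhattan and Brooklyn.">

<!-- examplesanfrancisco.com/page-a: same service, but the copy speaks to San Francisco readers -->
<title>Page A Services in San Francisco, CA | Example Co.</title>
<meta name="description" content="How our Page A service helps Bay Area businesses, with examples from clients in SoMa and the East Bay.">
```

Carry the same idea through the body copy - local landmarks, local case studies, local contact details - so the two pages genuinely differ instead of just swapping the city name.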
Hope this helps.