How best to handle (legitimate) duplicate content?
-
Hi everyone, appreciate any thoughts on this. (bit long, sorry)
I'm working on 3 sites selling the same thing... the main difference between each site is physical location/target market area (think North, South, West as an example).
Now, say these 3 sites all sell Blue Widgets, and thus all on-page optimisation has been done for this keyword.
These 3 sites are now effectively duplicates of each other - well, the Blue Widgets page is at least - and whilst there are no 'errors' in Webmaster Tools, I'm pretty sure they ought to be ranking better than they are (good PA, DA, mR etc.).
The sites share the same template/look and feel too AND are accessed via the same IP - just for good measure.
So - to questions/thoughts.
1 - Is it enough to try and get creative with on-page changes to 'de-dupe' them? Kinda tricky with the Blue Widgets example - how many ways can you say that? I could focus on the geographical element a bit more, but I'd like to rank well for Blue Widgets generally.
2 - I could, I guess, noindex/nofollow the Blue Widgets page on 2 of the sites - seems a bit drastic though (or block them via robots.txt).
3 - I could even link (via internal navigation) sites 2 and 3 to site 1's Blue Widgets page and thus make 2 Blue Widgets pages redundant?
4 - Is there anything I could do HTML-wise to pull site 1's content into sites 2 and 3, without cloaking or anything nasty like that?
I think option 1 is the first thing to do. Anything else? Many thanks.
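For option 2, the noindex route is a one-line addition to the duplicate pages' `<head>` - a minimal sketch (the markup below is illustrative, not taken from the actual sites):

```html
<!-- On the Blue Widgets pages of sites 2 and 3: ask engines not to index or follow -->
<meta name="robots" content="noindex, nofollow">
```

The robots.txt alternative would instead disallow the path (e.g. `Disallow: /blue-widgets/`), but that only blocks crawling - it doesn't reliably remove URLs that are already indexed.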
-
I think your header links will look spammy.
Also, you're sharing out your PageRank to your duplicate sites! I would either remove the links or nofollow them (are the links of value to your visitors? If not, get rid!).
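For reference, nofollowing one of those header links is just an attribute on the anchor - a sketch with a made-up URL:

```html
<!-- Cross-site header link marked nofollow so it passes no PageRank -->
<a href="https://www.example-north.co.uk/" rel="nofollow">Our North branch</a>
```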
-
Great help here folks, thanks.
One last question if I may - each of the 3 sites links to the other 2 in the header (on every page), so I've got x00 cross-referencing links.
Any value in making them rel="nofollow"? I don't necessarily want to remove them.
-
IIS7 supports a type of mod_rewrite (the URL Rewrite module). But even if you can't use that, you should have access to ASP or .NET and can easily use those to do your 301s.
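As a sketch of the URL Rewrite route, a site-wide 301 lives in the old site's web.config; the domain name below is a placeholder:

```xml
<!-- web.config on an old site: 301 every request to the new domain -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Redirect to main site" stopProcessing="true">
          <match url="(.*)" />
          <action type="Redirect" url="https://www.example-main.com/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```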
-
IIS has no problems doing 301s, and if you can use PHP, ASP or anything similar you can just manually put a 301 on each page if that fails.
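The per-page approach in PHP is just a header sent before any output - a minimal sketch with a placeholder target URL:

```php
<?php
// Manual 301 at the very top of a duplicate page, before any HTML output
header('Location: https://www.example-main.com/blue-widgets', true, 301);
exit;
```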
No rel=canonical solution will result in all 3 sites ranking, as far as I am aware.
Your best option is usually one site with geo-located pages. If it has to be 3 sites, then the only real option is to make all that content unique, on unique IPs etc., which at the end of the day is 3x the work or more.
-
No problem, best of luck and let us know how you get on!
-
Thanks for all the replies everyone. Tricky, isn't it?
Moving to 1 site is probably the best medium/long-term option. The 3 sites thing is historical in that sites 2 and 3 were purchased (physically) by the owner over the last few years.
The biggest problem with a totally new site is that (afaik anyway, according to the hosting company) I can't 301 the old sites to the new site due to the shared hosting issue (using IIS as well, not Apache), so perhaps getting them split out is the proper interim measure. (I might be able to do something via WMTools with this though, I guess.)
I'll do some more research into cross-domain canonical use and attempt the on-page rewrite, as well as talking to the client about moving the sites to unique hosts.
Thanks again.
-
Why is it hard to restate the content in a different way? Reword it. If it's products, then change the order and write unique content at the bottom. By east/west/north/south, exactly what types of regions are you talking about, and why do you need three sites to accomplish this instead of one with geo-targeted LPs?
-
You can certainly use the canonical; however, you probably won't rank from domains 2 and 3, as you're telling Google not to attribute the content to those domains.
I'm still missing the bit where having three regionalised sites is beneficial to your visitors - why not make one general site with the products and then do some geo-targeted pages? (That's what I would do; it makes for a much simpler task.)
Best of luck whichever way you go, but come back and let us know what happens.
-
The benefit to the user is that they will need to visit the physical site to view/purchase and, as such, wouldn't click on, say, the North site (even if it was top 2 or 3) if they were in the South.
Are you (both) saying it'd be OK to put a rel=canonical link to domain1/page.html on domains 2 and 3? (i.e. different domain names)
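For what it's worth, the cross-domain tag being described sits in the `<head>` of the duplicate page on each of the other two domains - a sketch with placeholder domains:

```html
<!-- In the <head> of domain2/page.html and domain3/page.html -->
<link rel="canonical" href="https://www.domain1.com/page.html">
```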
Thanks.
-
How is this for good measure?
"Sites share the same template/look and feel too AND are accessed via same IP - just for good measure :)"
Make them as unique and separate as possible: different templates, different hosting, different email contacts, different contact info on the domain registrations, and content written on the page with geo-targeted wording.
-
What is the benefit to the user of individual sites for North, South and West?
Are you not just creating a lot of work for yourself, especially since, as you state, you "would like to rank well for Blue Widgets generally", which ultimately means each site is competing against the others?
I would rethink the strategy. You're more likely to rank 'generally' for your chosen terms if you focus your efforts on one site, and perhaps use canonical tags on the other two to ensure Google knows who to attribute the content to.
-
There aren't too many options here. Geo-targeted sites (even local ones) tend to produce duplicate content. The only option, really, is to canonical all your products to one place. If you do it right, you might be able to rank all three sites for your keyword.
You can try #1 but, as you said, it's hard to restate the same content in a non-duplicated way.