How best to handle (legitimate) duplicate content?
-
Hi everyone, appreciate any thoughts on this. (bit long, sorry)
Am working on 3 sites selling the same thing...main difference between each site is physical location/target market area (think North, South, West as an example)
Now, say these 3 sites all sell Blue Widgets, and thus all on-page optimisation has been done for this keyword.
These 3 sites are now effectively duplicates of each other - well, the Blue Widgets page is at least - and whilst there are no 'errors' in Webmaster Tools, I'm pretty sure they ought to be ranking better than they are (good PA, DA, mR etc.)
Sites share the same template/look and feel too AND are accessed via same IP - just for good measure
So - to questions/thoughts.
1 - Is it enough to try and get creative with on-page changes to 'de-dupe' them? Kinda tricky with the Blue Widgets example - how many ways can you say that? I could focus on the geographical element a bit more, but I'd like to rank well for Blue Widgets generally.
2 - I could, I guess, noindex/nofollow the Blue Widgets page on 2 of the sites - seems a bit drastic though (or block them via robots.txt).
3 - I could even link (via internal navigation) sites 2 and 3 to site 1's Blue Widgets page and thus make 2 of the Blue Widgets pages redundant?
4 - Is there anything HTML-wise I could do to pull site 1's content into sites 2 and 3, without cloaking or anything nasty like that?
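For option 2, the noindex/nofollow version would just be a meta tag in the <head> of the Blue Widgets page on sites 2 and 3 - something like this sketch, I think:

```html
<!-- Tells search engines not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```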
I think 1 is the first thing to do. Anything else? Many thanks.
-
I think your header links will look spammy.
Also, you're sharing out your PageRank to your duplicate sites! I would either remove the links or nofollow them (are the links of value to your visitors? If not, get rid!).
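For what it's worth, nofollowing one of those header links would look something like this (URLs are made up for the example):

```html
<!-- Header link from site 1 to site 2, with nofollow so PageRank isn't passed -->
<a href="http://www.south-site-example.co.uk/" rel="nofollow">Our South branch</a>
```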
-
Great help here folks, thanks.
One last question if I may - each of the 3 sites links to the other 2 in the header (on every page), so I've got x00 cross-referencing links.
Any value in making them rel=nofollow? I don't necessarily want to remove them.
-
IIS7 supports a type of mod_rewrite, but even if you can't use that, you should have access to ASP or .NET and can easily use those to do your 301s.
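If the URL Rewrite module is installed on your IIS7 box, a site-wide 301 sketch in web.config might look something like this (the domain names are placeholders, and this does assume the module is available on your shared host):

```xml
<!-- web.config sketch for IIS7+ with the URL Rewrite module.
     301-redirects every URL on the old domain to the same path on the main domain. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Redirect to main site" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^(www\.)?old-site-example\.co\.uk$" />
          </conditions>
          <action type="Redirect" url="http://www.main-site-example.co.uk/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```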
-
IIS has no problem doing 301s, and if you can use PHP, ASP or anything similar, you can just manually put a 301 on each page if that fails.
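For example, a per-page 301 in classic ASP might look something like this (the target URL is a placeholder) - it needs to go at the very top of the old page, before any output:

```asp
<%
' Per-page 301 in classic ASP - place at the very top of the old page.
' The target URL below is just an example.
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.main-site-example.co.uk/blue-widgets.html"
Response.End
%>
```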
No rel=canonical solution will result in all 3 sites ranking, as far as I am aware.
Your best option is usually one site with geo-located pages. If it has to be 3 sites, then the only real option is to make all that content unique, on unique IPs etc., which at the end of the day is 3x the work or more.
-
No problem, best of luck and let us know how you get on!
-
Thanks for all the replies everyone. Tricky isn't it?
Moving to 1 site is probably the best medium/long-term option. The 3 sites thing is historical, in that sites 2 and 3 were purchased (physically) by the owner over the last few years.
The biggest problem with going totally new is that (AFAIK anyway, according to the hosting company) I can't 301 the old sites to the new site due to the shared hosting issue (we're on IIS as well, not Apache), so perhaps getting them split out is the proper interim measure. (I might be able to do something via WMTools with this, though, I guess.)
Will do some more research into cross-domain canonical use and attempt the on-page rewrite, as well as talking to the client about moving the sites to unique hosts.
thanks again.
-
Why is it hard to restate the content in a different way? Reword it. If it's products, then change the order and write unique content at the bottom. By east/west/north/south, exactly what types of regions are you talking about, and why do you need three sites to accomplish this instead of one with geo-targeted LPs?
-
You can certainly use the canonical; however, you probably won't rank from domains 2 and 3, as you're telling Google not to attribute the content to those domains.
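To illustrate, a cross-domain canonical on the Blue Widgets page of domains 2 and 3 would be a single line in the <head>, something like this (the URLs are placeholders):

```html
<!-- In the <head> of the Blue Widgets page on domains 2 and 3,
     pointing at the preferred version on domain 1 -->
<link rel="canonical" href="http://www.domain1-example.com/blue-widgets.html" />
```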
I'm still missing the bit where having three regionalised sites is beneficial to your visitors. Why not make one general site with the products and then do some geo-targeted pages? (That's what I would do - makes for a much simpler task.)
Best of luck whichever way you go, but come back and let us know what happens.
-
The benefit to the user is that they will need to visit the physical site to view/purchase, and as such wouldn't click on, say, the North site (even if it was in the top 2 or 3) if they were in the South.
Are you (both) saying it'd be OK to use a rel=canonical link pointing to domain1/page.html on domains 2 and 3? (i.e. across different domain names)
Thanks.
-
how is this for good measure?
"Sites share the same template/look and feel too AND are accessed via same IP - just for good measure :)"
Make them as unique and separate as possible. Different templates, different hosting, different email contact, different contact info on domain registration, write content on the page and geo target the wording.
-
What is the benefit to the user of individual sites for North, South and West?
Are you not just creating a lot of work for yourself? Especially since, as you state, you "would like to rank well for Blue Widgets generally", which ultimately means each site is competing against the others.
I would rethink my strategy. You're more likely to rank 'generally' for your chosen terms if you focus your efforts on one site and perhaps use canonical tags on the other two to ensure Google knows who to attribute the content to.
-
There aren't too many options here. Geotargeting (even locally) tends to produce duplicate content. The only real option is to canonical all your products to one place. If you do it right, you might be able to rank all three sites for your keyword.
You can try #1 but, as you said, it's hard to restate the same content in a non-duplicated way.