Is Noindex Enough To Solve My Duplicate Content Issue?
-
Hello SEO Gurus!
I have a client who runs 7 web properties. 6 of them are satellite websites, and the 7th is his company's main website. For a long while, my company has, among other things, blogged on a hosted blog at www.hismainwebsite.com/blog, and when we were optimizing for one of the satellite websites, we would simply link to it in the article.
Now, however, the client has gone ahead and set up separate blogs on every one of the satellite websites as well, and he has a nifty plug-in set up on the main website's blog that pipes in articles that we write to their corresponding satellite blog as well.
My concern is duplicate content.
In a sense, this is like autoblogging -- the only thing that keeps it from being heinous is that the client is autoblogging himself. He thinks that it will be a great feature for giving users of his satellite websites some great fresh content to read -- and I agree, as I think the combination of publishing and e-commerce is a thing of the future -- but I really want to avoid the duplicate content issue and a possible SEO/SERP hit.
I am thinking that noindexing each of the satellite websites' blog pages might suffice. But I'd like to hear from all of you whether you think even this may not be a foolproof solution.
Thanks in advance!
Kind Regards,
Mike
-
Definitely deal with the security issues! Good find there...
Regarding the client who wants to republish the same article on multiple sites, I think that noindexing it on all but the original site is perfectly fine.
Or, alternatively, place a canonical tag on the duplicate sites to let Google know where the true source lies.
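For reference, a minimal sketch of both options as they would appear in the `<head>` of a satellite blog post (the URLs here are placeholders based on the example domains mentioned later in this thread -- swap in the real post URLs):

```html
<!-- Option 1: keep the satellite copy out of the index entirely -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: let the copy be crawled, but tell Google where the original lives -->
<link rel="canonical" href="http://beautifulpacific.com/blog/sample-post/">
```

Note these are alternatives, not complements: a noindexed page doesn't need a canonical tag, since it's excluded from the index regardless.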
-
Good thread, and I agree with everything Brian has already said. One additional option that hasn't been mentioned is Repost.us. If your client's blogs are on WordPress, there is a nifty Repost.us plugin that is very easy to install. He could then use it to repost the content on the satellite blogs without creating duplicate content issues or problems for his SEO. It would get the content where he wants it, preserve authorship, and give a link back to his main site. He would also have the opportunity to monetize his posts if that was something he wanted to do. Hope this is helpful!
Dana
-
Wow, that's new! Yes, I wouldn't be surprised if the plug-in is at fault.
Well, as usual, issues compound into new issues.
My many thanks for your help and insight, Brian.
Kind Regards,
Mike
-
Wasn't able to visit the site, got this warning, attached.
Kinda poignant that this warning from the Fiji site gave me a warning referencing the Pacific site, which is exactly the kind of thing we're talking about.
Wonder if the very plugin your client is using is causing this issue too.
-
Sure, here's an example: this is the main website: beautifulpacific.com, with the blog being located at beautifulpacific.com/blog. One of the satellite sites is beautifulfiji.com, with its blog at beautifulfiji.com/blog.
-
*To me, the best-case scenario would be to use these blogs to pump out fresh, authoritative content for each satellite site blog -- a more intensive undertaking, to be sure, but a best practice -- and include an RSS feed.*
Agreed. Also, there's no reason he can't write a post for one audience that references a post he made on another domain. It's hard to get a good feel for the whole situation without viewing the sites and blogs themselves.
-
Many thanks for your reply, Brian.
The satellite websites are not where conversations/sales take place; they feed his main site. I agree that providing a feed via the blog's RSS would make more sense. And when you say, "but if the point of the content is to be consumed, enjoyed, attract social shares and links, build traffic and then convert, then there's really little if any gain to be had in [noindexing]," I wholeheartedly agree. Even if it were to solve the duplicate content issue, it would preclude us from being able to put fresh content up on that blog and leverage it accordingly.
I can tell you that there is nothing nefarious in the client's idea here: his intentions are purely to give users fresh content to explore on the satellite sites. But as he relies on me to guide him in terms of SEO implications, I don't think he thought through how duplicate content could hurt him.
To me, the best-case scenario would be to use these blogs to pump out fresh, authoritative content for each satellite site blog -- a more intensive undertaking, to be sure, but a best practice -- and include an RSS feed.
-
Have you suggested he use an iframe to embed the content from one site in the satellites?
Or maybe simply a feed to show the fresh content to visitors?
Does he convert on those satellite sites, or are they micros to drive traffic to the main site?
The thing is, it is definitely going to be duplicate content, and since the host is presumably the same... well... Not good.
I would ask: "why?" Is he expecting to get links to this content on one site one day, and to the same content on another site the next? If it's a good post, what happens when someone shares it socially from one domain, and those exposed to it then see it elsewhere?
I think noindexing is a good half measure, but if the point of the content is to be consumed, enjoyed, attract social shares and links, build traffic and then convert, then there's really little if any gain to be had in even doing that. A noindexed blog post getting links? A noindexed blog category getting social buzz?
Get your client to understand the end goal. If he just wants something for visitors to read, add a feed. Then the social shares and links will do some good for at least the most important domain.
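To illustrate the feed approach: a satellite site could pull headline teasers from the main blog's RSS feed and link each one back to the original post, so readers still get fresh content while the canonical URL collects all the links and shares. A minimal sketch in Python, standard library only (the feed URL is a hypothetical WordPress-style example, not one of the client's actual feeds):

```python
import urllib.request
import xml.etree.ElementTree as ET

def parse_teasers(feed_xml):
    """Extract (title, link) pairs from raw RSS 2.0 XML."""
    root = ET.fromstring(feed_xml)
    return [
        (item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]

def fetch_teasers(feed_url):
    """Download an RSS feed and return (title, link) pairs for its items."""
    with urllib.request.urlopen(feed_url) as resp:
        return parse_teasers(resp.read())

# Each teaser links back to the main blog, so links and social shares
# accrue to the original post rather than to a satellite copy.
# e.g. fetch_teasers("http://beautifulpacific.com/blog/feed/")
```

Whether it's done with a snippet like this, a WordPress RSS widget, or the Repost.us plugin mentioned above, the principle is the same: show the content on the satellites, but keep one canonical home for it.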