Is Noindex Enough To Solve My Duplicate Content Issue?
-
Hello SEO Gurus!
I have a client who runs 7 web properties. 6 of them are satellite websites, and the 7th is his company's main website. For a long while, my company has, among other things, blogged on a hosted blog at www.hismainwebsite.com/blog, and when we were optimizing for one of the satellite websites, we would simply link to it in the article.
Now, however, the client has gone ahead and set up separate blogs on every one of the satellite websites as well, and he has a nifty plug-in on the main website's blog that pipes the articles we write out to their corresponding satellite blogs.
My concern is duplicate content.
In a sense, this is like autoblogging -- the only thing that keeps it from being heinous is that the client is autoblogging himself. He thinks it will be a great feature for giving users of his satellite websites some fresh content to read -- and I agree, as I think the combination of publishing and e-commerce is a thing of the future -- but I really want to avoid the duplicate content issue and a possible SEO/SERP hit.
I am thinking that noindexing each of the satellite websites' blog pages might suffice. But I'd like to hear from all of you whether you think even this may not be a foolproof solution.
Thanks in advance!
Kind Regards,
Mike
-
Definitely deal with the security issues! Good find there...
Regarding the client who wants to republish the same article on multiple sites, I think that noindexing it on all but the original site is perfectly fine.
Or, alternatively, place a canonical tag on the duplicate sites to let Google know where the true source lies.
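For reference, here's roughly what each option would look like in the `<head>` of a duplicated post on a satellite blog (the exact post URL is illustrative):

```html
<!-- Option 1: keep the duplicate out of the index entirely,
     while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: leave the page indexable, but point search engines
     at the original post on the main site (URL illustrative) -->
<link rel="canonical" href="http://www.hismainwebsite.com/blog/original-post/">
```

Use one or the other on a given page, not both -- a canonical pointing to the original already consolidates the signals.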
-
Good thread, and I agree with everything Brian has already said. One additional option that hasn't been mentioned is Repost.us. If your client's blogs are on WordPress, there is a nifty Repost.us plugin that is very easy to install. He could then use it to repost the content on the satellite blogs without creating duplicate content issues or hurting his SEO. It would get the content where he wants it, preserve authorship, and give a link back to his main site. He would also have the opportunity to monetize his posts if that were something he wanted to do. Hope this is helpful!
Dana
-
Wow, that's new! Yes, I wouldn't be surprised if the plug-in is at fault.
Well, as usual, issues compound into new issues.
My many thanks for your help and insight, Brian.
Kind Regards,
Mike
-
Wasn't able to visit the site, got this warning, attached.
Kinda poignant that the warning on the Fiji site references the Pacific site -- which is exactly the kind of thing we're talking about.
Wonder if the very plugin your client is using is causing this issue too.
-
Sure, here's an example: this is the main website: beautifulpacific.com, with the blog being located at beautifulpacific.com/blog. One of the satellite sites is beautifulfiji.com, with its blog at beautifulfiji.com/blog.
-
_To me, the best-case scenario would be to use these blogs to pump out fresh, authoritative content for each satellite site blog -- a more intensive undertaking, to be sure, but a best practice -- and include an RSS feed._
Agreed. Also, there's no reason he can't write a post for one audience that references a post he made on another domain. It's hard to get a good feel for the whole situation without viewing the sites and blogs themselves.
-
Many thanks for your reply, Brian.
The satellite websites are not where conversations/sales take place; they feed his main site. I agree that providing a feed via the blog's RSS would make more sense. And when you say, "but if the point of the content is to be consumed, enjoyed, attract social shares and links, build traffic and then convert, then there's really little if any gain to be had in [noindexing]," I wholeheartedly agree. Even if it were to solve the duplicate content issue, it would preclude us from being able to put fresh content up on that blog and leverage it accordingly.
I can tell you that there is nothing nefarious in the client's idea here: his intentions are purely to give users fresh content to explore on the satellite sites. But as he relies on me to guide him in terms of SEO implications, I don't think he thought through how duplicate content could hurt him.
To me, the best-case scenario would be to use these blogs to pump out fresh, authoritative content for each satellite site blog -- a more intensive undertaking, to be sure, but a best practice -- and include an RSS feed.
-
Have you suggested he use an iframe to embed the content from one site on the satellites?
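As a rough sketch of the iframe approach (the post URL here is hypothetical, based on the example domains in this thread), the satellite page would simply frame the original post rather than copying its markup:

```html
<!-- On a beautifulfiji.com page: frame the post from the main blog.
     The framed content stays attributed to the source domain, so the
     satellite page itself contains no duplicated article text. -->
<iframe src="http://beautifulpacific.com/blog/some-post/"
        width="100%" height="600" style="border:0;"></iframe>
```

The trade-off is that framed content does nothing for the satellite page's own rankings, which may or may not matter given the goal.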
Or maybe simply a feed to show the fresh content to visitors?
Does he convert on those satellite sites, or are they micros to drive traffic to the main site?
The thing is, it is definitely going to be duplicate content, and since the host is presumably the same... well... not good.
I would ask: "why?" Is he expecting to get links to this content on one site one day, and to the same content on another site the next? If it's a good post, what happens when someone shares it socially from one domain, and those exposed to it see it elsewhere?
I think noindexing is a good half measure, but if the point of the content is to be consumed, enjoyed, attract social shares and links, build traffic and then convert, then there's really little if any gain to be had in even doing that. A noindexed blog post getting links? A noindexed blog category getting social buzz?
Force your client to understand the end goal. If he just wants something for them to read, add a feed. Then the social shares and links will do some good to at least the most important domain.
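To make the feed idea concrete: a minimal sketch of pulling headlines from the main blog's RSS feed to display on a satellite site. This assumes a standard RSS 2.0 feed (a WordPress blog would typically expose one at /feed); the sample XML below is a made-up stand-in for the real feed.

```python
import xml.etree.ElementTree as ET

def latest_titles(rss_xml, limit=3):
    """Return the most recent post titles from an RSS 2.0 feed string."""
    root = ET.fromstring(rss_xml)
    items = root.findall("./channel/item")
    return [item.findtext("title") for item in items[:limit]]

# Hypothetical feed snippet standing in for the main blog's real feed
sample = """<rss version="2.0"><channel><title>Main Blog</title>
<item><title>Exploring Fiji</title><link>http://example.com/fiji</link></item>
<item><title>Island Hopping</title><link>http://example.com/hop</link></item>
</channel></rss>"""

print(latest_titles(sample))  # → ['Exploring Fiji', 'Island Hopping']
```

The satellite pages then show fresh headlines linking back to the canonical posts, so shares and links accrue to the main domain instead of a duplicate.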