Is Noindex Enough To Solve My Duplicate Content Issue?
-
Hello SEO Gurus!
I have a client who runs seven web properties: six are satellite websites, and the seventh is his company's main website. For a long while, my company has, among other things, blogged on a hosted blog at www.hismainwebsite.com/blog, and when we were optimizing for one of the satellite websites, we would simply link to it in the article.
Now, however, the client has gone ahead and set up separate blogs on every one of the satellite websites as well, and he has a nifty plug-in set up on the main website's blog that pipes in articles that we write to their corresponding satellite blog as well.
My concern is duplicate content.
In a sense, this is like autoblogging -- the only thing keeping it from being heinous is that the client is autoblogging himself. He thinks it will be a great feature, giving users of his satellite websites some great fresh content to read -- and I agree, as I think the combination of publishing and e-commerce is a thing of the future -- but I really want to avoid the duplicate content issue and a possible SEO/SERP hit.
I am thinking that noindexing each of the satellite websites' blog pages might suffice. But I'd like to hear from all of you if you think that even this may not be a foolproof solution.
Thanks in advance!
Kind Regards,
Mike
-
Definitely deal with the security issues! Good find there...
Regarding the client who wants to republish the same article on multiple sites, I think that noindexing it on all but the original site is perfectly fine.
Or, alternatively, place a canonical tag on the duplicate sites to let Google know where the true source lies.
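Either signal is a one-line addition to the `<head>` of each duplicated post on the satellite blogs. A sketch using the domains mentioned elsewhere in the thread (the post slug is made up for illustration):

```html
<!-- In the <head> of the duplicate post at beautifulfiji.com/blog -->

<!-- Option 1: keep the duplicate out of the index entirely -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: cross-domain canonical pointing at the original
     on the main site, so it consolidates ranking signals there -->
<link rel="canonical" href="https://www.beautifulpacific.com/blog/sample-post/">
```

Note that the two are alternatives, not complements: a canonical asks Google to consolidate signals to the original, while noindex simply removes the duplicate from the index.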
-
Good thread, and I agree with everything Brian has already said. One additional option that hasn't been mentioned is possibly using Repost.us. If your client's blogs are on WordPress, there is a nifty Repost.us plugin that is very easy to install. He could then use this to repost the content on his blogs without having duplicate content issues or problems for his SEO. It would get the content where he wants it, preserve authorship, plus give a link back to his main site. He would also have the opportunity of monetizing his posts if that was something he wanted to do. Hope this is helpful!
Dana
-
Wow, that's new! Yes, I wouldn't be surprised if the plug-in is at fault.
Well, as usual, issues compound into new issues.
My many thanks for your help and insight, Brian.
Kind Regards,
Mike
-
Wasn't able to visit the site, got this warning, attached.
Kinda poignant that the Fiji site gave me a warning referencing the Pacific site, which is exactly the kind of thing we're talking about.
Wonder if the very plugin your client is using is causing this issue too.
-
Sure, here's an example: the main website is beautifulpacific.com, with the blog located at beautifulpacific.com/blog. One of the satellite sites is beautifulfiji.com, with its blog at beautifulfiji.com/blog.
-
_To me, the best-case scenario would be to use these blogs to pump out fresh, authoritative content for each satellite site blog -- a more intensive undertaking, to be sure, but a best practice -- and include an RSS feed._
Agreed. Also, there's no reason he can't write a post for one audience that references a post he made on another domain. It's hard to get a good feel for the whole situation without viewing the sites and blogs themselves.
-
Many thanks for your reply, Brian.
The satellite websites are not where conversations/sales take place; they feed his main site. I agree that providing a feed via the blog's RSS would make more sense. And when you say, "but if the point of the content is to be consumed, enjoyed, attract social shares and links, build traffic and then convert, then there's really little if any gain to be had in [noindexing]," I wholeheartedly agree. Even if it were to solve the duplicate content issue, it would preclude us from being able to put fresh content up on that blog and leverage it accordingly.
I can tell you that there is nothing nefarious in the client's idea here: his intentions are purely to give users fresh content to explore on the satellite sites. But as he relies on me to guide him in terms of SEO implications, I don't think he thought through how duplicate content could hurt him.
To me, the best-case scenario would be to use these blogs to pump out fresh, authoritative content for each satellite site blog -- a more intensive undertaking, to be sure, but a best practice -- and include an RSS feed.
-
Have you suggested he use an iframe to host the content from one site into the satellites?
Or maybe simply a feed to show the fresh content to visitors?
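If the iframe route were taken, it could look something like this on a satellite blog (domains are from the thread; the attributes are illustrative). Framed content is generally attributed to its source page rather than the embedding page, so it sidesteps duplication -- but by the same token it passes little or no content value to the satellite:

```html
<!-- On beautifulfiji.com/blog: frame the main blog rather than copying it.
     Search engines generally credit the framed content to
     beautifulpacific.com, not to this page. -->
<iframe src="https://www.beautifulpacific.com/blog/"
        width="100%" height="800"
        title="Beautiful Pacific blog"></iframe>
```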
Does he convert on those satellite sites, or are they micro-sites meant to drive traffic to the main one?

The thing is, it is definitely going to be duplicate content, and since the host is presumably the same... well... not good.
I would ask: "why?" Is he expecting to get links to this content on one site one day, and to the same content on another site the next? If it's a good post, what would happen if someone shares it socially from one domain, and those exposed to it then see it elsewhere?
I think noindexing is a good half measure, but if the point of the content is to be consumed, enjoyed, attract social shares and links, build traffic and then convert, then there's really little if any gain to be had in even doing that. A noindexed blog post getting links? A noindexed blog category getting social buzz?
Force your client to understand the end goal. If he just wants something for them to read, add a feed. Then the social shares and links will do some good to at least the most important domain.
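The "add a feed" suggestion can be as simple as pulling the main blog's RSS feed and rendering the latest headlines on each satellite as links back to the original posts, so shares and links accrue to one domain. A minimal sketch in Python using only the standard library; the feed would normally be fetched from the main blog (e.g. a `/blog/feed/` URL, which is an assumption here), but a hard-coded sample with made-up items is used so the example is self-contained:

```python
import xml.etree.ElementTree as ET

# In practice this XML would be fetched over HTTP from the main blog's
# feed URL (hypothetical, e.g. https://www.beautifulpacific.com/blog/feed/).
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Beautiful Pacific Blog</title>
    <item>
      <title>Top 10 Fiji Beaches</title>
      <link>https://www.beautifulpacific.com/blog/top-10-fiji-beaches/</link>
    </item>
    <item>
      <title>Planning a Pacific Cruise</title>
      <link>https://www.beautifulpacific.com/blog/planning-a-pacific-cruise/</link>
    </item>
  </channel>
</rss>"""

def latest_headlines(rss_xml, limit=5):
    """Return (title, link) pairs for the newest items in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    items = root.findall("./channel/item")[:limit]
    return [(item.findtext("title"), item.findtext("link")) for item in items]

for title, link in latest_headlines(SAMPLE_RSS):
    # Each headline links back to the canonical post on the main site.
    print(f'<a href="{link}">{title}</a>')
```

Because the satellite page only carries headlines and links, not the full article body, there is no duplicate content to worry about.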