Duplicate Content - Blog Rewriting
-
I have a client who has requested a rewrite of 250 blog articles for his IT company. The blogs are dispersed on a variety of platforms: his own website's blog, a business innovation website, and an IT website.
He wants to have each article optimised with keyword phrases and then posted onto his new website thrice weekly. All of this is in an effort to attract some potential customers to his new site and also to establish his company as a leader in its field.
To what extent would I need to rewrite each article so as to avoid duplicating the content?
Would there even be an issue if I did not rewrite the articles and merely optimised them with keywords?
Would the articles need to be completely taken down by all current publishers?
Any advice would be greatly appreciated.
-
Hi guys, I have a client in a similar situation and am working through the best option for them... would appreciate any comments or feedback.
Current Status - client has two websites each targeting different countries: .co.nz and .com.au
With the exception of a few products that are offered separately between NZ and AU, the sites are the same: in essence, duplicate content. This is due to current platform limitations (the way their web company has built it, the same site shows in each region on separate domains, with the option to vary products between regions using an integrated inventory tool).
The great news is they are currently rebuilding their websites onto a new platform with two unique versions of the site…which will be great for ongoing SEO - ie we can really drill into creating separate sets of page, product, template content and meta data etc.
They also have a magazine running on a WordPress blog using sub-domains associated with the regional root domains. E.g.
magazine.domain.co.nz and magazine.domainname.com.au
Again, with a few exceptions, this is also duplicated for both countries, i.e. sub-domains assigned to the same site. Again, duplicate content.
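(Side note: while the duplicate NZ/AU pages have to stay live on the current platform, my understanding is that hreflang annotations are the standard way to tell Google the two versions target different countries rather than being competing duplicates. Below is a rough Python sketch of generating the tags; the domains and paths are placeholders, and it assumes the NZ and AU pages pair up one-to-one.)

```python
# Rough sketch (mine, not from the web company): generate the hreflang tags
# that each pair of duplicate NZ/AU pages could carry while both stay live.
# The domains and paths below are placeholders, not the client's real URLs.

# Hypothetical pairs of equivalent pages on the two regional domains.
page_pairs = [
    ("https://www.domainname.co.nz/widgets/", "https://www.domainname.com.au/widgets/"),
    ("https://www.domainname.co.nz/about-us/", "https://www.domainname.com.au/about-us/"),
]

def hreflang_tags(nz_url: str, au_url: str) -> str:
    """Return the <link rel="alternate" hreflang="..."> tags that BOTH
    regional versions of the page should include in their <head>."""
    return "\n".join([
        f'<link rel="alternate" hreflang="en-nz" href="{nz_url}" />',
        f'<link rel="alternate" hreflang="en-au" href="{au_url}" />',
        # x-default nominates one version for searchers outside NZ/AU.
        f'<link rel="alternate" hreflang="x-default" href="{nz_url}" />',
    ])

for nz, au in page_pairs:
    print(hreflang_tags(nz, au), end="\n\n")
```

Each set of tags would need to appear on both the NZ and the AU version of the page so the annotation is reciprocal.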
Question: The magazine, built on WordPress, has to date been geared at offering an "FAQ"-type engagement with visitors: visitors can submit questions via a module, which are then answered in WordPress blog posts. There are also links from the main site menu out to the magazine, so not ideal for conversion. The client wants to bring this FAQ-type feature back to the two main sites and can now do so during the new site rebuilds.
There is also some SEO juice in the magazine, as in essence it is a large WordPress blog. I am trying to work through the best option for transferring all of the FAQ answers/articles (content) from the magazine to the two new main sites, so that over time the two new main sites obtain that SEO strength.
Option 1
Leave the magazine as it is so that the main sites continue to get the benefit of referral traffic, and the sales that result from those referrals. This also retains the links from the magazine to the main site (although the links are from a sub-domain of the same domain).
Rewrite a brand new version of each magazine article for the new NZ site.
Rewrite a brand new version of each magazine article for the new AU site.
(Bearing in mind stringent Panda rules etc.: mixing up the titles so they are unique, writing unique content, and so on, to avoid Panda penalties.)
Option 2
Take down the magazine site and implement 301 redirects, plus one new version of the articles.
Move all magazine articles across to the highest-performing region (NZ by far) and 301 redirect each NZ magazine article to the corresponding article on the new NZ site. The 301 redirects take care of the indexed pages, retaining traffic and rankings for the NZ magazine articles.
Rewrite a brand new version of each magazine article, add it to the new AU site, and 301 redirect the AU magazine articles to the new versions on the AU site. The 301 redirects take care of any indexed AU magazine articles, but there may be some fluctuation in rankings as the content is now completely different (brand new).
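For what it's worth, once the redirect mapping is drawn up I'd sanity-check it with something like the rough Python sketch below. The article URLs are placeholders rather than the client's real pages, and the assumption is that each old magazine URL should answer with a single 301 straight to its new home (no chains).

```python
# Rough sketch for checking the Option 2 migration: each old magazine URL
# should answer with a single 301 pointing at its new article on the main
# site. The URL pairs below are placeholders, not the client's real articles.
import requests

redirect_map = {
    "https://magazine.domainname.co.nz/how-to-choose-a-widget/":
        "https://www.domainname.co.nz/faq/how-to-choose-a-widget/",
    "https://magazine.domainname.com.au/how-to-choose-a-widget/":
        "https://www.domainname.com.au/faq/how-to-choose-a-widget/",
}

for old_url, expected in redirect_map.items():
    # allow_redirects=False so we see the first hop, not the final destination.
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    verdict = "OK" if resp.status_code == 301 and location == expected else "CHECK"
    print(f"{verdict}  {old_url} -> {resp.status_code} {location or '(no redirect)'}")
```

Anything flagged CHECK would be a redirect chain, a 302, or a mapping mistake worth fixing before the old subdomain comes down.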
Could there be any issue with the loss of the internal backlinks, i.e. would it impact the SEO strength that the magazine subdomain currently passes to the main site?
Other Options?
Appreciate any thoughts or comments... thanks in advance...
-
I would steer clear of removing 250 blog posts from the other web properties. They may be driving traffic to those websites.
The client is requesting 250 particular blog posts to be rewritten. This isn't the best content strategy in the world, but that's what you're being asked to do, so the BEST way to handle it is to completely rewrite every post so they are 100% unique.
If you were to remove the blog posts from the other websites and simply post them on the new website, you're running the risk of taking traffic away from the already established websites.
"Would google pick up on the fact that these blogs are already appearing elsewhere on the web and thereby penalise the new site for posting material that is already indexed by Google?" -- Yes, you run the risk of being penalized by Panda with such a large amount of duplicate content. Google wants to rank websites that provide value to visitors. If a website is entirely made up of content that already exists on another website, you're providing no added value to visitors. Again, you could remove the content from the other websites and 301 redirect to the new one.... but you're taking a lot of value away from those websites if you do that.
-
Hi Phillip,
Sorry - I meant to write: Would all of the blogs need to be removed from the website on which they are appearing?
So is the best course of action to have the articles taken off the platforms on which they appear before going ahead and putting them up on the new site?
Also could you explain how the new site might get hit by panda i.e. would google pick up on the fact that these blogs are already appearing elsewhere on the web and thereby penalise the new site for posting material that is already indexed by Google?
Thanks a million Phillip.
-
If you don't make them VERY different from the originals, the new site won't perform very well. If the new site consists of nothing but 250 blog posts that were already discovered on other websites, you won't get good results. Simply keyword-optimizing the posts won't be enough. They should be entirely rewritten to avoid potential problems with Panda.
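If it helps to gauge what "entirely rewritten" means in practice, here's a very rough Python sketch that compares an original post with its rewrite. The file names are placeholders, and this is only a local sanity check rather than how Google measures duplication; a ratio close to 1.0 means the new version is really just a light paraphrase of the old one.

```python
# Very rough sketch: compare an original post with its rewrite. File names are
# placeholders. This is not how Google measures duplication; it is just a
# local sanity check on how much of the old wording survives.
from difflib import SequenceMatcher

def similarity(original: str, rewrite: str) -> float:
    """0.0 = nothing in common, 1.0 = identical word sequences."""
    return SequenceMatcher(None, original.split(), rewrite.split()).ratio()

with open("original_post.txt", encoding="utf-8") as f:
    original_text = f.read()
with open("rewritten_post.txt", encoding="utf-8") as f:
    rewritten_text = f.read()

print(f"Similarity: {similarity(original_text, rewritten_text):.0%}")
```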
I'm not sure what you mean by this -- Would the articles need to be completely taken down by all current publishers?
Related Questions
-
Canonical: Same content but different countries
I'm building a website that has content made for specific countries. The URL format is: MyWebsite.com/[country name]/[specific URL]. Some of the pages are the same for different countries; the [specific URL] would be the same as well, and the only difference would be the [country name]. How do I deal with canonical issues to avoid Google thinking I'm presenting the same content?
On-Page Optimization | newbyguy
-
How to fix duplicate content for homepage and index.html
Hello, I know this probably gets asked quite a lot but I haven't found a recent post about this in 2018 on Moz Q&A, so I thought I would check in and see what the best route/solution for this issue might be. I'm always really worried about making any (potentially bad/wrong) changes to the site, as it's my livelihood, so I'm hoping someone can point me in the right direction. Moz, SEMrush and several other SEO tools are all reporting that I have duplicate content for my homepage and index.html (same identical page). According to Moz, my homepage (without index.html) has PA 29 and index.html has PA 15. They are both showing status 200. I read that you can either do a 301 redirect or add rel=canonical. I currently have a 301 set up from http to https and don't have any rel=canonical added to the site/page. What is the best and safest way to get rid of the duplicate content and merge my non-index and index.html homepages together these days? I read that both a 301 and a canonical pass on link juice, but I don't know what the best route for me is given what I said above. Thank you for reading, any input is greatly appreciated!
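For context, whichever fix I end up applying, I was planning to verify it afterwards with a rough Python sketch like the one below (example.com is a placeholder, not my actual domain). It fetches both versions of the homepage and reports the status code, any redirect target, and any rel=canonical tag it finds in the HTML.

```python
# Rough sketch (example.com is a placeholder): fetch both the root URL and
# /index.html, and report the status code, any redirect target, and any
# rel=canonical tag found in the HTML, to confirm whichever fix was applied.
import re
import requests

urls = ["https://www.example.com/", "https://www.example.com/index.html"]

for url in urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    # Naive canonical extraction; assumes rel comes before href in the tag.
    match = re.search(
        r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)',
        resp.text, re.IGNORECASE)
    print(url)
    print("  status:      ", resp.status_code)
    print("  redirects to:", resp.headers.get("Location", "-"))
    print("  canonical:   ", match.group(1) if match else "-")
```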
On-Page Optimization | dreservices
-
Duplicate page titles and content in WooCommerce
Hi Guys, I'm new to Moz and really liking it so far!
I run an eCommerce site on WordPress + WooCommerce and of course use Yoast for SEO optimisation. I've got a question about my first crawl report, which showed over 600 issues! 😐 I've read that this is something that happens more often (http://moz.com/blog/setup-wordpress-for-seo-success). Most of them are categorised under:
1. Duplicate Page Titles or;
2. Duplicate Page Content.
Duplicate Page Titles: these are almost only product category pages and product tags. Is this problem solved by giving them the right SEO SERP? I see that a lot of categories don't have a proper SEO SERP set up in Yoast! Do I need to add this to clear the issue, or do I need to change the actual title? And how about the product tags? Another point (a bit more off-topic): I've read here (http://moz.com/community/q/yoast-seo-plugin-to-index-or-not-to-index-categories) that it's advised to noindex/follow categories and tags, but isn't that a weird idea for an eCommerce site?!
Duplicate Page Content: same goes here, it's almost only product categories and product tags that are flagged as duplicate page content. When I check the results I can click a blue button, for example "+ 17 duplicates", and that shows me (in this case) 17 URLs, but they are not related to the first in any way, so not sure where to start here? Thanks for taking the time to help out!
Joost
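P.S. To get a feel for the scale of the problem, I was thinking of exporting the crawl to a CSV and grouping pages by title with a rough Python sketch like the one below; the file and column names ("crawl_export.csv", "url", "title") are just my guesses at what such an export might contain.

```python
# Rough sketch, assuming the crawl can be exported to a CSV with "url" and
# "title" columns (the file and column names are guesses): group pages by
# title so the category/tag pages sharing one stand out.
import csv
from collections import defaultdict

pages_by_title = defaultdict(list)

with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        pages_by_title[row["title"].strip().lower()].append(row["url"])

for title, urls in sorted(pages_by_title.items(), key=lambda kv: -len(kv[1])):
    if len(urls) > 1:
        print(f"{len(urls)} pages share the title: {title}")
        for url in urls:
            print("  ", url)
```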
On-Page Optimization | jeeyer
-
Duplicate content penalty
When Moz crawls my site they say I have 2x the pages that I really have, and they say I am being penalized for duplicate content. I know years ago I had my old domain resolve over to my new domain. It's the only thing that makes sense as to the duplicate content, but would search engines really penalize me for that? It is technically only on 1 site. My business took a significant sales hit starting early July 2013; I know Google did an algorithm update then that had SEO aspects. I need to resolve the problem so I can stay in business.
On-Page Optimization | cheaptubes
-
Duplicate Content for Men's and Women's Version of Site
So, we're a service where you can book different hairdressing services from a number of different salons (site being worked on). We're doing both a male and a female version of the site on the same domain, which users can select between on the homepage. The differences are largely cosmetic (allowing the designers to be more creative and have a bit of fun, and also to have dedicated male grooming landing pages), but I was wondering about duplicate pages. While most of the pages on each version of the site will be unique (i.e. [male service] in [location] vs [female service] in [location], with the female taking precedence when there are duplicates), what should we do about the likes of the "About" page? Pages like this would both be unique in wording but essentially offer the same information, and does it make sense to index two different "About" pages, even if the titles vary? My question is whether, for these duplicate pages, you would set the more popular one as the preferred version canonically, leave them both to be indexed, or noindex the lesser version entirely? Hope this makes sense, thanks!
On-Page Optimization | LeahHutcheon
-
Does schema.org assist with duplicate content concerns
The issue of duplicate content has been well documented and there are lots of articles suggesting to noindex archive pages in WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply, is: is no-indexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, which can be marked as an article or blog posting, with the URL of the main article and all the other cool stuff the schema gives us. Surely Google et al are smart enough to recognise these article listings as gateways to the main content, therefore removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances - I'm just interested in whether or not the search engines can handle this appropriately.
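For illustration, this is the kind of listing-item mark-up I have in mind, built here as a short Python sketch (the URLs are placeholders): each entry on the archive page carries a BlogPosting item whose url and mainEntityOfPage point at the full article rather than the archive itself. Whether that is actually enough to settle the duplicate-content question is exactly what I'm asking.

```python
# Rough sketch of the listing-item mark-up described above (URLs are
# placeholders): a BlogPosting entry on an archive page whose url and
# mainEntityOfPage point at the full article rather than the archive page.
import json

listing_item = {
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    "headline": "Example article headline",
    "url": "https://www.example.com/blog/example-article/",
    "mainEntityOfPage": "https://www.example.com/blog/example-article/",
}

# This would sit in a <script type="application/ld+json"> block alongside the
# archive-page entry that links through to the article.
print(json.dumps(listing_item, indent=2))
```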
On-Page Optimization | MarkCA
-
Duplicate Content for Spanish & English Product
Hi There, Our company provides training courses and I am looking to provide the Spanish version of a course that we already provide in English. As it is an e-commerce site, our landing page for the English version gives the full description of the course and all related details. Once the course is purchased, a flash based course launches within a player window and the student begins the course. For the Spanish version of the course, my target customers are English speaking supervisors purchasing the course for their Spanish speaking workers. So the landing page will still be in English (just like the English version of the course) with the same basic description, with the only content differences on that page being the inclusion of the fact that this course is in Spanish and a few details around that. The majority of the content on these two separate landing pages will be exactly the same, as the description for the overall course is the same, just that it's presented in a different language, so it needs to be 2 separate products. My fear is that Google will read this as duplicate content and I will be penalized for it. Is this a possibility or will Google know why I set it up this way and not penalize me? If that is a possibility, how should I go about doing this correctly? Thanks!
On-Page Optimization | NiallTom
-
Percentage of duplicate content allowable
Can you have ANY duplicate content on a page or will the page get penalized by Google? For example, if you used a paragraph of Wikipedia content for a definition/description of a medical term, but wrapped it in unique content, is that OK or will that land you in the Google/Panda doghouse? If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique to duplicate content? Thanks!
On-Page Optimization | sportstvjobs