Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Duplicate Content - Blog Rewriting
-
I have a client who has requested a rewrite of 250 blog articles for his IT company. The blogs are dispersed on a variety of platforms: his own website's blog, a business innovation website, and an IT website.
He wants to have each article optimised with keyword phrases and then posted onto his new website thrice weekly. All of this is in an effort to attract some potential customers to his new site and also to establish his company as a leader in its field.
To what extent would I need to rewrite each article so as to avoid duplicating the content?
Would there even be an issue if I did not rewrite the articles and merely optimised them with keywords?
Would the articles need to be completely taken by all current publishers?
Any advice would be greatly appreciated.
-
Hi guys, I have a client in a similar situation and I'm working through the best option for them... would appreciate any comments or feedback.
Current Status - client has two websites each targeting different countries: .co.nz and .com.au
With the exception of a few products that are offered separately between NZ and AU, the sites are the same: in essence, duplicate content. This is due to current platform limitations (the way their web company has built it, the same site shows in each region on separate domains, with the option to vary products between regions using an integrated inventory tool).
The great news is they are currently rebuilding their websites on a new platform with two unique versions of the site, which will be great for ongoing SEO, i.e. we can really drill into creating separate sets of page, product, and template content, metadata, etc.
They also have a magazine running on a WordPress blog, using subdomains associated with each regional root domain, e.g.
magazine.domain.co.nz and magazine.domainname.com.au. Again, with a few exceptions, this is also duplicated for both countries, i.e. two subdomains assigned to the same site. Again, duplicate content.
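(Side note: I understand same-language regional duplicates like this are often annotated with hreflang tags so Google can serve the right regional version rather than treating one as a duplicate. A minimal sketch, assuming the domains above; the page path is a placeholder:

```html
<!-- In the <head> of each regional version of a page, on BOTH sites -->
<link rel="alternate" hreflang="en-nz" href="https://www.domain.co.nz/some-page/" />
<link rel="alternate" hreflang="en-au" href="https://www.domainname.com.au/some-page/" />
<link rel="alternate" hreflang="x-default" href="https://www.domain.co.nz/some-page/" />
```

Not a substitute for unique content on the rebuilt sites, but possibly relevant in the interim.)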
Question: The magazine, being built on WordPress, has to date been geared at offering an "FAQ" type engagement with visitors: visitors can submit questions via a module, which are then answered in WordPress blog posts. There are also links from the main site menu away to the magazine, so not ideal for conversion. The client wants to bring this FAQ-type feature back to the two main sites and can now do so during the new site rebuilds.
There is also some SEO juice in the magazine, as in essence it is a large WordPress blog. I am trying to work out the best option for transferring all of the FAQ answers/articles (content) from the magazine FAQs to the two new main sites, so that over time the two new main sites obtain that SEO strength.
Option 1
Leave the magazine as it is, so the main sites continue to get the benefit of referral traffic and the sales that result from those referrals. This also retains the links from the magazine to the main sites (although the links are from a subdomain of the same domain).
Rewrite a brand new version of each magazine article for new NZ site
Rewrite a brand new version of each magazine article for new AU site
(Bearing in mind the stringent Panda rules, etc.: mixing up titles so they are unique, writing unique content, and so on, to avoid Panda penalties.)
Option 2
Take down the magazine site and implement 301 redirects, plus one new version of the articles.
Move all magazine articles across to the highest-performing region (NZ, by far) and 301 redirect from the NZ magazine to the corresponding articles on the new NZ site. The 301 redirects take care of the indexed pages, retaining traffic and rankings for the NZ magazine articles.
Rewrite a brand new version of each magazine article, add it to the new AU site, and 301 redirect from the AU magazine articles to the new versions on the AU site. The 301 redirects take care of any indexed AU magazine articles, but there may be some fluctuation in rankings as the content is now completely different (brand new).
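To sketch what Option 2's redirects might look like (assuming Apache, since it's a WordPress site; the hostname comes from the thread, but the /faq/ path and the assumption that article slugs carry over are hypothetical, and the real mapping may need to be article-by-article):

```apache
# Hypothetical .htaccess on the NZ magazine subdomain.
# Permanently redirects every magazine article to its counterpart
# on the new main site, preserving the article slug.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^magazine\.domain\.co\.nz$ [NC]
RewriteRule ^(.*)$ https://www.domain.co.nz/faq/$1 [R=301,L]
```

The same pattern would apply on the AU subdomain, pointing at the rewritten articles on the new AU site.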
Could there be any issue with the loss of the internal backlinks, i.e. an impact on the SEO strength that the magazine subdomain might pass to the main site?
Other Options?
Appreciate any thoughts or comments... thanks in advance...
-
I would steer clear of removing 250 blog posts from the other web properties. They may be driving traffic to those websites.
The client is requesting 250 particular blog posts to be rewritten. This isn't the best content strategy in the world, but that's what you're being asked to do, so the BEST way to handle it is to completely rewrite every post so they are 100% unique.
If you were to remove the blog posts from the other websites and simply post them on the new website, you're running the risk of taking traffic away from the already established websites.
"Would google pick up on the fact that these blogs are already appearing elsewhere on the web and thereby penalise the new site for posting material that is already indexed by Google?" -- Yes, you run the risk of being penalized by Panda with such a large amount of duplicate content. Google wants to rank websites that provide value to visitors. If a website is entirely made up of content that already exists on another website, you're providing no added value to visitors. Again, you could remove the content from the other websites and 301 redirect to the new one.... but you're taking a lot of value away from those websites if you do that.
-
Hi Phillip,
Sorry - I meant to write: Would all of the blogs need to be removed from the website on which they are appearing?
So is the best course of action to have the articles taken off the platforms on which they appear before going ahead and putting them up on the new site?
Also could you explain how the new site might get hit by panda i.e. would google pick up on the fact that these blogs are already appearing elsewhere on the web and thereby penalise the new site for posting material that is already indexed by Google?
Thanks a million Phillip.
-
If you don't make them VERY unique from the originals, the new site won't perform very well. If the new site consists of nothing but 250 blog posts that were already discovered on other websites, you won't get good results. Simply keyword optimizing the posts won't be enough. They should be entirely re-written to avoid potential problems with Panda.
I'm not sure what you mean by this -- Would the articles need to be completely taken by all current publishers?