Duplicate Articles
-
We submit articles to a magazine, which are posted either as text or in a Flash container. Management would like to post them to our site as well. I'm sure this has been asked a million times, but is this a bad thing to do? Do I need to add a rel=canonical tag to the articles? Most of the articles posted to the magazine's site do not contain a link back to our site.
-
The magazine has already given us the OK; like I said, they're much more offline-focused, so it's more about what Google thinks. I think I agree about playing it safe with the canonical tag, though. Thanks!
-
If it's really just for your own reference or limited use, I'd probably set up the cross-domain canonical and keep it off Google's radar. Later, if you wanted to self-publish, you could remove it.
If it's just your site and theirs, it's probably not a high-risk situation. In some ways, it's more about the relationship. If your pages started ranking instead of theirs, I don't know if that goes against your general agreement with them. I'd probably play it safe for now.
-
Our site doesn't have the largest audience yet, but management simply wants somewhere they can go, or send clients, to easily find everything in one place. The magazine is more for offline advertising, but they post it online as well.
-
I'd just add to what Jason said, which I think is generally on-target. If the magazine really is the "source", then posting all those articles again on your site could look "thin" to both users and search engines. In general, you're not ranking for them now, so you probably won't lose out from an SEO standpoint. There is some risk if you copy a lot of articles, though. You don't want to look like you're scraping your own content, in essence.
The cross-domain rel-canonical should remove the risk of any sort of search penalty or problems. So, again, it's a question of whether it provides value to your site.
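For reference, a cross-domain rel=canonical is just the standard canonical `<link>` placed in the `<head>` of your copy of the article, pointing at the magazine's URL. A minimal sketch (both domains below are hypothetical placeholders):

```html
<!-- In the <head> of the syndicated article on your own site.
     The href points at the magazine's copy, telling search engines
     to treat that URL as the original. Both domains are placeholders. -->
<link rel="canonical" href="https://www.magazine-example.com/articles/sample-article/" />
```

If you later decide to self-publish, removing this one tag (or pointing it at your own URL) makes your copy the canonical version again.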
At some point, you have to ask - would it make sense to only post them on your site? In other words, if you're building an audience, does it make sense to build it for someone else? Granted, that's a much larger business and marketing decision (far beyond SEO).
-
It's not a "bad" thing to post the articles in two places, as this type of syndication is somewhat commonplace in the corporate world. Provided your site already has a lot of content and is generally good quality, there's no risk of a penalty for syndicating content.
However, I would encourage management to look at it from the user's perspective: If the user reads the article in the magazine, they're not going to find it very useful to see the same article again on your site. Conversely, if your website visitors aren't going to see the article in the magazine first, why send it to the magazine at all?
One solution is to quote a snippet of the original magazine article on your site, then write a 200+ word summary or intro that covers the key points, introduces the article in a different way, etc., and links to the full piece in the magazine.
From a user's perspective, all the content you've published on your site and in the magazine is unique and potentially useful. From the SEO perspective, there's no possibility of an issue and - unlike syndication - you're adding a unique page of content to your site that is highly likely to be indexed and help you in the long run.
Syndication isn't bad, but you have to ask why you're doing it in the first place. It's often just as easy to create a short "What You'll Learn In This Article" intro on your site as it is to cut and paste.
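The snippet-plus-intro page suggested above might be sketched like this (all URLs and text are hypothetical placeholders):

```html
<!-- Hypothetical article page: a unique intro plus a short quoted
     snippet, linking out to the full magazine article. -->
<article>
  <h2>What You'll Learn In This Article</h2>
  <p>A unique, 200+ word introduction written for your own site,
     summarizing the key points in your own words...</p>
  <blockquote cite="https://magazine-example.com/full-article/">
    A short excerpt quoted from the original magazine article...
  </blockquote>
  <p><a href="https://magazine-example.com/full-article/">Read the
     full article at the magazine</a></p>
</article>
```

This way the page you're indexing is mostly unique content, with only the quoted snippet shared with the magazine.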
Related Questions
-
Question regarding subdomains and duplicate content
Hey everyone, I have another question regarding duplicate content. We are planning on launching a new sector in our industry to satisfy a niche. Our main site works as a directory of listings with NAP (name, address, phone) data. The new sector we are launching will take all of the content on the main site and duplicate it on a subdomain. We still want the subdomain to rank organically, but I'm torn between putting a rel=canonical back to the main site and using a self-referencing canonical, which leaves me with duplicates. The other idea is to rewrite the content on each listing so that the menu items stay the same but the listing description is different. Do you think that would be enough differentiating content that it won't be seen as a duplicate? Obviously, making this part of the main site would be the best option, but unfortunately we can't do that. Last question: what are the advantages or disadvantages of using a subdomain?
White Hat / Black Hat SEO | | imjonny0 -
Wordpress Category Archives - Index - but will this cause duplication?
Okay, something I am struggling with using Yoast on a recipe blog: the category archives are being optimized and indexed, as I am adding custom content to them and then listing the recipes below. My question is, if I am indexing the category archives and adding custom content above the recipe excerpts listed underneath, will those recipe excerpts be picked up as duplicate content?
White Hat / Black Hat SEO | | Kelly33300 -
Is article syndication still a safe & effective method of link building?
Hello, We have an SEO agency pushing to implement article syndication as a method of link building. They claim to only target industry-relevant, high authority sources. I am very skeptical of this tactic but they are a fairly reputable agency and claim this is safe and works for their other clients. They sent a broadly written (but not trash) article, as well as a short list of places they would syndicate the article on, such as issuu.com and scribd.com. These are high authority sites and I don't believe I've heard of any algo updates targeting them. Regarding linking, they said they usually put them in article descriptions and company bylines, using branded exact and partial matches; so the anchor text contains exact or partial keywords but also contains our brand name. Lately, I have been under the impression that the only "safe" links that have been manually built, such as these, should be either branded or simply your site's URL. Does anyone still use article syndication as a form of link building with success? Do you see any red flags here? Thanks!
White Hat / Black Hat SEO | | David_Veldt0 -
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small scale test on this in our Washington Dc and New York areas with our SharePoint course offerings and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - A LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc, but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: “Our [Topic Area] training is easy to find in the [City, State] area.” As a note, other content such as directions and course dates will always vary from city to city so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. 
So they aren’t technically individual pages, although they seem like that on the web. If we don’t standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as: “Our [Topic Area] training is easy to find in the [City, State] area. Followed by other content specific to the location
White Hat / Black Hat SEO | | CSawatzky
“Find your [Topic Area] training course in [City, State] with ease.” Followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn’t have to be done to maintain custom formats/text for individual pages. So, mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Oftentimes I ask myself "if Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram1
Commenting on blogs articles
Hi All, I have joined a new company and I am supposed to post relevant comments on blog articles. In the comments I want to link to the source, like: www.example.com example.com example Which of the above three formats will give me the maximum backlinking benefit?
White Hat / Black Hat SEO | | TranswebGlobal0 -
How do I report multiple duplicated websites designed to manipulate SERPs?
Ok, so within one of my client's sectors it has become clear that someone is trying to manipulate the SERPs by registering tons of domains that are all keyword targeted. All of the websites are simply duplications of one another and are merely setup to dominate the SERP listings - which, at the moment, it is beginning to do. None of the sites have any real authority (in some cases 1 PA and DA) and yet they're ranking above much more established websites. The only back links they have are from dodgy-looking forum ones. It's all a bit crazy and it shouldn't be happening. Anyway, all of the domains have been registered by the same person and within a two-month time period of each other. What do you guys think is the best step to take to report these particular websites to Google?
White Hat / Black Hat SEO | | Webrevolve0 -
Duplicate content showing on local pages
I have several pages on my web design site which are showing duplicate content. As it's a very competitive market, I created some local pages so I rank well if someone searches locally, i.e. web design Birmingham, web design Tamworth, etc. http://www.cocoonfxmedia.co.uk/web-design.html http://www.cocoonfxmedia.co.uk/web-design-tamworth.html http://www.cocoonfxmedia.co.uk/web-design-lichfield.html I am trying to work out the best way to reduce the duplicate content. What would be the best way to remove it? 1. A 301 redirect (will I lose the existing page?) to my main web design page with the geographic areas mentioned. 2. Rewriting the wording on each page to make it unique? Any assistance is much appreciated.
White Hat / Black Hat SEO | | Cocoonfxmedia0 -
Will cleaning up old pr articles help serps?
For a few years we published articles with anchor text backlinks on about 10 different article submission sites. Each article was modified to create similar but different articles. We have about 50 completely unique articles. This worked really well for our SERPs until the Google Panda and Penguin updates. I am looking for advice on whether I should do a major clean-up of the published articles, and if so, should I be deleting them or removing/renaming the anchor text backlinks? Any advice on what strategy would work best would be appreciated, as I don't want to start deleting backlinks and make things worse. We used to enjoy position 1 but are now at 12-15, so we have lost most of our traffic.
White Hat / Black Hat SEO | | devoted2vintage0