Am I Syndicating Content Correctly?
-
My question is about how to syndicate content correctly. Our site has professionally written content aimed at our readers, not search engines. As a result, several related websites are looking to syndicate our content. I have read Google's duplicate content guidelines (https://support.google.com/webmasters/answer/66359?hl=en), canonical recommendations (https://support.google.com/webmasters/answer/139066?hl=en&ref_topic=2371375), and noindex recommendations (https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag), but I am still a little confused about how to proceed. The pros, in our opinion, are as follows:

1. We can gain exposure to a new audience and help grow our brand.
2. We figure it's also a good way to build up credible links and help our rankings in Google.

Our initial reaction is to have them use a canonical link to attribute the content back to us, and to also implement a "noindex, follow" tag to help avoid duplicate content issues. Are we doing this correctly, or are we at risk of violating some sort of Google quality guideline?

Thanks!
-
No, you will not receive any increase in your PageRank as a result.
Having said that, if the other website did NOT include the canonical link, there is a chance the link juice for the page would either be split equally between your site and their site or, worst case, be given entirely to their site (if Google thinks that they are the originator)! So indirectly, ensuring that they add the canonical tag will result in your page ranking better.
Hope that makes sense!
Steve
-
Thanks for taking the time to answer my questions. I do have a follow-up, though... With the "canonical" and "noindex, follow" tags in place, will any link juice be transferred?
For example:
Original article is published on www.mysite.com/original-article
Content is syndicated on www.theirsite.com/syndicated-content with the following tags in place:
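(The actual tags appear to have been stripped from the original post. Assuming the setup described above, a canonical link back to the original plus a "noindex, follow" directive, they would look something like this in the `<head>` of the syndicated page:)

```html
<!-- In the <head> of the syndicated copy of the article -->

<!-- Canonical pointing back to the original article -->
<link rel="canonical" href="http://www.mysite.com/original-article">

<!-- Keep this copy out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```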
What I am getting confused about is this: since the syndicated content is not getting indexed, do any link attributes get passed through to my original article? In other words, does the canonical link pass any link juice even though the noindex tag is in place?
-
**However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article.**
Yes, but you gotta be really careful. If you fill syndicated content with anchor text links, you will have a Penguin problem.
** Wondering if this was written before Penguin. ** If I was the boss at Google we would have a bar of soap used to wash the mouth of Googlers who talk about link building.
-
**Our initial reaction is to have them use a "canonical link" to assign the content back to us, but also implement a "no index, follow" tag to help avoid duplicate content issues. **
This is the way to go. But you must require them to use both the canonical and the noindex. You gotta say, "These are our conditions for your use of our content." If they are good guys, then they should have no problem with it. Stick to your guns about this.
My bet is that some will simply rewrite your content.
-
Hi,
I would stipulate that anyone wishing to re-use your content does so on the condition that they include a canonical link back to your original article... Even if only a few people do this, Google will soon realise that you are the author of the original article and credit you with the associated PageRank.
You should never look to create content solely for search engines (so you're doing the right thing). Website content should always be about your users, but if you do this correctly then you will also benefit from the traffic the search engines generate!
Hope this helps.
Steve
-
Hi Brad,
Google's official version below:
- Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
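(For reference, the noindex meta tag Google mentions is a one-line addition to the `<head>` of the syndicating site's copy of the article; a minimal sketch:)

```html
<!-- On the syndicating site's version of the content -->
<meta name="robots" content="noindex">
```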
You can refer to it at this link.
Cheers,