Am I Syndicating Content Correctly?
-
My question is about how to syndicate content correctly. Our site has professionally written content aimed at our readers, not search engines. As a result, several related websites are looking to syndicate our content. I have read Google's duplicate content guidelines (https://support.google.com/webmasters/answer/66359?hl=en), canonical recommendations (https://support.google.com/webmasters/answer/139066?hl=en&ref_topic=2371375), and noindex recommendations (https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag), but I am still a little confused about how to proceed. The pros, in our opinion, are as follows:
1. We can gain exposure to a new audience and help grow our brand.
2. We figure it's also a good way to build up credible links and help our rankings in Google.
Our initial reaction is to have them use a "canonical link" to attribute the content back to us, and also implement a "noindex, follow" tag to help avoid duplicate content issues. Are we doing this correctly, or are we potentially at risk of violating some Google quality guideline?
Thanks!
-
No, you will not receive any increase in your PageRank as a result.
Having said that, if the other website did NOT include the canonical link, there is a chance the link juice for the page would either be split between your site and theirs or, worst case, all be credited to their site (if Google thinks they are the originator)! So indirectly, ensuring that they add the canonical tag will result in your page ranking better.
Hope that makes sense!
Steve
-
Thanks for taking the time to answer my questions. I do have a follow-up, though... With the "canonical" and "noindex, follow" tags in place, will any link juice be transferred?
For example:
Original article is published on www.mysite.com/original-article
Content is syndicated on www.theresite.com/syndicated-content with the following tags in place:
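(The tag markup was stripped from the original post; reconstructed from the description above — a canonical link pointing at the original URL plus a "noindex, follow" robots directive — it would look something like this in the `<head>` of the partner's page:)

```html
<!-- On www.theresite.com/syndicated-content — a reconstruction, not the poster's actual markup -->
<link rel="canonical" href="https://www.mysite.com/original-article">
<meta name="robots" content="noindex, follow">
```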
What I am getting confused about is this: since the syndicated content is not getting indexed, do any link attributes get passed through to my original article? In other words, does the canonical link pass any link juice even though the noindex tag is in place?
-
**However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. **
Yes, but you've got to be really careful. If you fill syndicated content with anchor-text links, you will have a Penguin problem.
** Wondering if this was written before Penguin. ** If I were the boss at Google, we would have a bar of soap used to wash the mouths of Googlers who talk about link building.
-
**Our initial reaction is to have them use a "canonical link" to assign the content back to us, but also implement a "no index, follow" tag to help avoid duplicate content issues. **
This is the way to go. But you must require them to use both the canonical and the noindex. You've got to say, "These are our conditions for your use of our content." If they are good guys, they should have no problem with it. Stick to your guns about this.
My bet is that some will simply rewrite your content.
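If you do make the canonical and noindex tags a condition of syndication, you may want to spot-check partner pages periodically to make sure the tags are actually in place. A minimal sketch in Python (standard library only; the function and class names here are hypothetical, not an existing tool):

```python
# Spot-check a syndication partner's page for the two required tags:
# a rel=canonical pointing back at the original article, and a
# robots meta directive containing "noindex".
from html.parser import HTMLParser


class SyndicationTagParser(HTMLParser):
    """Collects rel=canonical hrefs and robots meta directives."""

    def __init__(self):
        super().__init__()
        self.canonical_hrefs = []
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical_hrefs.append(attrs.get("href", ""))
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "").lower()
            self.robots_directives += [d.strip() for d in content.split(",")]


def check_syndicated_page(html, original_url):
    """Return whether the partner page canonicalises to original_url
    and carries a noindex directive."""
    parser = SyndicationTagParser()
    parser.feed(html)
    return {
        "canonical_ok": original_url in parser.canonical_hrefs,
        "noindex_ok": "noindex" in parser.robots_directives,
    }
```

You would fetch each partner URL yourself (requests, curl, etc.) and feed the HTML to `check_syndicated_page`; anything that comes back False is a partner to chase up.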
-
Hi,
I would stipulate that anyone wishing to re-use your content does so on the condition that they include a canonical link back to your original article... Even if only a few people do this, Google will soon realise that you are the author of the original article and credit you with the associated PageRank.
You should never create content solely for search engines (so you're doing the right thing). Website content should always be about your users, but if you do this correctly, you will also benefit from the traffic the search engines generate!
Hope this helps.
Steve
-
Hi Brad,
Google's official guidance is below:
- Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
You can find this in Google's duplicate content guidelines (the first link in the original question).
Cheers,