How do you properly handle syndicated content?
-
The same piece of content is pulled in and presented (syndicated) within a frame on different websites (owned by the same company). However, I would like only one website to rank in Google's search results for that content. How do I set this up?
Thanks,
Claudia
-
I can't imagine myself running that many sites with the same content. My approach would be to make one big asskicking site that would beat all of the competitors in the various markets.
But if you are going to have multiple copies out there, I would write the "Bible" on the topic and place it on my main site, and the other sites would carry summary articles that promote the Bible on the main site. I would use rel="canonical" on the smaller sites, as described by Ryan.
-
How many websites are we talking about?
-
4-8 sites depending on the content.
-
Hi Claudia.
You have a couple of options. On the sites you do not wish to rank, you can add a "noindex, follow" robots meta tag to the page, or you can add a canonical tag to the page pointing back to the original content.
The canonical tag belongs in the <head> section of the HTML on your page:
<link rel="canonical" href="http://www.mysite.com/original-article-link" />
You mentioned the content is presented "in a frame". Keep in mind that web crawlers generally do not index iframe content as part of the embedding page; the framed URL is treated as a separate document, so the framed copies may not be causing a duplicate-content problem in the first place.
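To illustrate the framing point, here is a hypothetical embed on one of the secondary sites (the URL reuses the placeholder from the canonical example above). Crawlers attribute the framed article to its own URL, not to the page that embeds it:

```html
<!-- Hypothetical syndication snippet on a secondary site.
     The framed article is indexed (if at all) under its own URL,
     not as part of this page. -->
<iframe src="http://www.mysite.com/original-article-link"
        title="Syndicated article"></iframe>

<!-- A plain link gives crawlers (and users without frames)
     a followable path to the original. -->
<p><a href="http://www.mysite.com/original-article-link">Read the full article</a></p>
```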