Is Syndicated (Duplicate) Content considered Fresh Content?
-
Hi all,
I've been asking quite a few questions lately and sincerely appreciate your feedback. My co-workers and I have been discussing content as an avenue outside of SEO. There are a lot of syndicated content programs/plugins out there (in many cases duplicate) - would this be considered fresh content on an individual domain?
An example may clearly show what I'm after:
domain1.com is a lawyer in Seattle.
domain2.com is a lawyer in New York. Both need content on their websites relating to being a lawyer so Google can understand what each domain is about. Fresh content is also a factor within Google's algorithm (source: http://moz.com/blog/google-fresh-factor). Therefore, fresh content is needed on their domains. But if that content is duplicate, does it still hold the same value?
Question: Is fresh content (adding new / updating existing content) still considered "fresh" even if it's duplicate (across multiple domains).
Purpose: domain1.com may benefit from a resource for his/her local clientele, as would domain2.com. And both sets of customers would be reading the "duplicate content" for the first time. Therefore, both lawyers would be seen as authorities and improve their websites' ability to rank well.
We weren't interested in ranking the individual article and are aware of canonical URLs. We aren't implementing this as a strategy - just as a means to really understand content marketing outside of SEO.
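Since canonical URLs came up: for anyone following along, a syndicated copy can point back at the original article with a cross-domain `rel="canonical"` link element. A minimal sketch, assuming a hypothetical article URL on domain1.com (Google treats cross-domain canonicals as a hint, not a directive):

```html
<!-- Placed in the <head> of the syndicated copy on domain2.com,
     pointing at the original article on domain1.com. -->
<head>
  <link rel="canonical" href="http://domain1.com/articles/hiring-a-lawyer" />
</head>
```

This consolidates ranking signals to the original copy rather than leaving Google to pick one version of the duplicate on its own.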
Conclusion: IF duplicate content is still considered fresh content on an individual domain, then couldn't duplicate content (which obviously won't rank) still help SEO across a domain? This may sound controversial, and I'd welcome an open-ended discussion with linked sources / case studies. This conversation may tie into another Q&A I posted: http://moz.com/community/q/does-duplicate-content-actually-penalize-a-domain.
TLDR version: Is duplicate content (same article across multiple domains) considered fresh content on an individual domain?
Thanks so much,
Cole
-
Hi all,
Thanks for the responses & feedback.
Alan, in this example, the fresh content would be relevant. Of course there are search queries that don't need freshness or updates, but I would argue most do need updates / freshness (even the ones we think we know the answer to over time). Once again, the conversation is not about RANKING that page but about HELPING the domain achieve "freshness & relevance" around a topic with that duplicate content.
Would love to see others chime in.
Thanks,
Cole
-
Well that could mean that some don't need any.
Like
Q. Who discovered Australia? A. Captain Cook.
This does not need freshness. Also consider original content - in that case, an older timestamp would be better.
I like to think that I own Google and ask myself: would I rank it? Of course some things may rank that were not intended to, but I think it's quite safe to think that way.
-
This was the part that triggered me:
"Google Fellow Amit Singhal explains that "Different searches have different freshness needs."
The implication is that Google measures all of your documents for freshness, then scores each page according to the type of search query."
-
Had a quick look at that page; I did not see that it affects all pages. Anyhow, Google said 35% of queries, so it could not be all pages.
Some points:
- Why would fresh data be excluded from duplicate content?
- Is it likely that syndicated data is fresh?
- What is Google trying to do here - rank syndicated duplicate data?
I can't see it working.
-
Thanks a lot! Kinda made me realize I really should read more about this update. Might be off topic, but what's your view on freshness applied to **all** pages? In this Whiteboard Friday it's stated that it only impacts the terms you describe:
http://moz.com/blog/googles-freshness-update-whiteboard-friday
But in this blog post from that time (before the summary), it's stated that it's applied to all pages but affects search queries in different ways:
-
Yes, the freshness update was not for all queries. It was for certain queries that need fresh content, such as football scores or who's on the team this week - obviously we don't want the score from last year or who was playing last year, we want the current data. That is where the freshness update may give you a boost while your content is fresh. I can't see syndicated content falling into this category, and even if it did, being duplicate content would mean that only one source is going to rank.
Also, you have to look at indexing: will the duplicate content even be indexed? If so, how often?
That's why I say the short answer is no.
-
Hi Alan,
Is there any source / own research that can back up this answer?
Would love to read more about this subject!
-
Short answer, NO
-
Thanks for your feedback Mike - definitely helpful!
In this hypothetical, we're looking at research or comprehensive articles for specific niches that could serve multiple businesses well as an authority.
Thanks,
Cole
-
Hi Cole,
Fresh by Google (if not noindexed) in this case would be kind of like the freshness value of a "fresh" error.
Maybe that's extreme, but the point is, the content is not needed by the web, since it already exists. If there were absolutely nothing else being added to or changed about the site and my one option was adding duplicate content, I'd noindex/follow it and figure I might have gotten some small, small, small benefit from updating the site a little - maybe an improved user signal. I'd keep it out of the index for sure. That's how I'd do it, if it had some value for visitors. If its only value was adding something fresh, and it wasn't that great for visitors, I'd find the extra hour necessary to rewrite it into something fresh, unique, and valued by visitors.
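For anyone unfamiliar, the noindex/follow approach mentioned above is just a robots meta tag on the duplicate page - a minimal sketch (the page it sits on is hypothetical):

```html
<!-- Placed in the <head> of the syndicated/duplicate page:
     keeps the page out of Google's index while still letting
     crawlers follow and pass signals through its links. -->
<head>
  <meta name="robots" content="noindex, follow" />
</head>
```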
The other thing about syndicated content is that even after you check where else it appears on the web via an exact-phrase search in Google, you may not have seen every instance of it, as it may continue to spread. Having duplicate content indexed alongside other, possibly low-quality sites that share the same content may put you in a bad neighborhood. If I had a ten-foot pole, I wouldn't touch it with it.
I hope that helps. Best... Mike
-
Hi Mike,
Thanks for the feedback. That was one potential point I was making.
I'm still curious whether duplicate content would be considered "fresh" within a website. Good point about the duplicate content overriding the benefit of fresh content.
Thanks,
Cole
-
In phrasing the question as "is it considered fresh/unique," I'm going to assume you mean by Google, for the site's organic benefit. So I guess the reasoning would be: is the fact that it's fresh to the site a bigger positive than the negative of duplicate content? Is that what you're getting at? Personally, knowingly on-boarding duplicate content would be too big of a potential negative for me to consider doing it. I've done it as a noindex/follow for reasons other than Google, but not for some mystery freshness bump.
Not that you can't find examples of duplicate content ranking in more than one place. To me on-boarding indexed duplicate content seems like just asking for trouble.
Hope that helps. Best... Mike
-
I'm curious to see what others have to say on this, but I've always assumed that "fresh" and "unique" go hand in hand when it comes to website content. Therefore, duplicate content would not be fresh content.