Is this duplicate content?
-
I have an e-commerce website and a separate blog hosted on different domains.
I post an article on the blog weekly. I copy the first paragraph of the article (sometimes only part of it, when it's too long) to my home page and a sub-catalog page, then append a "...more" anchor linking to the full article.
1. Will Google treat that digest (the first paragraph) on my e-commerce site as duplicate content? Any suggestions?
2. If, in the future, I move the blog under the e-commerce website, would that make any difference with regard to this issue?
Thanks for your help!
-
While the blog format of showing a snippet with "read more" links is commonplace and perfectly acceptable, I would argue that it DOES matter whether the blog is on your domain or not. It would be better to have it on your domain. You don't want to seem like you're trying to beef up the content on your product and category pages by simply pulling in someone else's RSS feed snippets.
The longer you wait to move your blog over to your main domain the more difficult it will be. I would do this sooner rather than later.
-
Just as an afterthought, a slightly related issue... linking between your related blog and your ecomm sites shouldn't be any concern, unless you're dealing with several sites ("several", as in a LOT) http://www.youtube.com/watch?v=x0-jw_PfwtY
-
1. I agree with Simon, and Matt C. did one of his videos in May of 2011 where he stated that excerpts linking to the full post are perfectly acceptable. He also said he does it on his own blog (and he still does).
2. I can't see that it would make any difference at all.
-
I would suggest that this isn't really an issue so long as you only place a small snippet (such as the first paragraph) of the post on your e-commerce site. Just think how many blogs work - they will have a main blog page which usually contains the first few lines of each post with a link to take you to view each of the individual posts.
Take a look at Rand's blog for example http://moz.com/rand/
Here you will find snippets of each blog post, with a link to view the entire post on its own URL. This isn't really considered duplicate content, because only a small portion of each blog post is present on the main blog page itself. The same applies in your case, even though the pages are on different domains.
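The excerpt pattern discussed above (first paragraph plus a "...more" anchor back to the full post) can be sketched as follows. The function name, URLs, and length cutoff are hypothetical illustrations, not anything Moz or Google prescribes:

```python
import html

def make_excerpt(first_paragraph: str, article_url: str, max_len: int = 300) -> str:
    """Build a short excerpt that links back to the full article."""
    text = first_paragraph.strip()
    if len(text) > max_len:
        # Cut at the last word boundary so no word is split mid-way.
        text = text[:max_len].rsplit(" ", 1)[0]
    # Escape the copy and the URL so the snippet is safe to embed.
    return '<p>{} <a href="{}">...more</a></p>'.format(
        html.escape(text), html.escape(article_url)
    )

print(make_excerpt(
    "We post weekly tips on product photography for small shops.",
    "https://blog.example.com/product-photography",
))
```

Keeping the excerpt short relative to the full article is the whole point: the bulk of the text lives only at the post's own URL.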
Related Questions
-
Impact of wiping content on a subdomain
Hi, I've been asked to look at the impact of bulk deleting content on a blog subdomain and how it could impact the SEO of a linked www subdomain. Can deleting content on one subdomain have a negative impact on other linked subdomains? Thanks
-
Duplicate categories: how to make sure I don't get penalised for this
Hi there, how would I go about fixing duplicate categories? My products sell in multiple category areas and some overlap the others. How can I make sure I don't get penalised for this? Each category and its content is unique, but my advisors offer different tools and insights.
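One common remedy when the same product or category content is reachable at more than one URL is a rel="canonical" tag on the duplicates pointing at the preferred path. The URLs below are hypothetical placeholders, not taken from the question:

```html
<!-- On the duplicate path, e.g. /garden/pruning-shears, point at the preferred one -->
<link rel="canonical" href="https://www.example.com/tools/pruning-shears">
```

This consolidates ranking signals onto one URL instead of leaving Google to choose among the overlapping category paths.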
-
Can I use content from an existing site that is not up anymore?
I want to take down a current website and create a new site or two (with new url, ip, server). Can I use the content from the deleted site on the new sites since I own it? How will Google see that?
-
Indexing content behind a login
Hi, I manage a website within the pharmaceutical industry where only healthcare professionals are allowed to access the content, so most of it sits behind a login. My challenge is that we have a massive amount of interesting and unique content available on the site, and I want healthcare professionals to find it via Google! At the moment, if a user tries to access this content they are prompted to register / log in. My question is: if I look for the Googlebot user agent and allow it to access and index the content, will this be classed as cloaking? I'm assuming it will. If so, how can I get around this? We have a number of open landing pages, but we're limited in what indexable content we can have on them! I look forward to all of your suggestions, as I'm struggling for ideas now! Thanks Steve
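For reference, the approach Google documents for content behind a login or paywall is structured-data markup rather than user-agent detection (which, as the asker suspects, risks being classed as cloaking): the gated text is served to Googlebot in full, and the markup declares it restricted for ordinary visitors. A sketch, with the headline and CSS class as hypothetical placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Clinical dosing guidance",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".gated-content"
  }
}
</script>
```

The `cssSelector` tells Google which part of the page is gated, which is what distinguishes this setup from serving crawlers different content outright.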
-
Separate Servers for Humans vs. Bots with Same Content Considered Cloaking?
Hi, We are considering using separate servers for when a Bot vs. a Human lands on our site to prevent overloading our servers. Just wondering if this is considered cloaking if the content remains exactly the same to both the Bot & Human, but on different servers. And if this isn't considered cloaking, will this affect the way our site is crawled? Or hurt rankings? Thanks
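Routing by user agent while serving identical bytes is usually done at the proxy layer. A hypothetical nginx sketch of the setup described, with invented upstream addresses:

```nginx
# Send known crawlers to a dedicated pool; both pools serve identical content.
map $http_user_agent $pool {
    default                       human_pool;
    ~*(googlebot|bingbot|slurp)   bot_pool;
}

upstream human_pool { server 10.0.0.10; }
upstream bot_pool   { server 10.0.0.20; }

server {
    listen 80;
    location / {
        proxy_pass http://$pool;
    }
}
```

The content being byte-identical across both pools is what keeps this on the safe side of cloaking; any divergence between them reintroduces the risk.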
-
Creating a duplicate site for testing purposes: can it hurt the original site?
Hello, we are soon going to upgrade the CMS to the latest version along with new functionality; the process may take four to six weeks. Since we need to work on a live server, the plan is: take an exact replica of the site and move it to a test domain, still on a live server; block Google, Bing, and Yahoo in robots.txt (User-agent: Google / Disallow: /, User-agent: Bing / Disallow: /, User-agent: Yahoo / Disallow: /); upgrade the CMS, add the functionality, and test the entire structure, checking URLs with Screaming Frog or Xenu; then move the configuration onto the original domain. The upgrade and new tools may take 1 to 1.5 months. My concern is that, despite blocking Google, Bing, and Yahoo through user-agent disallows, the URLs could still be crawled by the search engines. If so, it may hurt the original site, since the test domain will read as an entire duplicate. Or is there an alternative way around this? Many thanks
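One note on the robots.txt directives in the question: the tokens crawlers actually match on are Googlebot, Bingbot, and Slurp (Yahoo), not "Google", "Bing", and "Yahoo", so a blanket disallow is the safer form for a test domain:

```text
# robots.txt on the hypothetical test domain: block every crawler.
# A blanket rule also sidesteps misspelled user-agent tokens.
User-agent: *
Disallow: /
```

Also worth knowing: robots.txt blocks crawling, not indexing, so URLs Google discovers via links can still appear in results. HTTP authentication on the test domain is the more reliable fence.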
-
Duplicate Content
Hi, I have a website with over 500 pages. The website is a home-service website that serves clients in different areas of the UK. My question is: am I able to take the pages down from my URL and leave them down for, say, a week, so that when Google's bots crawl them they do not exist? Can I then re-upload them to a different website URL without Google penalising me for duplicate content? I know I would have lost juice and PageRank, but that doesn't really matter, because the site has taken a knock since the Google update. Thanks for your help. Chris
-
Multiple domains with same content?
I have multiple websites with the same content, such as http://www.example.com and http://www.example.org and so on. My primary URL is http://www.infoniagara.com, and I also placed a 301 on the .org. Is that enough to keep my example.org site from being indexed by Google and other search engines? The example.org site also has lots of links to my old HTML pages (now removed). Should I change those links too, or will the 301 redirection solve all such issues (page not found / crawl errors) for my old web pages? I would welcome good SEO practices regarding maintaining multiple domains. Thanks and regards
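A domain-wide 301 of the kind described is typically a single server rule, which also catches links pointing at the old, removed HTML pages. A hypothetical Apache mod_rewrite sketch, with the secondary domain invented:

```apache
# On the secondary (.org) host: 301 every request, preserving the path,
# to the same URL on the primary domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.org$ [NC]
RewriteRule ^(.*)$ http://www.infoniagara.com/$1 [R=301,L]
```

Note that paths that no longer exist on the primary site will still 404 after the redirect, so old-page URLs are best mapped to their closest live replacements rather than left to the blanket rule.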