Is this duplicate content?
-
I have an e-commerce website and a separate blog hosted on different domains.
I post an article on my blog weekly, then copy its first paragraph (sometimes only part of it, when it runs long) to my home page and a sub-catalog page, appending a "...more" anchor that links back to the full article.
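Roughly, each excerpt block looks like this (the domain, URL, and class name are placeholders):

    <!-- Home page excerpt: only the first paragraph is copied, and the
         "...more" anchor points at the full article on the blog domain. -->
    <div class="blog-excerpt">
      <p>
        First paragraph of this week's article, shortened when it runs long...
        <a href="http://blog.example.com/2013/05/weekly-article/">...more</a>
      </p>
    </div>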
1. Would Google treat that digest (the first paragraph) on my e-commerce site as duplicate content? Any suggestions?
2. If I move the blog under the e-commerce website in the future, would that make any difference with regard to this issue?
Thanks for your help!
-
While the blog format of showing a snippet with "read more" links is commonplace and perfectly acceptable, I would argue that it DOES matter whether the blog is on your domain or not. It would be better to have it on your domain. You don't want to seem like you're trying to beef up the content on your product and category pages by simply pulling in someone else's RSS feed snippets.
The longer you wait to move your blog over to your main domain the more difficult it will be. I would do this sooner rather than later.
-
Just as an afterthought on a slightly related issue: linking between your related blog and your e-commerce site shouldn't be a concern, unless you're dealing with several sites ("several", as in a LOT): http://www.youtube.com/watch?v=x0-jw_PfwtY
-
1. I agree with Simon. Matt Cutts covered this in one of his videos in May 2011, where he stated that excerpts linking to the entire post are perfectly acceptable. He also said he does it on his own blog (and he still does).
2. I can't see that it would make any difference at all.
-
I would suggest that this isn't really an issue, so long as you only place a small snippet (such as the first paragraph) of each post on your e-commerce site. Just think about how most blogs work: they have a main blog page which usually contains the first few lines of each post, with a link to view each individual post.
Take a look at Rand's blog, for example: http://moz.com/rand/
There you will find snippets of each blog post, each with a link to view the entire post on its own URL. This isn't really considered duplicate content, because only a small portion of each post is present on the main blog page itself. The same applies in your case, even though your pages are on different domains.
Related Questions
-
Do we get de-indexed for changing some content and tags frequently? What is the scope in 2017?
Hi all, We are making some changes to our website content in certain paragraphs and tags that carry our main keywords. I'm just wondering if this is going to get us de-indexed from Google. We recently dropped in rankings when we added some new content, so I am worried it could get riskier if we make any more changes to the content. There are actually many reasons a website gets de-indexed from Google, but we don't employ any such black-hat techniques, and our website has a solid reputation with thousands of visits from direct traffic and organic search. Still, I am curious: what are the chances of getting de-indexed, given the current trends at Google? Thanks
White Hat / Black Hat SEO | vtmoz
-
Competitor ranking well with duplicate content—what are my options?
A competitor is ranking #1 and #3 for a search term (see attached) by publishing two separate sites with the same content. They've modified the title of the page and serve it in a different design, but are using their branded domain and a keyword-rich domain to gain multiple rankings. This has been going on for years, and I've always told myself that Google would eventually catch it with an algorithm update, but that doesn't seem to be happening. Does anyone know of other options? It doesn't seem like this falls under any of the categories that Google lists on their web spam report page. Is there any other way to bring this up with the powers that be, or is it something that I just have to live with and hope that Google figures out some day? Any advice would help. Thanks!
[Attached screenshot: how_to_become_a_home_inspector_-_Google_Search_2015-01-15_18-45-06.jpg]
White Hat / Black Hat SEO | inxilpro
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).

We are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington, DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.

So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically:

"Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the meta and paragraph. For example, we could make the paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build it based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes, for all cities, for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as:

'Our [Topic Area] training is easy to find in the [City, State] area.' followed by other content specific to the location
'Find your [Topic Area] training course in [City, State] with ease.' followed by other content specific to the location

Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."

So, mozzers, my question to you all is: can we standardize, with slight variations specific to each location and topic area, without getting dinged for spam or duplicate content? Oftentimes I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
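To make that concrete, here is a rough sketch of what the engineer is proposing, assuming a Jinja-style template (the variable names topic, city, state, and venue_id are made up). I've shown the variant being picked from the venue code rather than randomized on every load, so each page renders the same text consistently:

    {# Hypothetical template: topic, city, state, and venue_id are
       resolved from the venue code passed to the page. #}
    {% set variants = [
        "Our " ~ topic ~ " training is easy to find in the " ~ city ~ ", " ~ state ~ " area.",
        "Find your " ~ topic ~ " training course in " ~ city ~ ", " ~ state ~ " with ease."
    ] %}
    <p>{{ variants[venue_id % 2] }}</p>
    {# Location-specific content (directions, course dates) follows here,
       which keeps each page from being a pure boilerplate clone. #}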
White Hat / Black Hat SEO | CSawatzky
-
Rel author and duplicate content
I have a question: on a site where I am the only author, will my site have duplicate content between the blog posts and the author page, since they show the same content? What is your suggestion in that case? Thanks
White Hat / Black Hat SEO | maestrosonrisas
-
What happens when content on your website (and blog) is an exact match to multiple sites?
In general, I understand that having duplicate content on your website is a bad thing. But I see a lot of small businesses (specifically dentists, in this example) who hire the same company to provide content for their sites. They end up with the EXACT same content as other dentists. Here is a good example:
http://www.hodnettortho.com/blog/2013/02/valentine’s-day-and-your-teeth-2/
http://www.braces2000.com/blog/2013/02/valentine’s-day-and-your-teeth-2/
http://www.gentledentalak.com/blog/2013/02/valentine’s-day-and-your-teeth/
If you Google the title of that blog article, you find tons of copies of the same article all over the place. So, overall, doesn't this make the content on these blogs irrelevant? Does this hurt the SEO of these sites at all? What is the value of having completely unique content on your site/blog vs. having duplicate content like this?
White Hat / Black Hat SEO | MorganPorter
-
Links via scraped / cloned content
I've just been looking at some backlinks on a site; a good proportion of them come via scraped Wikipedia pages or sites with directories similar to DMOZ's (just under different names). To be honest, many of these sites look pretty dodgy to me, and if they're doing illegal stuff there's absolutely no way I'll be able to get the links removed. Should I just sit and watch the backlinks from these questionable sources increase, report the sites to Google, or do something else? Advice please.
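For what it's worth, if "do something else" ends up meaning Google's disavow tool, my understanding is that the upload is just a plain-text file along these lines (the domains below are placeholders):

    # Hypothetical disavow file; lines starting with "#" are comments.
    # Disavow everything from these scraper/clone domains:
    domain:wikipedia-scraper.example
    domain:dmoz-clone-directory.example
    # Individual URLs can be listed as well:
    http://dodgy-directory.example/links/page1.html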
White Hat / Black Hat SEO | McTaggart
-
Duplicate Articles
We submit articles to a magazine, where they get posted either as text or in a Flash container. Management would like to post them to our site as well. I'm sure this has been asked a million times, but is this a bad thing to do? Do I need to add a rel=canonical tag to the articles? Most of the articles posted to that other site do not contain a link back to our site.
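For reference, here is the tag I have in mind. It would go in the <head> of the duplicate copy, pointing at whichever version should be treated as the original (the URL below is a placeholder):

    <!-- Cross-domain canonical: placed on the duplicate, pointing at the
         preferred original. -->
    <link rel="canonical" href="http://www.magazine-site.example/articles/our-article" />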
White Hat / Black Hat SEO | SirSud
-
Why doesn't Google catch different domains with the same content?
I have been slowly working to remove near-duplicate content from my own website for different locales. Google seems to be doing nothing to combat the duplicate content of one of my competitors, which shows up all over Southern California. For example, here are two of their search listings:

Your Local #1 Rancho Bernardo Pest Control Experts | 858-352-7728
www.pestcontrolranchobernardo.com
"Pest Control Rancho Bernardo Pros specializes in the eradication of all household pests including ants, roaches, etc. Call Today @ 858-352-7728."

Your Local #1 Oceanside Pest Control Experts | 760-486-2807
www.pestcontrol-oceanside.info
"Pest Control Oceanside Pros specializes in the eradication of all household pests including ants, roaches, etc. Call Today @ 760-486-2807."

The competitor is getting high page-1 listings for massively duplicated content across web domains. Will Google ever catch this black-hat workmanship? Meanwhile, he's sucking up my business. Does the competitor's success also speak to the possibility that Google does in fact rank based on the name of the URL, something that gets debated all the time? Thanks for your insights. Gerry

White Hat / Black Hat SEO | GerryWeitz