I'm worried my client is asking me to post duplicate content, am I just being paranoid?
-
Hi SEOMozzers,
I'm building a website for a client that provides photo galleries for travel destinations. As of right now, the website is basically a collection of photo galleries.
My client believes Google might like us a bit more if we had more "text" content.
So my client has been sending me content that is provided free by tourism organizations (tourism organizations will often provide free "one-pagers" about their destination for media).
My concern is that if this content is free, other people have most likely already posted it somewhere on the web. I'm worried Google could penalize us for posting content that already exists.
I know that conventionally there are ways around this -- you can tell crawlers that this content shouldn't be indexed -- but in my case, we are specifically trying to produce crawlable content.
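(For anyone unfamiliar, the kind of crawler directive I mean looks roughly like this; the page and path here are just placeholders, not our actual site:)

```html
<!-- Per-page: ask search engines not to index this page.
     Note: the page must still be crawlable for engines to see this tag. -->
<meta name="robots" content="noindex">
```

The alternative is a `Disallow` rule in robots.txt, which blocks crawling of a path entirely -- but either way, that's the opposite of what we want here.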
Do you think I should advise my client to hire some bloggers to produce the content or am I just being paranoid?
Thanks everyone. This is my first post to the Moz community
-
I work with a lot of sites that have been affected by Panda, and what you are describing is exactly the kind of thing that has gotten most of those sites flagged by Panda.
Your client is right that it is a good idea to have text on the pages. But if the text is not unique, what Google does is say, "This page is essentially the same as one that is already in our index. There's no reason to show two identical pages to searchers, so we won't show this one." If enough of your pages are duplicates, the whole site (including the original pages) can be flagged by Panda.
-
Very helpful- I'm moving forward with this advice!
-
As an additional tip, you can use a service like Copyscape to verify whether or not the content has been posted elsewhere online.
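If you want a quick in-house sanity check before running everything through a paid service, a rough duplicate check can be sketched with word shingles and Jaccard similarity. This is just an illustration of the idea, not how Copyscape actually works, and the threshold you pick is a judgment call:

```python
import re

def shingles(text, k=5):
    """Lowercase, strip punctuation, and return the set of k-word shingles."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def similarity(a, b, k=5):
    """Jaccard similarity between the shingle sets of two texts (0.0 to 1.0).

    Near 1.0 means near-duplicate; near 0.0 means mostly unique.
    """
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Running `similarity()` on your page text versus the tourism board's original one-pager will give you a rough sense of how much rewriting actually changed.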
-
Rewriting definitely sounds scalable for a site this size. Taking the shortcut of scraped content won't work, and I would call it exactly that when you talk to the client: a "shortcut using scraped content" that Google has caught onto and suppressed. If the client is skeptical, show him a link to the official Google forum threads where they warn against this.
Rewriting the content is easy and requires little hand-holding; just make sure the person doing the writing has good writing skills and speaks English as a first language, or it will read funky -- at the end of the day you are creating content for the user. This is also the perfect opportunity to work a few instances of your keyword phrase into the content where it probably wasn't present in the copied version!
-
Hi Steven,
Welcome to the community. The ideal response to your question would be to take the content the client is providing and come up with unique content based on that material -- essentially rewriting those pieces and giving them your own flavor. Of course, for various reasons (time, budget, resources) that might not be possible. In that case, it's best to give credit to the original source when you add the content to the site. More info in the links below:
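If you do republish with permission, one option on top of a visible credit line is a cross-domain canonical tag pointing at the original article, which tells Google which version to treat as the source. A minimal sketch (the URL here is a placeholder, not a real tourism board):

```html
<!-- In the <head> of the republished page -->
<link rel="canonical" href="https://example-tourism-board.org/destination-one-pager">
```

Keep in mind this also tells Google to consolidate ranking signals to the original, so it only makes sense for pages you don't need to rank themselves.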
-
Thanks Irving. It's only 8 pages right now, but my client plans on posting more destinations (and thus more not-so-unique content) in the future.
Rewriting is something I hadn't considered. That may be a more cost-efficient idea. Thanks for the suggestion!
-
Welcome aboard!
Content needs to be unique especially if you want to rank.
How many pages are we talking about? If it's not a ton of pages, I would suggest getting the content rewritten.