I'm worried my client is asking me to post duplicate content. Am I just being paranoid?
-
Hi SEOMozzers,
I'm building a website for a client that provides photo galleries for travel destinations. As of right now, the website is basically a collection of photo galleries.
My client believes Google might like us a bit more if we had more "text" content.
So my client has been sending me content that is provided free by tourism organizations (tourism organizations will often provide free "one-pagers" about their destination for media).
My concern is that if this content is free, other people have very likely already posted it somewhere on the web. I'm worried Google could penalize us for posting content that already exists elsewhere.
I know there are conventional ways around this (you can tell crawlers that the content shouldn't be crawled), but in my case we are specifically trying to produce crawlable content.
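To be clear, this is the kind of blocking I mean, and it's exactly what we don't want here since we need these pages crawled. A minimal Python sketch using the standard library's robots.txt parser, with placeholder URLs:

```python
# Minimal sketch: check whether a robots.txt rule would stop a crawler from
# fetching a given URL. The domain and path are placeholders, not the
# client's real site.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# If robots.txt contained "Disallow: /press-handouts/", this would print
# False for Googlebot, i.e. the duplicated handout pages would not be crawled.
print(rp.can_fetch("Googlebot", "https://www.example.com/press-handouts/paris.html"))
```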
Do you think I should advise my client to hire some bloggers to produce the content or am I just being paranoid?
Thanks, everyone. This is my first post to the Moz community.
-
I work with a lot of sites that have been affected by Panda, and what you're describing is exactly the kind of thing that has gotten most of those sites flagged.
Your client is right that it's a good idea to have text on those pages. But if the text is not unique, what Google effectively says is, "This page is essentially the same as one that is already in our index. There's no reason to show two identical pages to searchers, so we won't show this one." If enough of your pages are duplicates, the whole site (including the original pages) can be flagged by Panda.
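To make that concrete, here's a rough sketch of how near-duplicate pages can be spotted by comparing overlapping word "shingles". This is only an illustration of the general idea, not Google's actual algorithm:

```python
# Rough illustration of near-duplicate detection with word shingles and
# Jaccard similarity. A simplification for illustration only.

def shingles(text, size=5):
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page_a = "Visit our beaches and explore the historic old town this summer."  # text of page A
page_b = "Visit our beaches and explore the historic old town this summer."  # text of page B

similarity = jaccard(shingles(page_a), shingles(page_b))

# A syndicated one-pager posted verbatim on several sites scores close to 1.0;
# a genuine rewrite usually scores much lower.
print(f"shingle similarity: {similarity:.2f}")
```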
-
Very helpful. I'm moving forward with this advice!
-
As an additional tip, you can use a service like Copyscape to verify whether or not the content has been posted elsewhere online.
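If you want a quick manual check before paying for a service, a common trick is to paste a few distinctive sentences from the handout into Google in quotes. A tiny sketch like this can pull out candidate sentences (the file name is just a placeholder):

```python
# Pull the longest sentences out of a supplied handout so they can be pasted
# into a search engine in quotes; long, specific sentences are unlikely to
# match other pages by coincidence.
import re

def phrases_to_check(text, count=3):
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    return sorted(sentences, key=len, reverse=True)[:count]

handout = open("destination-one-pager.txt", encoding="utf-8").read()
for sentence in phrases_to_check(handout):
    print(f'"{sentence}"')
```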
-
Rewriting definitely sounds scalable for a site this size. Taking the shortcut of reposting scraped content won't work, and I would call it just that when you talk to the client: a "shortcut using scraped content" that Google has caught onto and now suppresses. If the client is skeptical, show him a thread from the official Google forums where they advise against this.
Rewriting the content is easy and requires little hand-holding; just make sure the person doing the writing has strong writing skills and speaks English as a first language, or it will read funny. At the end of the day you are creating content for the user. This is also the perfect opportunity to work a few instances of your keyword phrase into the content, where it probably wasn't present in the copied version!
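If you do go the rewrite route, a quick sanity check along these lines (just a sketch using Python's standard difflib; the file names and threshold are placeholders, not official guidance) can confirm the rewrite isn't still too close to the handout and that the keyword phrase actually made it in:

```python
# Sanity check for a rewrite: compare it to the original handout and confirm
# the target keyword phrase is present. The threshold is an arbitrary
# placeholder, not a published rule.
from difflib import SequenceMatcher

def review_rewrite(original, rewrite, keyword_phrase, max_similarity=0.6):
    similarity = SequenceMatcher(None, original.lower(), rewrite.lower()).ratio()
    return {
        "similarity_to_source": round(similarity, 2),
        "too_close_to_source": similarity > max_similarity,
        "keyword_phrase_present": keyword_phrase.lower() in rewrite.lower(),
    }

original = open("tourism-handout.txt", encoding="utf-8").read()
rewrite = open("rewritten-page.txt", encoding="utf-8").read()
print(review_rewrite(original, rewrite, "Paris photo gallery"))
```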
-
Hi Steven,
Welcome to the community. The ideal approach would be to take the content the client is providing and come up with unique content based on that material: essentially rewriting those pieces and giving them your own flavor. Now, of course, for various reasons (time, budget, resources) that might not be possible. In that case it's best to credit the original source of the content when you add it to the site. More info in the links below:
-
Thanks Irving. It's only 8 pages right now, but my client plans on posting more destinations (and thus more not-so-unique content) in the future.
Rewriting is something I hadn't considered. That may be a more cost-efficient approach. Thanks for the idea!
-
Welcome aboard!
Content needs to be unique, especially if you want to rank.
How many pages are we talking about? I would suggest you get the content rewritten by someone if it's not a ton of pages.