I'm worried my client is asking me to post duplicate content, am I just being paranoid?
-
Hi SEOMozzers,
I'm building a website for a client that provides photo galleries for travel destinations. As of right now, the website is basically a collection of photo galleries.
My client believes Google might like us a bit more if we had more "text" content.
So my client has been sending me content provided free by tourism organizations (they often provide free "one-pagers" about their destinations for the media).
My concern is that if this content is free, other people have likely already posted it somewhere on the web. I'm worried Google could penalize us for posting content that already exists elsewhere.
I know there are conventional ways around this -- you can tell crawlers that a page shouldn't be indexed -- but in my case, we are specifically trying to produce crawlable content.
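For reference, the crawler-control mechanism alluded to above is typically the robots meta tag (or the equivalent `X-Robots-Tag` HTTP header), which keeps a page out of the index; a minimal fragment:

```html
<!-- Placed in a page's <head>: the page can still be crawled,
     but will be kept out of the search index.
     (The same directive can be sent as an "X-Robots-Tag: noindex"
     HTTP header for non-HTML files.) -->
<meta name="robots" content="noindex">
```

As the original poster notes, this is the opposite of what's wanted here, since the goal is indexable content.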
Do you think I should advise my client to hire some bloggers to produce the content or am I just being paranoid?
Thanks everyone. This is my first post to the Moz community
-
I work with a lot of sites that have been affected by Panda, and what you're describing is exactly the kind of thing that has gotten most of those sites flagged.
Your client is right that it's a good idea to have text on the pages. But if the text isn't unique, Google essentially says, "This page is the same as one already in our index. There's no reason to show searchers two identical pages, so we won't show this one." If enough of your pages are duplicates, the whole site (including the original pages) can be flagged by Panda.
-
Very helpful! I'm moving forward with this advice.
-
As an additional tip, you can use a service like Copyscape to check whether the content has already been posted elsewhere online.
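As a rough local supplement to a service like Copyscape, you could compare the supplied copy against text found on other sites. A minimal sketch using Python's standard-library difflib (the sample passages are made up for illustration):

```python
import difflib

def similarity_ratio(text_a: str, text_b: str) -> float:
    """Rough word-level similarity score between two passages (0.0 to 1.0)."""
    return difflib.SequenceMatcher(None, text_a.split(), text_b.split()).ratio()

# Hypothetical examples: a tourism one-pager, a verbatim copy, and a rewrite.
original = "Visit our beautiful coastal town with sandy beaches and historic lighthouses."
copied = "Visit our beautiful coastal town with sandy beaches and historic lighthouses."
rewritten = "Our seaside village offers golden sand, heritage lighthouses, and quiet coves."

print(similarity_ratio(original, copied))     # identical text scores 1.0
print(similarity_ratio(original, rewritten))  # a genuine rewrite scores far lower
```

This only catches near-verbatim overlap between two texts you already have in hand; it's no substitute for a web-wide check, but it's a quick way to verify that a rewrite is actually substantively different.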
-
That definitely sounds scalable for this site. Taking the shortcut with scraped content, on the other hand, won't work. I would call it exactly that when you talk to the client: a "shortcut using scraped content" that Google has caught onto and now suppresses. If the client is skeptical, show him a link to the official Google forum where Google advises against this.
Rewriting the content is easy and requires little hand-holding. Just make sure the person doing the writing has good writing skills and speaks English as a first language, or the copy will read awkwardly -- at the end of the day, you are creating content for the user. This is also the perfect opportunity to work a few instances of your keyword phrase into the content where it probably wasn't present in the copied version!
-
Hi Steven,
Welcome to the community. The ideal response to your question would be to take the content the client is providing and produce unique content based on it -- essentially rewriting those pieces and giving them your own flavor. Of course, for various reasons (time, budget, resources) that might not be possible. In that case, it's best to give credit to the original source when you add the content to the site. More info in the links below:
-
Thanks Irving. It's only 8 pages right now, but my client plans on posting more destinations (and thus more not-so-unique content) in the future.
Rewriting is something I hadn't considered. That may be a more cost-efficient idea. Thanks for the suggestion!
-
Welcome aboard!
Content needs to be unique, especially if you want to rank.
How many pages are we talking about? If it's not a ton of pages, I would suggest getting the content rewritten.