Content - Similar but not exactly the same content - Duplicate or Spammy?
-
Hey, so I have been wondering about this for some time now, as some pages will get indexed and others won't appear at all. That makes me think that I am either creating content that is too similar, or that it is becoming too spammy.
Take these two pages I created, for example. The body content is very similar, but the h tags, meta tags, and title are different. So my question is: could these pages not be displaying because they are too similar and spammy, or duplicate?
I have linked two pages that are very similar below and would love to hear any thoughts about it.
Any feedback would be greatly appreciated. Thanks in advance.
-
Lots of people grab content and republish it.
Lots of people grab the same content and republish it.
The first few people who do it have the best chance of getting away with it. But if you are the tenth or the twentieth, then you are more likely to be ignored by Google. (After you republish this duplicate content, Google might find it, index it, and rank it... then some months down the road they realize that your stuff is duplicate and take action against it.)
The exception to the above is when you are a powerful publisher. Then you can get away with a lot more than other republishers, and you might even outrank the original source.
"If you've realized that your local industry is riddled with poor quality content, see this as your opportunity to beat out lazier competitors. If you deliver the superior experience, it may give you a very valuable edge, while also safeguarding your reputation and rankings against Google filters and penalties in future."
This is so true and so surprising. There are still a lot of topics on the internet that are not covered by substantive, high quality content written by an authoritative author.
-
Thanks for coming back with further questions on this. Unfortunately, changing tags and maps doesn't make your text content "different enough" to lessen the concern that Google may view what you are doing as duplicative.
Basically, your business model is a local business with a single location. If you wish to gain organic visibility for your services beyond that single location, developing landing pages for those other locations is indeed a best practice ... provided that their content is actually useful and largely unique. You are in a similar scenario to a plumber who has a single location from which he offers a set of services to a variety of neighboring towns. His Local SEO is going to be anchored to his city of location, but his organic SEO can branch out to represent his work in the other towns he serves. There is nothing spammy about him featuring this work in his service cities, but unless he has something unique to say about his work there, he's going to end up with a weak site burdened with duplicate content clearly designed for search engines instead of for the assistance of consumers.
I recommend taking a look at a blog post I wrote here a couple of years ago that offers tips for creating strong, diversified landing pages in a scenario much like yours:
https://moz.com/blog/overcoming-your-fear-of-local-landing-pages
You will need to dig deep into your resources to create this type of useful, unique content.
As for your competitors, your question is reasonable. If Google doesn't like thin, duplicate content, why do we see people getting away with it? To this, my answer is:
-
People were getting away with all kinds of things the day before an update like Penguin or Panda. They woke up the following day to a changed world in which their lack of effort was no longer being rewarded.
-
If you've realized that your local industry is riddled with poor quality content, see this as your opportunity to beat out lazier competitors. If you deliver the superior experience, it may give you a very valuable edge, while also safeguarding your reputation and rankings against Google filters and penalties in future.
Hope this helps!
-
-
Thank you for the feedback.
I see all the points mentioned, but I still feel confused about it. I have worked in the industry for 5 years now, and some of the websites that do this have been ranking well for at least that long. Would Google not have penalised these websites by now? I started to do it too and, alas, I also started to rank better and get more inquiries.
I want to make one point here. Even though the content is similar, it is not the same, and I have clearly changed factors that will help Google understand what I am trying to achieve: meta tags, title, h tags, even the maps on the pages. Location to location is what I am trying to factor in with this project. Even though the content is very similar, is it still not unique? Where is the line? That is what I find confusing. Do I spend more time making pages unique even though I don't need to? Will that be better in the long run and keep the pages positioned better for longer? How unique does the content have to be?
-
Seconding the opinions of Clive and EGOL here, and I particularly want to highlight EGOL's point about not imitating competitors' poor practices. It might help to view these competitors as being just one Google action away from getting dinged for this strategy.
It seems like the challenge for you here is to create something that helps customers understand the geography of your services, without simply duplicating the same page and swapping out city names. I would recommend putting some creative resources behind figuring out how to meet this challenge.
-
Clive Morley is right. 100% right.
These pages are close enough to being duplicates and Google will likely filter one of them from the search results. Maybe they will filter both of them from the search results.
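To get a rough feel for "close enough to being duplicates," you can measure how much two pages' text overlaps. This is only a toy proxy based on word shingles, not Google's actual algorithm, and the sample page text below is invented for illustration:

```python
def shingles(text, n=3):
    """Break text into the set of overlapping n-word sequences it contains."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Shingle-set overlap between two texts: 0.0 = fully distinct, 1.0 = identical."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

# Two "location" pages that differ only in the city name (hypothetical copy):
page_a = "We help you move from Gold Coast to Brisbane with full packing services."
page_b = "We help you move from Gold Coast to Sydney with full packing services."
print(round(jaccard_similarity(page_a, page_b), 2))
```

Swapping one city name leaves most shingles shared, which is exactly the kind of overlap that makes pages look like near-duplicates no matter what the tags say.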
If you want to compete for slightly different keywords then you will need to produce unique and substantive content for every one of them.
"Many competitors that rank much better do exactly the same."
This is sometimes true. But Google has detected that you are doing it. Someday Google might detect that they are doing it.
So, now it is up to you to stop taking shortcuts and do the work required to present unique value for each page on your website.
-
Hello,
These pages are virtually identical and I wouldn't be surprised if Google views these as duplicate or spammy.
Both of these pages serve the same user intent - moving from Gold Coast to another location - and it's therefore worth considering having just one page that covers all of the numerous destinations.
Both pages have an unnatural amount of keyword usage, which may also trigger spam filtering.
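There is no official density threshold, but as a rough sanity check (a toy heuristic, not anything Google publishes, with invented sample copy), you can measure how much of a page's text one phrase accounts for:

```python
def keyword_density(text, phrase):
    """Fraction of the page's words taken up by occurrences of `phrase`."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    if not words or n == 0:
        return 0.0
    # Count every position where the phrase appears as a run of words.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return hits * n / len(words)

copy = ("Gold Coast removals by our Gold Coast removals team. "
        "Choose Gold Coast removals for Gold Coast removals you can trust.")
print(f"{keyword_density(copy, 'Gold Coast removals'):.0%}")
```

When a single phrase accounts for a large share of the copy, as in this sample, it reads as written for search engines rather than for people.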
If the purpose is to rank for "moving from Gold Coast to...." type searches, then this page would probably benefit from having more Gold Coast related content - geographical info, images, etc.
Good luck