Duplicate content site not penalized
-
Was reviewing a site, www.adspecialtyproductscatalog.com, and noted that even though automated crawls found over 50,000 total issues, including 3,000 pages with duplicate titles and 6,000 with duplicate content, this site still ranks high for its primary keywords. The same essay's worth of content is pasted at the bottom of every single page. What gives, Google?
-
Thanks, SEOMAN.
My issue is that this purposeful duplicate content tactic is helping www.adspecialtyproductscatalog.com rank higher for a keyword like "ad specialty products" than http://www.advertisingproducts.net/, even though the former uses the same 4,000 words on every single page of its site.
So if the goal is to rank #1 for the highest-searched keyword in my category, should one employ this content tactic as part of one's SEO efforts?
-
This is a common scenario. I would take issues found by crawling tools with a pinch of salt; a lot of them flag errors that aren't actually a major problem.
Search engines aren't crawling the web to find problems with your site; they are crawling the web to find the most relevant content and deliver it to the person who requests it.
As far as I'm aware, there is no such thing as a duplicate content penalty. There is one exception: if you breach copyright, you could be in legal trouble. Google also has a pirated content removal process; if you find a copied piece of content that breaches copyright, you can report it, and it will most likely be removed from the search engine entirely.
If you are talking about duplicate content on your own site, it isn't a major problem either. Google will still index the pages and try to deliver the best one to the searcher, but you are doing yourself a disservice. If you have duplicate content on your own site, put some effort into crafting unique content and targeting different keywords on each page.
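One common, white-hat way to handle unavoidable duplicates on your own site is to tell Google which version you prefer with a canonical tag. A minimal sketch (the URLs here are hypothetical, not from either site mentioned above):

```html
<!-- Placed in the <head> of a duplicate page, e.g. a printer-friendly
     or URL-parameter variant of the same product listing. -->
<!-- It asks search engines to consolidate ranking signals onto the
     preferred URL instead of splitting them across duplicates. -->
<head>
  <link rel="canonical" href="https://www.example.com/ad-specialty-products/" />
</head>
```

This is a hint rather than a directive, but it is the standard mechanism for consolidating duplicate pages without removing them.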