On-site duplication working - not penalised - any ideas?
-
I've noticed a website that has been set up with many virtually identical pages. For example, many of them have the same content (minimal text, three video clips) and only the town name varies. Surely this is something that Google would be against? However, the site is consistently ranking near the top of Google page 1, e.g. http://www.maxcurd.co.uk/magician-guildford.html for "magician Guildford", http://www.maxcurd.co.uk/magician-ascot.html for "magician Ascot" and so on (even when searching without localisation or personalisation).
For years I've heard SEO experts say that this sort of thing is frowned on and that such sites will get penalised, but it never seems to happen. I guess there must be some other reason that this site is ranked highly - any ideas? The content is massively duplicated and the blog hasn't been updated since 2012, yet it is ranking above many established older sites that have lots of varied content, good-quality backlinks and regular updates.
Thanks.
-
Egol is right - pages like this can persist in quieter niches, especially when there aren't many other results to take the thin pages' place. I notice that this guy's page for London has a lot more content, probably because the competition for rankings and business is much higher in London than in the surrounding areas.
-
(My experience) I'm having exactly the same problem at the moment. It's an absolute nightmare being outranked by a poor site with poor content and a poor user experience. However, the advice I've been given through Moz is to keep doing what we're doing: publish quality content the right way, focus on the quality of your website, keep making improvements and building links, and you will be rewarded.
Nobody can escape the Panda forever
http://searchengineland.com/library/google/google-panda-update
Hope this helps a little,
James
-
Google has been killing sites with lots of cookie-cutter pages for at least 8 to 10 years. In busy niches they get killed quickly. In sleepy niches they can persist a lot longer.
You said: "For years I've heard SEO experts say that this sort of thing is frowned on and that they will get penalised, but it never seems to happen."
Those experts are right nearly 100% of the time; these sites rarely survive today. So I think your "never seems to happen" comes from frustration with this one guy.
In rare instances, I see a site like this persist in the SERPs of a sleepy niche for years and years. Such sites often have a lot of people talking about them, giving them lots of social attention, citations, mentions, etc. I believe that may have something to do with why they survive.
Have you seen this guy perform? If you haven't, maybe you should.
Related Questions
-
Site traffic halved - not sure why
Hi guys, not sure if anyone can help, but a client's Google organic traffic literally halved from one week to the next at the end of August (29 Aug 2016, to be precise) and it hasn't recovered since (here's a screenshot from GA: http://puu.sh/sAmd3/b071dd1e57.png). I've been doing a lot of digging around on Moz and elsewhere for any Google updates that may have rolled out around that time, and there doesn't seem to be anything that I would expect to affect it. I thought it might be to do with Penguin, but that doesn't seem to be the case. A while before then we did have some domains and pages 301-redirected to the main site when multiple other sites were rolled into one, but I wouldn't have thought that should affect it. Since then I've also gone and removed all those sites and redirects (a couple of weeks ago), but that doesn't seem to have fixed it. There's no black hat SEO done on the site, so it's very odd for this to happen. I'm rather out of ideas as to what could have impacted things so suddenly, and why we couldn't recover from it. Any ideas would be much appreciated.
White Hat / Black Hat SEO | BrisbaneSEOWorks
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll preface this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test on this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.

So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - A LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically:

"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, we could make the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location. As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as 'Our [Topic Area] training is easy to find in the [City, State] area' or 'Find your [Topic Area] training course in [City, State] with ease,' each followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."

So, mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
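To make the engineer's proposal concrete, here is a minimal sketch of that kind of template rotation (Python; the venue codes, helper name, and the deterministic hash-based pick are illustrative assumptions, not a description of our actual system):

```python
import hashlib

# Hypothetical standardized paragraph formats (the two variants quoted above).
TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def intro_paragraph(venue_code: str, topic: str, city: str, state: str) -> str:
    """Pick one standardized template per venue/topic combination.

    The pick is deterministic (hash of venue code + topic), so a given URL
    always renders the same variant between crawls, while different pages
    rotate across the available variants.
    """
    digest = hashlib.md5(f"{venue_code}:{topic}".encode()).hexdigest()
    template = TEMPLATES[int(digest, 16) % len(TEMPLATES)]
    return template.format(topic=topic, city=city, state=state)

# Example usage with made-up venue codes:
print(intro_paragraph("DC01", "SharePoint", "Washington", "DC"))
print(intro_paragraph("NY01", "SharePoint", "New York", "NY"))
```

A deterministic pick (rather than random.choice on every request) keeps each URL's content stable from one crawl to the next, which seems the safer way to "randomize what is displayed".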
White Hat / Black Hat SEO | CSawatzky
-
How to know if a link in a directory will be good for my site?
Hi! Some time ago, a friend of mine added our site to a directory. I did not notice it until today, when the directory came up on the first page of the search results for my domain name, in fourth position. My friend wrote a nice article describing our business, and the page has a doFollow link. Looking at the metrics of that directory, I found the following: Domain Authority: 70; main page authority: 76; linking root domains: 1,383; total links: 94,663 (several anchor texts); Facebook shares: 26; Facebook likes: 14; tweets: 20; Google +1s: 15. The directory accepts a free article about a company and does not review it before it is published, but it does look for duplicated articles that represent spam, so one company can only have one listing (in theory). Is there any formula for knowing whether a directory is a safe place to publish a doFollow link? The fact that they don't review listings doesn't seem like a good signal, but are there any other factors to take into account?
White Hat / Black Hat SEO | te_c
-
Can anyone tell me why this site ranks so well?
Site in question: cellphoneshop.net. From what I can tell from their link profile, the links they garner don't appear to be particularly high value, but they dominate the organic listings for my vertical (cell phone accessories), especially in the last 2-3 months when Google was supposedly increasing the quality of its search results. Can anyone tell me why this site in particular ranks so well for competitive short and long tail terms?
White Hat / Black Hat SEO | eugeneku
-
SEVERE Google drop as of last week (Oct 10) on long-standing .org site
Hello experts, I wanted some input if possible. I own a .org informational site that has been #1 in its category on Google, Yahoo and Bing for a major keyword for years. The site dates back to 2005. All of a sudden it dropped on August 10 (Google only - Yahoo and Bing still #1) but remained atop the primary keyword it is namesaked for (xxxxyyyzzz.org), and then on Oct 9-10 it dropped from the page 1 top ranking it had held for years on that primary keyword down to page 13. I don't know where to begin to look. Any ideas how something like this could happen, and which "stones" I should turn? We purchased the website and are not SEO gurus, so we're just not sure. Any help would be appreciated.
White Hat / Black Hat SEO | TBKO
-
Need some advice on using a microsite
I thought I would use a microsite with just some of the main product landing pages. I would use the same design and code as the main site, then re-write the text and link everything to the new site. BUT I'm concerned about getting a (duplicate content) penalty, as all the anchor text links going to the main site would be identical! E.g. to use the same design as the main site I would need to use the same layout etc., including navbars, anchor text links in the footer and so on, and I'm worried this may trigger a duplicate content penalty. Any advice please?
White Hat / Black Hat SEO | doorguy88
-
Google Penalising Pages?
We run an e-commerce website that has been online since 2004. For some of our older brands we get good rankings for the brand category pages and also for their model numbers. For newer brands, the category pages aren't ranking and neither are the products - even when we search for specific unique content on those pages, Google does not return results containing them. The real kicker is that the pages are clearly indexed: searching for the page itself by URL, or restricting the same search using the site: modifier, brings the page up straight away! Sometimes the home page will appear on page 3 or 4 of the rankings for a keyword even though there is a much more relevant page from our site in Google's index - AND THEY KNOW IT, as once again restricting the same keywords with a site: modifier shows the obviously relevant page first, ahead of loads of other pages including the home page. This leads me to the conclusion that something on certain pages is tripping Google's algorithms or, worse, that there has been manual intervention by somebody. There are literally thousands of products affected. We worry about duplicate content, but we have rich product reviews and videos all over these pages that aren't showing anywhere; they look very much singled out. Has anybody experienced a situation like this before and managed to turn it around? Link - removed. Try a page in, for instance, the D&G section and you will find it easily on Google most of the time. Try a page in the Diesel section and you probably won't, but applying -removed and you will. Thanks, Scott
White Hat / Black Hat SEO | scottlucas
-
Somebody hacked many sites and put links to my site in a hidden div
I had 300 good, natural links to my site from different sites, and the site ranked great for my keywords. Somebody (I suspect my competitor) hacked a number of other sites two days ago (I checked Google's cache) and now Yahoo Site Explorer shows 600 backlinks. I've checked the new links - they are all in the same hidden div block, styled with top:-100px; position:absolute;. I'm afraid that Google may penalize my site for these links. I'm contacting the webmasters of these sites and their hosting providers so they remove the links. Is it possible to give Google notice that these links are not mine, so it can simply skip them without penalizing me? And is it safe to file a "Spam report" about links pointing to my own site?
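In the meantime, here is a rough sketch of how the injected links could be audited and documented (Python with the requests and BeautifulSoup libraries; the style patterns, placeholder domain, and example URL are illustrative assumptions, not a definitive detector):

```python
import re
import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "example.com"  # replace with your own domain

# Inline-style patterns commonly used to push injected links off-screen or hide them,
# including the top:-100px; position:absolute; block described above.
HIDDEN_STYLE = re.compile(
    r"top:\s*-\d+px|left:\s*-\d+px|text-indent:\s*-\d+|display:\s*none|visibility:\s*hidden",
    re.I,
)

def hidden_links_to_me(page_url: str) -> list:
    """Return hrefs pointing at MY_DOMAIN whose element (or any ancestor)
    carries an inline style matching a known 'hidden' pattern."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    flagged = []
    for link in soup.find_all("a", href=re.compile(MY_DOMAIN)):
        # Walk up the tree: is this link inside a hidden container?
        node = link
        while node is not None:
            style = node.get("style") if hasattr(node, "get") else None
            if style and HIDDEN_STYLE.search(style):
                flagged.append(link["href"])
                break
            node = node.parent
    return flagged

# Hypothetical usage over backlink source pages exported from Yahoo Site Explorer:
for url in ["http://hacked-example-site.com/page.html"]:
    print(url, hidden_links_to_me(url))
```

Keeping a record of which pages were flagged, along with the hidden markup found on them, should also support your case with the webmasters and hosts you contact.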
White Hat / Black Hat SEO | zarades