How detrimental is duplicate page content?
-
We have a local site with multiple advanced search parameters based on the facilities available at a particular place. For instance, we list a set of fun places to take kids to in a city, and we have a page for this. We now have the ability to filter for fun places that have parking available or that are "outdoor", and we use URL parameters to handle these additional search criteria. Would search engines treat these parameterized URLs as duplicate pages, and if so, how detrimental would that be?
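To illustrate, the URLs look something like this (hypothetical examples; the parameter names are made up):

example.com/fun-places-for-kids
example.com/fun-places-for-kids?facility=parking
example.com/fun-places-for-kids?setting=outdoor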
-
As others have answered, if the pages with parameters are just the result of a filter and don't actually add anything relevant (i.e., they substantially duplicate the non-parameterized URLs) or add nothing at all, then the best approach is to give those URLs a "noindex" meta robots tag.
This will ensure that those pages, if they have already been crawled, drop out of the index.
But this is only a general rule, because there can be many variations on it (and we don't know exactly how your site has been developed).
For instance, if those pages cannot actually be crawled because the filters sit behind a JavaScript selector (something you can verify by disabling JavaScript in the browser), then you should not suffer any issues, and the "noindex" meta robots tag would be a precaution rather than an intervention to fix something that has already happened.
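As a minimal sketch (assuming you can edit the page templates; exact placement depends on how the site is built), the filtered URLs could carry this tag in their <head>:

<meta name="robots" content="noindex">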
-
If you noindex a page, any link pointing to it will waste its link juice.
If you must noindex, use "noindex, follow" so the link juice can still flow back out through the page's links.
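For example, the tag itself looks like this (a minimal sketch; where you add it depends on your CMS or templates):

<meta name="robots" content="noindex, follow">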
If your site is mainly duplicates, then you have a problem; but if it is just a few pages, don't worry.
Google will give credit to one page and disregard the others.
-
I guess it depends on how much duplication there is. If the pages are complete duplicates with no unique content at all, then the best move would be to noindex or nofollow them. Otherwise rel=canonical is probably fine.
-
Does rel="canonical" only indicate the preferred page to Google, or does it also signal that the content on the current page is duplicate in nature? Would it be better to actually remove these pages from the index by adding a "noindex" tag to them?
-
Duplicate content is detrimental, but the issue is relatively easy to solve. Just add rel="canonical" tags to the duplicate pages so Google can identify and rank the preferred page.
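For example, a filtered page could point back to the main listing page like this (a sketch with a made-up URL, since the real one isn't given):

<link rel="canonical" href="https://example.com/fun-places-for-kids/" />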