Duplicate Page Content
-
Hi there,
We keep getting duplicate page content issues. However, it's not actually the same page.
E.g. there might be 5 pages in, say, a Media Release section of the website, and each URL says page 1, 2, etc. However, they still come up as duplicates. How can this be fixed so Moz knows it's actually different content? -
Thanks all - will give those options a try and see which works the best for us.
-
Hi!
I suggested the noindex in order to deindex pages that may already be indexed. But yes, the rel="canonical" should do the same (the problem is that Google may not respect it).
The nofollow is there so the crawler doesn't waste crawl budget following the links on those (many) pages.
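For reference, this is what that meta robots tag would look like; it goes in the <head> of each parameterized URL (the combined "noindex,nofollow" value is what's being suggested here):

```html
<!-- Placed in the <head> of each date-parameterized URL -->
<!-- noindex: keep this URL out of the index; nofollow: don't follow its links -->
<meta name="robots" content="noindex,nofollow">
```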
-
Gianluca,
Wouldn't it be much more work to detect whether the parameter is set and then add the noindex meta? Wouldn't it be easier to just set the canonical? I'm sure that's a dynamic site, so it's just one canonical call without any extra code (PHP or whatever).
And why the nofollow? If I'm just preventing that page from being indexed because it would constitute a duplicate content issue, noindex should be enough in this case.
We recently fixed a similar issue with our blog tags, which were showing duplicate content on about 400 pages. We fixed it by adding the noindex (they already had the canonical, but it wasn't enough, as the canonical couldn't point to a definitive version; that changed depending on whether the tag gained another post). Within a few days all those pages were deindexed. We noticed a loss in search traffic, so I ran a small test removing the noindex tag. Result: 2 weeks later, none of those pages had returned to the index (I added the noindex tag back, as it was just a test to see if we could regain that traffic, but ultimately decided that traffic wasn't worth reintroducing a duplicate content issue).
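To make the "detect the parameter, then decide" logic concrete, here is a minimal Python sketch. The parameter names are modeled on the example URL later in this thread; treating `hihhpropertyId` as the only parameter that belongs in the canonical URL is an assumption for illustration:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Assumption: only this parameter identifies the page itself;
# everything else (checkin, checkout, search...) creates duplicates.
KEEP = {"hihhpropertyId"}

def canonical_url(url):
    """Rebuild the URL keeping only the whitelisted parameters."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in KEEP]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

def is_parameterized_duplicate(url):
    """True when the URL carries any non-whitelisted query parameter."""
    return any(k not in KEEP for k, _ in parse_qsl(urlsplit(url).query))

url = ("http://www.hihh.com.au/property-details"
       "?hihhpropertyId=HCP006&checkin=2013-08-06&checkout=2013-08-09")
print(canonical_url(url))
# -> http://www.hihh.com.au/property-details?hihhpropertyId=HCP006
print(is_parameterized_duplicate(url))
# -> True  (so this variant would get the noindex meta)
```

The same check drives either fix: emit a rel="canonical" pointing at `canonical_url(url)`, or emit the noindex meta when `is_parameterized_duplicate(url)` is true.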
-
Federico is right.
Your duplicate content issue is due to the date parameters: you are potentially duplicating every page that has that calendar, once for every possible combination of dates... and that is a huge issue.
You should implement the rel="canonical" so that all of these URLs declare the URL without the parameters as their canonical.
Or, even better, you should implement the meta robots "noindex,nofollow" on every date-parameterized URL.
That said, the most logical thing would have been to block these URLs via robots.txt when launching the site. Unfortunately, blocking them now is not enough, as they are already indexed (even if they don't appear in the index because Google filters them out).
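A robots.txt rule along these lines is what could have blocked crawling of those URLs at launch (the URL pattern is an assumption based on the example later in this thread). Note the caveat above: this only stops crawling; it does not remove URLs that are already in the index:

```
# Hypothetical sketch: block crawling of property pages carrying a checkin parameter
User-agent: *
Disallow: /property-details?*checkin=
```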
-
Ah, you mean that if the dates of the reservation change, it creates duplicate page content?
If that's the case, you should point the rel="canonical" at the definitive page: no dates selected, just the page that shows the property.
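In HTML terms, every dated variant would carry a canonical link pointing at the undated property page; the URL here is hypothetical, modeled on the example later in this thread:

```html
<!-- In the <head> of every dated variant of the property page -->
<link rel="canonical" href="http://www.hihh.com.au/property-details?hihhpropertyId=HCP006">
```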
-
Did you try adding the rel="canonical" tag to the pages?
-
So they might look at this page: http://www.hihh.com.au/property-details?hihhpropertyId=HCP006&checkin=2013-08-06&checkout=2013-08-09&search=checkindate%3D2013-08-06%26checkoutdate%3D2013-08-09
Then the same page would come up on the error list but with different dates.
-
Can you provide us with some examples? It would make our job easier.
-
It's basically all separate pages/URLs with different information on each. However, each page seems to be crawled for every possible date range, e.g. for check-in/check-out dates. The crawler works through a range of dates and treats each URL as a different page; however, the content is all exactly the same.
-
Is the issue with pagination? Sometimes pages from categories/tags/etc. can share the same content as an existing page.
If that's the issue, I would recommend adding a noindex meta tag to the least important pages (tags, for example).
Hope that helps.