Issue: Duplicate Page Content
-
Hello SEO experts,
I'm facing a duplicate page content issue on my website. It's an apartment rental site: when a client searches for apartment availability, the site automatically generates the same content under different URLs. I've already blocked these URLs in my robots.txt file, but I'm still facing the same issue. Kindly guide me on what I can do.
Here are some example links.
http://availability.website.com/booking.php?id=17&bid=220
http://availability.website.com/booking.php?id=17&bid=242
http://availability.website.com/booking.php?id=18&bid=214
http://availability.website.com/booking.php?id=18&bid=215
http://availability.website.com/booking.php?id=18&bid=256
http://availability.website.com/details.php?id=17&bid=220
http://availability.website.com/details.php?id=17&bid=242
http://availability.website.com/details.php?id=17&pid=220&bid=220
http://availability.website.com/details.php?id=17&pid=242&bid=242
http://availability.website.com/details.php?id=18&bid=214
http://availability.website.com/details.php?id=18&bid=215
http://availability.website.com/details.php?id=18&bid=256
http://availability.website.com/details.php?id=18&pid=214&bid=214
http://availability.website.com/details.php?id=18&pid=215&bid=215
http://availability.website.com/details.php?id=18&pid=256&bid=256
http://availability.website.com/details.php?id=3&bid=340
http://availability.website.com/details.php?id=3&pid=340&bid=340
http://availability.website.com/details.php?id=4&bid=363
http://availability.website.com/details.php?id=4&pid=363&bid=363
http://availability.website.com/details.php?id=6&bid=367
http://availability.website.com/details.php?id=6&pid=367&bid=367
http://availability.website.com/details.php?id=8&bid=168
http://availability.website.com/details.php?id=8&pid=168&bid=168
Thanks, and I look forward to your response.
-
You should probably set the www.website.com/the-mayflower/ version as the canonical. So the page on the subdomain, and all other copies, would ALL have a rel canonical tag that points to www.website.com/the-mayflower/.
I wouldn't block the subdomain based on what you've said, but complicated issues like this are difficult to fully diagnose and prescribe fixes for without seeing the site.
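For example, assuming www.website.com/the-mayflower/ is the preferred version (URLs here are the hypothetical ones from this thread), every duplicate copy of that page, including the one on the availability subdomain, would carry the same tag in its head:

```html
<!-- Placed in the <head> of EVERY duplicate copy of the page,
     all pointing at the single preferred URL -->
<link rel="canonical" href="http://www.website.com/the-mayflower/" />
```

Note that the canonical tag must be crawlable to work, which is one reason blocking these URLs in robots.txt is counterproductive here.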
-
Thanks for your response. I forgot to mention one more thing in my question: I have the same properties on both the main domain and the subdomain. Example:
Here is one property on the main site:
http://www.website.com/the-mayflower/
and here is the same property on the availability subdomain:
http://availability.website.com/property.php?id=1
Maybe I'm facing the duplicate page content and title issues for this reason. Can I block the subdomain from search engine crawling?
-
You also need to find a way to stop this from happening. Ounce of prevention!
-
Agreed - robots.txt is not the way to go on this. Also, you can configure URL parameter handling in Google Webmaster Tools (GWT) to help with this.
-
I would make sure to use the rel="canonical" tag to designate which URL Google should consider the primary one, regardless of any parameters appended to it. Here is some additional information:
https://support.google.com/webmasters/answer/139394?hl=en
http://googlewebmastercentral.blogspot.com/2013/04/5-common-mistakes-with-relcanonical.html
I would also recommend not using robots.txt in this case.
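As a rough illustration of why the URLs in the question are duplicates: the pid and bid parameters don't change which property is shown, so many URLs resolve to one page. A sketch (a hypothetical helper, not code from the asker's site) that collapses them to a single canonical URL per property:

```python
# Hypothetical helper: derive one canonical URL for the duplicate
# booking/details URLs from the question by dropping the "pid" and
# "bid" parameters and keeping only the property "id".
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

DROP_PARAMS = {"pid", "bid"}  # parameters that only create duplicates

def canonical_url(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in DROP_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

# Both duplicates collapse to the same canonical URL:
print(canonical_url("http://availability.website.com/details.php?id=18&bid=214"))
print(canonical_url("http://availability.website.com/details.php?id=18&pid=215&bid=215"))
# -> http://availability.website.com/details.php?id=18 (both)
```

The same mapping would decide the href emitted in each page's rel="canonical" tag server-side.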