Duplicate Page content | What to do?
-
Hello Guys,
I have some duplicate pages detected by Moz. Most of the URLs come from a registration process for users, so the URLs all look like this:
www.exemple.com/user/login?destination=node/125%23comment-form
What should I do? Add this to robots.txt? If so, how? What's the command to add in Google Webmaster Tools?
Thanks in advance!
Pedro Pereira
-
Hi Carly,
It needs to be done to each of the pages. In most cases, this is just a minor change to a single page template. Someone might tell you that you can add an entry to robots.txt to solve the problem, but that won't remove them from the index.
Looking at the links you provided, I'm not convinced you should deindex them all - as these are member profile pages which might have some value in terms of driving organic traffic and having unique content on them. That said I'm not party to how your site works, so this is just an observation.
Hope that helps,
George
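To illustrate the template point: on most CMSs the member pages are all rendered from one shared template, so a single edit to that template's head section applies to thousands of pages at once. A generic sketch (the exact template file and syntax depend on your platform, which I don't know):

```html
<!-- In the shared member-profile template's <head>: rendered into every member page -->
<meta name="robots" content="noindex">
```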
-
Hi George,
I am having a similar issue with my site, and was looking for a quick clarification.
We have several "member" pages that were created as part of registration (thousands of them) and they are appearing as duplicate content. When you say add noindex and a canonical, is this something that needs to be done to every individual page, or is there something that can be done that would apply to the thousands of pages at once?
Here are a couple of examples of what the pages look like:
http://loyalty360.org/me/members/8003
http://loyalty360.org/me/members/4641
Thank you!
-
1. If you add just noindex, Google will crawl the page and drop it from the index, but it will also crawl the links on that page and potentially index them too. It basically passes equity to the links on the page.
2. If you add nofollow, noindex, Google will crawl the page and drop it from the index, but it will not crawl the links on that page, so no equity will be passed to them. As already established, Google may still put those links in the index, but it will display the standard "blocked" message for the page description.
If the links are internal, there's no harm in them being followed unless you're opening up the crawl to expose tons of duplicate content that isn't canonicalised.
noindex is often used with nofollow, but sometimes this is simply due to a misunderstanding of what impact they each have.
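The two variants above, as they'd appear in the page's head:

```html
<!-- Variant 1: drop the page from the index, but still crawl its links and pass equity -->
<meta name="robots" content="noindex">

<!-- Variant 2: drop the page from the index AND don't crawl or pass equity to its links -->
<meta name="robots" content="noindex, nofollow">
```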
George
-
Hello,
Thanks for your response. I have learned more, which is great!
My question is: should I add a noindex only to that page, or noindex, nofollow?
Thanks!
-
Yes, it's the worst possible scenario: they basically get trapped in the SERPs. Google won't crawl them again until you allow crawling, then set noindex (to remove them from the SERPs), and then add nofollow, noindex back on to keep them out of the SERPs and to stop Google following any links on them.
Configuring URL parameters again is just a directive regarding the crawl and doesn't affect indexing status to the best of my knowledge.
In my experience, noindex is bulletproof but nofollow / robots.txt is very often misunderstood and can lead to a lot of problems as a result. Some SEOs think they can be clever in crafting the flow of PageRank through a site. The unsurprising reality is that Google just does what it wants.
George
-
Hi George,
Thanks for this, it's very interesting... the URLs do appear in search results but their descriptions are blocked(!)
Did you try configuring URL parameters in WMT as a solution?
-
Hi Rafal,
The key part of that statement is "we might still find and index information about disallowed URLs...". If you read the next sentence it says: "As a result, the URL address and, potentially, other publicly available information such as anchor text in links to the site can still appear in Google search results".
If you look at moz.com/robots.txt you'll see an entry for:
Disallow: /pages/search_results*
But if you search this on Google:
site:moz.com/pages/search_results
You'll find there are 20 results in the index.
I used to agree with you, until I found out the hard way that if Google finds a link, regardless of whether it's blocked in robots.txt, it can put the URL in the index, where it will remain until you lift the crawl restriction and noindex it, or remove it from the index using Webmaster Tools.
George
-
George,
I went to check with Google to make sure I am correct and I am!
"While Google won't crawl or index the content blocked by
robots.txt
, we might still find and index information about disallowed URLs from other places on the web." Source: https://support.google.com/webmasters/answer/6062608?hl=enYes, he can fix these problems on page but disallowing it in robots will work fine too!
-
Just adding this to robots.txt will not stop the pages being indexed:
Disallow: /*login?
It just means Google won't crawl the page (or the links on it).
I would do one of the following:
1. Add noindex to the page. PR will still be passed to the page but they will no longer appear in SERPs.
2. Add a canonical on the page to: "www.exemple.com/user/login"
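For clarity, the two options as markup in the login page's head:

```html
<!-- Option 1: keep the page out of the SERPs (it can still accumulate PR) -->
<meta name="robots" content="noindex">

<!-- Option 2: point all the parameterised duplicates at the clean login URL -->
<link rel="canonical" href="http://www.exemple.com/user/login">
```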
You're never going to try and get these pages to rank, so although it's worth fixing I wouldn't lose too much sleep on the impact of having duplicate content on registration pages (unless there are hundreds of them!).
Regards,
George
-
In GWT: Crawl=> URL Parameters => Configure URL Parameters => Add Parameter
Make sure you know what you are doing as it's easy to mess up and have BIG issues.
-
Add this line to your robots.txt to prevent Google from indexing these pages:
Disallow: /*login?
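Worth noting: the `/*login?` rule relies on Google's wildcard extensions to robots.txt (`*` matches any run of characters, `$` anchors the end of the URL), which many simple parsers, including Python's standard `urllib.robotparser`, don't implement. A rough Python sketch of how Google matches such a rule (my own approximation, not Google's actual algorithm):

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Approximate Google-style robots.txt rule matching:
    '*' matches any sequence of characters, '$' anchors the end of the URL,
    and matching is anchored at the start of the path."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

# The rule above catches the registration URLs from the question...
print(robots_match("/*login?", "/user/login?destination=node/125"))  # True
# ...but leaves ordinary pages alone:
print(robots_match("/*login?", "/user/profile/125"))                 # False
```

Remember, though, that matching a Disallow rule only blocks crawling; as discussed above, it does not remove already-indexed URLs.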