Duplicate page content and Duplicate page title errors
-
Hi,
I'm new to SEOmoz and to this forum. I've started a new campaign on my site and got back loads of errors.
Most of them are Duplicate Page Content and Duplicate Page Title errors. I know I have some duplicate titles, but I don't have any duplicate content.
I'm not a web developer and not much of an expert, but I have the impression that the crawler is following all my internal links (in fact I also have plenty of warnings saying "Too many on-page links").
Do you think this is the cause of my errors? Should I implement nofollow on all internal links? I'm working with Joomla.
Thanks a lot for your help
Marco
-
Hi Marco,
I took a look at your page at http://www.beautifulpuglia.com/it/linea-costiera/isole-tremiti.html
Looks like you've got the canonical in place okay here. The next step is to add the canonical on every page that is a duplicate of this page. And you want to make sure to point to the right page. Let me be clear: Every page that is a duplicate of this page should have the same canonical. In this case:
<link rel="canonical" href="http://www.beautifulpuglia.com/it/gargano/isole-tremiti.html" />
You can find the other pages you need to add this tag to in your SEOmoz report. Each duplicate content report lists the number of other pages that are duplicates; simply click on the number to see the URLs.
I'm not a Joomla expert, but webmasters I've talked to say that other platforms such as WordPress and Drupal are much more accommodating of these types of fixes. There are various plugins/modules you can use, but you'll have to select one appropriate to your configuration.
Here's a good resource from Dr. Pete: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
Hope this helps. Best of luck.
-
Elias,
I too have thousands of duplicate content errors in SEOmoz. Most of them are because it is treating
/abc.com as a different page from /ABC.com
Surely Google doesn't do that? Just because one URL is in capitals and the other in lowercase? I also have no idea where SEOmoz is picking that up from... possibly internal links on the page whose hyperlinks use different case?
It seems to me this is too sensitive, and fixing it would take me weeks! I fail to see how there would be any uplift if Google sees beyond that issue, as it's cosmetic and not functional.
Regards
Andy
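(For what it's worth, mixed-case duplicates like the ones Andy describes are usually fixed server-side rather than link by link. On Apache this is commonly done with a RewriteMap; a rough sketch, assuming you can edit the main server/vhost configuration — RewriteMap can't be declared in .htaccess — would be:)

```apache
# In the Apache server/vhost configuration (not .htaccess):
RewriteEngine On
RewriteMap lowercase int:tolower

# If the requested path contains an uppercase letter,
# 301-redirect to the all-lowercase equivalent,
# e.g. /ABC.html -> /abc.html
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lowercase:$1} [R=301,L]
```

With the redirect in place, the mixed-case internal links still work, but crawlers and link equity are consolidated on the lowercase URLs.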
-
It looks fine to me. You will need to do the same on all of your pages.
If you've just added the code, you will need to wait up to a week for SEOmoz to re-crawl your website, depending on when your site crawl is scheduled.
Let me know how you get on.
Elias
-
Hi Elias, Hi Marisa,
thank you both.
You are right; in the meantime I had done this, but I have the impression it is not working and I don't know what I'm doing wrong.
I'm attaching a link to a page of my site (I hope I can do this). Please have a look at the code; you will see the tag <link rel="canonical" href="http://www.beautifulpuglia.com/it/gargano/isole-tremiti.html" /> which indicates the URL I want to use. However, SEOmoz is still giving me the error, and this is happening for both the Italian and English versions.
So far I've only added the tag to this page; I want to find the solution before modifying all the pages currently affected.
http://www.beautifulpuglia.com/it/linea-costiera/isole-tremiti.html
Thanks a lot again
-
Hi Marco, as Marissa says: by putting the canonical tag on one page you are putting it on all of them, as they are in fact the same page; they are just reached by different URLs.
-
www.site.com/, www.site.com/index.html, site.com/index.html, etc., are already the same page, so there's only one page to put the tag on. You're just telling the crawlers that you only want one of them to get the credit, and which version of the page you prefer to be displayed.
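As a complement to the canonical tag, the index.html-style duplicates can also be collapsed with a 301 redirect at the server level. A rough sketch for an Apache server with mod_rewrite enabled (assuming a standard .htaccess setup — adjust for your host) might be:

```apache
# .htaccess — 301-redirect any /index.html request to its directory URL,
# e.g. /foo/index.html -> /foo/  and  /index.html -> /
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/+(.*/)?index\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]
```

The redirect stops the duplicate URLs from being crawled at all, while the canonical tag covers any duplicates you can't redirect.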
-
Hi Elias,
thanks a lot for your reply. I've read a few posts about the canonical tag and yes, I'm going to try it.
Just couple of things:
-
Let's say I have 4 duplicates of one page. I presume I have to add the tag in the head of only one page, right? Does it make any difference which one I pick?
-
Any idea how this can be implemented in Joomla? It doesn't seem to be very straightforward.
Thanks a lot
Marco
-
-
Hi Marco,
It seems to me like you need to implement the canonical tag.
Site crawlers/bots will treat URLs like www.site.com/ and www.site.com/index.html as different pages because of their URLs, and will therefore report the content as duplicated on each of them.
By implementing the canonical tag on each of your site's pages (changing the URL for each page) you will tell the crawler which page it should be indexing and to ignore the others.
Here's an example of a canonical tag (to be placed within the head tag of the page)
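(Illustrative sketch with a placeholder URL — substitute each page's preferred address:)

```html
<!-- Placed inside the <head> of every duplicate/variant of the page.
     The href points at the one preferred version of the URL. -->
<link rel="canonical" href="http://www.example.com/preferred-page.html" />
```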
I think this will sort out your duplication issues.
You can find more information about canonical URLs here http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
I hope this helps!
Elias
Related Questions
-
Big retailers and duplicate content
Hello there! I was wondering if you guys have experience with big retailer sites fetching data via API (PDP content etc.) from another domain which is also sharing the same data with multiple other sites. If each retailer has thousands of products, optimizing PDP content (even in batches) is quite a cumbersome task, and rel="canonical" pointing to the original domain will dilute the value. How would you approach this type of scenario? Looking forward to reading your suggestions/experiences. Thanks a lot! Best, Sara
Intermediate & Advanced SEO | SaraCoppola
-
Can a duplicate page referencing the original page on another domain in another country using the 'canonical link' still get indexed locally?
Hi, I wonder if anyone could help me with a canonical link query/indexing issue. I have given an overview, intended solution and question below. Any advice on this query will be much appreciated. Overview: I have a client who has a .com domain that includes blog content intended for the US market, using the correct lang tags. The client also has a .co.uk site without a blog, but is looking at creating one. As the target keywords and content are relevant across both UK and US markets, and to avoid duplicating work, the client has asked whether it would be worthwhile centralising the blog, or for any other efficient blog site structure recommendations. Suggested solution: As the domain authority (DA) of the .com/.co.uk sites is in the 60+ range, it would be risky to move domains/subdomains at this stage, and it would be a waste not to utilise the DA that has built up on both sites. I have suggested they keep both sites and share the same content between them using a content curation WP plugin, with the 'canonical link' referencing the original source (US or UK) so as not to get duplicate content issues. My question: Let's say I'm a potential customer in the UK and I'm searching using a keyword phrase where the content that answers my query is on both the UK and US sites, although the US content is the original source.
Intermediate & Advanced SEO | JonRayner
Will the US or UK version of the blog appear in UK SERPs? My gut says the UK blog will, as Google will try to serve me the most appropriate version of the content, and as I'm in the UK it will be this version, even though I have identified the US source using the canonical link?
-
Pages with Duplicate Page Content (with and without www)
How can we resolve pages with duplicate page content? With and without www?
Intermediate & Advanced SEO | directiq
Thanks in advance.
-
Noindexing Duplicate (non-unique) Content
When "noindex" is added to a page, does this ensure Google does not count page as part of their analysis of unique vs duplicate content ratio on a website? Example: I have a real estate business and I have noindex on MLS pages. However, is there a chance that even though Google does not index these pages, Google will still see those pages and think "ah, these are duplicate MLS pages, we are going to let those pages drag down value of entire site and lower ranking of even the unique pages". I like to just use "noindex, follow" on those MLS pages, but would it be safer to add pages to robots.txt as well and that should - in theory - increase likelihood Google will not see such MLS pages as duplicate content on my website? On another note: I had these MLS pages indexed and 3-4 weeks ago added "noindex, follow". However, still all indexed and no signs Google is noindexing yet.....
Intermediate & Advanced SEO | | khi50 -
How can you index pages or content on pages that are behind a pay wall or subscription login?
I have a client that has a boatload of awesome content they provide to their clients behind a pay wall (i.e. only paid subscribers can access it). Any suggestions, mozzers? How do I get those pages indexed without completely giving away the content on the front end?
Intermediate & Advanced SEO | BizDetox
-
Will pages irrelevant to a site's core content dilute SEO value of core pages?
We have a website with around 40 product pages. We also have around 300 pages with individual ingredients used in the products, and on top of that we have some 400 pages for individual retailers which stock the products. The ingredient pages have the same basic short info about the ingredients, and the retailer pages just have the retailer name, address and contact details. The question is: should I add noindex to all the ingredient and/or retailer pages so that the focus is entirely on the product pages? Thanks for your help!
Intermediate & Advanced SEO | ArchMedia
-
202 error page set in robots.txt versus using a crawlable 404 error
We currently have our error page set up as a 202 page that is unreachable by the search engines, as it is currently in our robots.txt file. Should the error page instead be a 404 error page that is reachable by the search engines? Is there more value, or is it better practice, to use a 404 over a 202? We noticed in our Google Webmaster account that we have a number of broken links pointing to the site, but the 404 error page was not accessible. If you have any insight that would be great; if you have any questions please let me know. Thanks, VPSEO
Intermediate & Advanced SEO | VPSEO
-
Affiliate Site Duplicate Content Question
Hi guys, I have been unable to find a definite answer to this on various forums; your views on this will be very valuable. I am building a few Amazon affiliate sites and will be pulling in product data from Amazon via a WordPress plugin. The plugin pulls in titles, descriptions, images, prices etc. However, this presents a duplicate content issue, and hence I cannot publish the product pages with Amazon descriptions. Due to the large number of products, it is not feasible to re-write all descriptions, but I plan to re-write descriptions and titles for 50% of the products and publish them with the "index, follow" attribute. However, for the other 50%, what would be the best way to handle them? Should I publish them as "noindex, follow"? Or is there another solution? Many thanks for your time.
Intermediate & Advanced SEO | SamBuck