Duplicate page content and Duplicate page title errors
-
Hi,
I'm new to SEOmoz and to this forum. I've started a new campaign for my site and got back loads of errors.
Most of them are Duplicate Page Content and Duplicate Page Title errors. I know I have some duplicate titles, but I don't have any duplicate content.
I'm not a web developer and not much of an expert, but I have the impression that the crawler is following all my internal links (in fact, I also have plenty of warnings saying "Too many on-page links").
Do you think this is the cause of my errors? Should I add nofollow to all my internal links? I'm working with Joomla.
Thanks a lot for your help
Marco
-
Hi Marco,
I took a look at your page at http://www.beautifulpuglia.com/it/linea-costiera/isole-tremiti.html
Looks like you've got the canonical in place okay here. The next step is to add the canonical on every page that is a duplicate of this page. And you want to make sure to point to the right page. Let me be clear: Every page that is a duplicate of this page should have the same canonical. In this case:
<link rel="canonical" href="http://www.beautifulpuglia.com/it/gargano/isole-tremiti.html" />
You can find the other pages you need to add this tag to in your SEOmoz report. In each duplicated content report, it will list the number of other pages that are duplicates. Simply click on the number to see the URLs.
I'm not a Joomla expert, but webmasters I've talked to say that other platforms such as WordPress and Drupal are much more accommodating of these types of fixes. There are various plugins and modules you can use, but you'll have to select one appropriate to your configuration.
Here's a good resource from Dr. Pete: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
Hope this helps. Best of luck.
-
Elias,
I too have thousands of duplicate errors in SEOmoz. Most of them are because it is returning
/abc.com as a different page from /ABC.com
Surely Google doesn't do that, just because one URL is uppercase and the other lowercase? I also have no idea where SEOmoz is picking that up from... possibly internal links on the page whose hyperlinks use a different case?
It seems to me this is too sensitive, and fixing it would take me WEEKS! I fail to see how there would be any uplift, as Google surely sees beyond the issue; it's cosmetic, not functional.
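From what I've read, one blanket fix is a server-level redirect instead of page-by-page edits. A rough sketch, assuming Apache with mod_rewrite (note that RewriteMap only works in the main server config or a virtual host, not in .htaccess):

```apache
# httpd.conf / <VirtualHost> — force mixed-case URLs to their lowercase version
RewriteEngine On
RewriteMap lowercase int:tolower
# Only rewrite when the path actually contains an uppercase letter
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^/?(.*)$ /${lowercase:$1} [R=301,L]
```

That way every /ABC.com request 301s to /abc.com in one rule, and the link equity consolidates on the lowercase URLs.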
Regards
Andy
-
It looks fine to me. You will need to do the same on all of your pages.
If you've just added the code, you will need to wait up to a week for SEOmoz to re-crawl your website, depending on when your site crawl is scheduled.
Let me know how you get on.
Elias
-
Hi Elias, Hi Marisa,
Thank you both.
You are right; in the meantime I had already done this, but I have the impression it is not working and I don't know what I'm doing wrong.
I'm attaching a link to a page of my site (I hope I can do this). Please have a look at the code; you will see the tag <link rel="canonical" href="http://www.beautifulpuglia.com/it/gargano/isole-tremiti.html" />, which indicates the URL I want to use. However, SEOmoz is still giving me the error, and this is happening for both the Italian and English versions.
So far I've only added the tag to this page; I want to find the solution before modifying all the pages currently affected.
http://www.beautifulpuglia.com/it/linea-costiera/isole-tremiti.html
Thanks a lot again
-
Hi Marco, as Marisa says: by putting the canonical tag on one page you are putting it on all of them, as they are in fact the same page, just reached by different URLs.
-
www.site.com/, www.site.com/index.html, site.com/index.html, etc., are already the same page. So there's only one page TO put the tag on. You're just telling the crawlers that you only want one of them to get the credit, and which version of the page you prefer to have displayed.
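If you'd rather make the duplicates disappear entirely, a server-side 301 does the same consolidation as the tag. A sketch for Apache's mod_rewrite in .htaccess, assuming you prefer the bare directory URL:

```apache
RewriteEngine On
# Redirect /index.html and /any/dir/index.html to the bare directory URL
RewriteCond %{THE_REQUEST} \s/+(.*/)?index\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]
```

The crawler then only ever sees one URL per page, so there's nothing left to canonicalize.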
-
Hi Elias,
Thanks a lot for your reply. I've read a few posts about the canonical tag and yes, I'm going to try it.
Just a couple of things:
-
Let's say I have 4 duplicates of one page. I presume I have to add the tag in the head of only one page, right? Does it make any difference which one I pick?
-
Any idea how this can be implemented in Joomla? It doesn't seem to be very straightforward.
Thanks a lot
Marco
-
Hi Marco,
It seems to me like you need to implement the canonical tag.
Site crawlers/bots will treat URLs such as www.site.com/ and www.site.com/index.html as different pages because of their URLs, so the same content appears to be duplicated on each of them.
By adding a canonical tag to each of your site's pages (changing the URL for each page) you will tell the crawlers which version they should index and which to ignore.
Here's an example of a canonical tag (to be placed within the head tag of the page): <link rel="canonical" href="http://www.example.com/preferred-page.html" />
I think this will sort out your duplication issues.
You can find more information about canonical URLs here http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
I hope this helps!
Elias