"Duplicate without user-selected canonical” - impact to SERPs
-
Hello, we are facing some issues with our project and would like some advice.
Scenario
We run several websites (www.brandName.com, www.brandName.be, www.brandName.ch, etc.), all in French. All sites have nearly the same content and structure; only minor text differs (some headings, and phone numbers that vary by country). There are many good-quality pages, but again they are the same across all domains.
Goal
We want the local domains (.be, .ch, .fr, etc.) to appear in SERPs and also to comply with Google's policy on local language variants and/or canonical links.
Current solution
Currently we don't use canonicals; instead we use rel="alternate" hreflang annotations with an x-default fallback:
<link rel="alternate" hreflang="fr-BE" href="https://www.brandName.be/" />
<link rel="alternate" hreflang="fr-CA" href="https://www.brandName.ca/" />
<link rel="alternate" hreflang="fr-CH" href="https://www.brandName.ch/" />
<link rel="alternate" hreflang="fr-FR" href="https://www.brandName.fr/" />
<link rel="alternate" hreflang="fr-LU" href="https://www.brandName.lu/" />
<link rel="alternate" hreflang="x-default" href="https://www.brandName.com/" />
Issue
After Googlebot crawled the websites, we see a lot of "Duplicate without user-selected canonical" entries in the Coverage/Excluded report (Google Search Console) for most domains. When we inspect some of those URLs, we can see that Google has decided the canonical URL points to (example):
User-declared canonical: None
Google-selected canonical: …same page, but on a different domain
What is strange is that even those URLs are on Google and can be found in SERPs.
Obviously Google doesn't know what to make of it. We noticed that many websites in the same scenario use a self-referencing canonical approach, which is not really "kosher" - we are afraid that if we use the same approach we could get penalized by Google.
Question: What do you suggest to fix the "Duplicate without user-selected canonical" issue in our scenario?
Any suggestions/ideas appreciated, thanks. Regards.
-
The "Duplicate without user-selected canonical" status appears when there are multiple identical or very similar pages - on one site or, as in your case, across related domains - but no canonical tag has been explicitly set to tell search engines which version should be treated as the preferred or original one.
The impact of this issue on search engine results pages (SERPs) can be negative for several reasons:
Keyword Dilution: When search engines encounter multiple versions of the same or similar content, they might have a hard time determining which page to rank for a particular keyword. This can lead to keyword dilution, where the authority and relevance of the content are spread across multiple pages instead of being concentrated on a single page.
Page Selection Uncertainty: Without a canonical tag to guide search engines, they may choose to index and display a version of the page that is not the most relevant or valuable to users. This can result in users landing on less optimal pages from their search queries.
Ranking Competition: Duplicate content can cause internal competition between your own pages for rankings. Instead of consolidating ranking signals onto one page, they get divided among duplicates, potentially leading to lower overall rankings for all versions.
Crawling and Indexing Issues: Search engine bots may spend time crawling and indexing duplicate content, which is an inefficient use of your crawl budget. This might affect how quickly your new or updated content gets crawled and indexed.
To address the "Duplicate without user-selected canonical" issue and mitigate its impact on SERPs:
Implement Canonical Tags: Set up canonical tags on duplicate or similar pages to indicate the preferred version. This guides search engines to consolidate ranking signals and direct traffic to the correct page (first sketch after this list).
301 Redirects: If possible, redirect duplicate pages to a single, canonical version using 301 redirects (second sketch below). This not only consolidates ranking signals but also ensures that users are directed to the most relevant content.
Consolidate Content: Consider merging similar pages into a single, comprehensive page. This helps avoid duplication issues and improves the overall user experience.
Use Noindex Tags: If some duplicate pages are not crucial for SEO or user experience, you can add a noindex meta tag to keep search engines from indexing those pages (third sketch below).
Monitor and Update: Regularly audit your website for duplicate content and ensure that new content is properly canonicalized to prevent future occurrences.
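A minimal sketch of the first fix, reusing the brandName domains from the question (the page path is hypothetical). Each duplicate page points rel="canonical" at the version you want indexed:
<!-- On https://www.brandName.be/some-page/, assuming the .com page is the preferred version -->
<link rel="canonical" href="https://www.brandName.com/some-page/" />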
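For the redirect option, a sketch assuming an Apache server with mod_alias (the server and the path are assumptions, not from the question):
# .htaccess on www.brandName.be - permanently redirect a duplicate to the canonical URL (hypothetical path)
Redirect 301 /some-page/ https://www.brandName.com/some-page/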
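For the noindex option, the standard robots meta tag, sketched on a hypothetical duplicate page. Note that a noindexed page drops out of search results entirely, so this only fits duplicates you don't want ranking at all:
<!-- In the <head> of a duplicate page that should stay out of the index -->
<meta name="robots" content="noindex" />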
By addressing the "Duplicate without user-selected canonical" issue, you can help improve the clarity and accuracy of how your content appears in SERPs, potentially leading to better rankings and a more effective SEO strategy.
-
- Even if this error occurs, it doesn't mean Google ignores the pages - it can still index them, and in our case they do appear in SERPs.
- The duplicate pages carry value in the sense that each has slight alterations for its local market - contact info, different pricing, etc. So about 90% of a page is the same across the national domains; only a small part differs.
-
@alex_pisa
The error "Duplicate without user-selected canonical" indicates that Google found duplicate URLs that are not canonicalized to a preferred version. Google didn't index these duplicate URLs and assigned a canonical version on its own.
How to fix this issue
Should these pages even exist? If the answer is no, simply remove them and return an HTTP 410 status code, as in the sketch below.
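A hedged sketch assuming an Apache server (the path is hypothetical, not from the question); mod_alias can answer with 410 Gone:
# .htaccess - serve 410 Gone for a removed duplicate page (hypothetical path)
Redirect gone /old-duplicate-page/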
If these pages have a purpose, then ask yourself whether they carry any value:
- If yes, then canonicalize them to the preferred version of the URL. Need inspiration on where to canonicalize to? See which URL Google finds most relevant using the URL Inspection tool. If Google is listing PDF files for your site, canonicalize them through the HTTP header (first sketch after this list).
- If these pages don't carry any value, then make sure to apply the noindex directive through the meta robots tag or the X-Robots-Tag HTTP header (second sketch below).
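A sketch of the HTTP-header canonical for PDFs mentioned in the first point, assuming Apache with mod_headers (the file name is hypothetical); Google supports rel="canonical" in a Link response header for non-HTML files:
# .htaccess - declare the canonical for a PDF via the Link header (hypothetical file)
<Files "brochure.pdf">
  Header set Link "<https://www.brandName.com/brochure.pdf>; rel=\"canonical\""
</Files>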
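And a sketch of the X-Robots-Tag variant from the second point, again assuming Apache with mod_headers; this applies noindex without touching the HTML:
# .htaccess - send noindex in the HTTP response for a matching file (hypothetical name)
<Files "duplicate-page.html">
  Header set X-Robots-Tag "noindex"
</Files>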