"Duplicate without user-selected canonical" - impact on SERPs
-
Hello, we are facing some issues on our project and we would like to get some advice.
Scenario
We run several websites (www.brandName.com, www.brandName.be, www.brandName.ch, etc.), all in the French language. All sites have nearly the same content and structure; only minor text differs (some headings, and phone numbers that vary by country). There are many good-quality pages, but again they are the same across all domains.
Goal
We want the local domains (be, ch, fr, etc.) to appear in SERPs and also comply with Google's policy on local language variants and/or canonical links.
Current solution
Currently we don't use canonicals; instead we use rel="alternate" hreflang annotations (including an x-default):
<link rel="alternate" hreflang="fr-BE" href="https://www.brandName.be/" />
<link rel="alternate" hreflang="fr-CA" href="https://www.brandName.ca/" />
<link rel="alternate" hreflang="fr-CH" href="https://www.brandName.ch/" />
<link rel="alternate" hreflang="fr-FR" href="https://www.brandName.fr/" />
<link rel="alternate" hreflang="fr-LU" href="https://www.brandName.lu/" />
<link rel="alternate" hreflang="x-default" href="https://www.brandName.com/" />
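A key property of this setup is that every country domain must emit the identical hreflang set for each page, each page listing all alternates including itself. A minimal sketch of generating that set consistently (the domain list is taken from the post; the helper and paths are hypothetical):

```python
# Sketch: emit the identical hreflang <link> set on every country domain.
# The alternates below are the ones listed in the post.
ALTERNATES = {
    "fr-BE": "https://www.brandName.be/",
    "fr-CA": "https://www.brandName.ca/",
    "fr-CH": "https://www.brandName.ch/",
    "fr-FR": "https://www.brandName.fr/",
    "fr-LU": "https://www.brandName.lu/",
    "x-default": "https://www.brandName.com/",
}

def hreflang_links(path="/"):
    """Return the <link rel="alternate"> tags for one page path,
    identical no matter which domain serves the page."""
    tags = []
    for lang, root in ALTERNATES.items():
        base = root.rstrip("/")
        tags.append(f'<link rel="alternate" hreflang="{lang}" href="{base}{path}" />')
    return tags

for tag in hreflang_links("/contact/"):
    print(tag)
```

The point of the helper is that the set is computed once per path, so the .be, .ch, and .com versions of a page can never drift apart in their annotations.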
Issue
After Googlebot crawled the websites we see a lot of "Duplicate without user-selected canonical" in the Coverage/Excluded report (Google Search Console) for most domains. When we inspect some of those URLs we can see Google has decided where the canonical URL points (example):
User-declared canonical: None
Google-selected canonical: …the same page, but on a different domain
What is strange is that even those URLs are on Google and can be found in SERPs.
Obviously Google doesn't know what to make of it. We noticed many websites in the same scenario use a self-referencing canonical approach, which is not really "kosher" - we are afraid that if we use the same approach we could get penalized by Google.
Question: What do you suggest to fix the “Duplicate without user-selected canonical” in our scenario?
Any suggestions/ideas appreciated, thanks. Regards.
-
The issue of "Duplicate without user-selected canonical" refers to situations where a website has multiple identical or very similar pages, but no canonical tag has been explicitly set to tell search engines which version should be considered the preferred or original one.
The impact of this issue on search engine results pages (SERPs) can be negative for several reasons:
Keyword Dilution: When search engines encounter multiple versions of the same or similar content, they might have a hard time determining which page to rank for a particular keyword. This can lead to keyword dilution, where the authority and relevance of the content are spread across multiple pages instead of being concentrated on a single page.
Page Selection Uncertainty: Without a canonical tag to guide search engines, they may choose to index and display a version of the page that is not the most relevant or valuable to users. This can result in users landing on less optimal pages from their search queries.
Ranking Competition: Duplicate content can cause internal competition between your own pages for rankings. Instead of consolidating ranking signals onto one page, they get divided among duplicates, potentially leading to lower overall rankings for all versions.
Crawling and Indexing Issues: Search engine bots may spend more time crawling and indexing duplicate content, which could lead to inefficient use of their resources. This might affect how often your new or updated content gets indexed.
To address the "Duplicate without user-selected canonical" issue and mitigate its impact on SERPs:
Implement Canonical Tags: Set up canonical tags on duplicate or similar pages to indicate the preferred version. This guides search engines to consolidate ranking signals and direct traffic to the correct page.
301 Redirects: If possible, redirect duplicate pages to a single, canonical version using 301 redirects. This not only consolidates ranking signals but also ensures that users are directed to the most relevant content.
Consolidate Content: Consider merging similar pages into a single, comprehensive page. This helps avoid duplication issues and improves the overall user experience.
Use Noindex Tags: If some duplicate pages are not crucial for SEO or user experience, you can add a noindex meta tag to prevent search engines from indexing those pages.
Monitor and Update: Regularly audit your website for duplicate content and ensure that new content is properly canonicalized to prevent future occurrences.
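As a minimal sketch of the first two remedies above (the URLs are hypothetical, and these helpers are illustrative, not from any particular framework): a duplicate page either keeps serving content but declares the preferred version in its head, or stops serving content and redirects permanently.

```python
# Sketch of remedies 1 and 2 above (hypothetical URLs).

def canonical_tag(preferred_url):
    """Remedy 1: the canonical tag each duplicate page should carry
    in its <head>, pointing at the preferred version."""
    return f'<link rel="canonical" href="{preferred_url}" />'

def redirect_response(preferred_url):
    """Remedy 2: what a retired duplicate URL should return instead
    of content - a permanent redirect to the canonical version."""
    return {"status": "301 Moved Permanently", "Location": preferred_url}

print(canonical_tag("https://www.example.com/page/"))
print(redirect_response("https://www.example.com/page/"))
```

The practical difference: the canonical tag keeps the duplicate reachable for users (useful when pages differ slightly, as with the country variants here), while the 301 removes it entirely and sends both users and ranking signals to one URL.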
By addressing the "Duplicate without user-selected canonical" issue, you can help improve the clarity and accuracy of how your content appears in SERPs, potentially leading to better rankings and a more effective SEO strategy.
-
- Even if this error occurs, it doesn't mean Google ignores the pages - it can index them, and in our case they appear in SERPs.
- Duplicate pages carry value in the sense that there is a slight alteration for the local market - contact info, different pricing, etc. So 90% of the page is the same across national domains, and only a small part differs.
-
@alex_pisa
The error "Duplicate without user-selected canonical” indicates that Google found duplicate URLs that are not canonicalized to a preferred version. Google didn't index these duplicate URLs and assigned a canonical version on its own.How to fix this issue
Should these pages even exist? If the answer is no, simply remove these pages and return an HTTP status code 410.
If these pages have a purpose, then ask yourself whether they carry any value:
If yes, then canonicalize them to the preferred version of the URL. Need some inspiration on where to canonicalize to? See which URL Google finds most relevant using the URL Inspection tool. If Google is listing PDF files for your site, canonicalize them through the HTTP header.
-
If these pages don't carry any value, then make sure to apply the noindex directive through the meta robots tag or X-Robots-Tag HTTP Header.
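A minimal sketch of the mechanisms named above (hypothetical URLs; the helpers are illustrative): noindex can go in the page markup or, for non-HTML files, in a response header, and the HTTP-header canonical mentioned for PDFs uses the standard `Link` header.

```python
# Sketch: noindex via meta robots tag or X-Robots-Tag header, plus the
# Link-header canonical suggested above for PDFs (hypothetical URLs).

def meta_robots_noindex():
    """Noindex via the meta robots tag (HTML pages only)."""
    return '<meta name="robots" content="noindex" />'

def x_robots_noindex():
    """Noindex via the X-Robots-Tag HTTP header (works for any file type)."""
    return {"X-Robots-Tag": "noindex"}

def canonical_link_header(preferred_url):
    """Canonicalize a non-HTML file (e.g. a PDF) to its preferred
    version via the Link HTTP response header."""
    return {"Link": f'<{preferred_url}>; rel="canonical"'}

print(meta_robots_noindex())
print(x_robots_noindex())
print(canonical_link_header("https://www.example.com/guide.html"))
```

The header variants matter precisely for the PDF case: a PDF has no <head>, so both noindex and canonical directives have to travel in the HTTP response.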
-
Related Questions
-
How Can I influence the Google Selected Canonical
Our company recently rebranded and launched a new website. The website was developed by an overseas team, and they created the test site on their subdomain. The only problem is that Google crawled and indexed both their site and ours. I noticed Google indexed their subdomain ahead of our domain, and based on Search Console it has deemed our content the duplicate of theirs and selected theirs as the canonical.
Community | | Spaziohouston
The website in question is https://www.spaziointerni.us
What would be the best course of action to get our content ranked and selected instead of being marked as the duplicate?
Not sure if I have to modify the content to make it more unique or have them submit a removal in their search console.
Our indexed pages continue to go down due to this issue.
Any help is greatly appreciated.
-
My site has 17.5m total links according to Moz (16.6m internal follow & 840k nofollow); I think I have a problem...
We are hosted by Visualsoft, and it is a proprietary platform, so we don't have full control of our site.
On-Page Optimization | | Russell-Gorilla
In comparison, three of our main competitors, two of which are way bigger than us, have 1.4m and 4.7m; another one, probably double or perhaps triple our size, is at 2.5m. Should I worry?
Should I post my website URL on here?
I would like to start working on canonical links on my site, but I'm not sure where to start. Does Moz Pro have some sort of check or rating? I have no idea if even the basics mentioned in the tutorials have been done...
Russell0 -
Correct Hreflang & Canonical Tags for Multi-Regional Website English Language Only having URL Parameters
Dear friends, we have a multi-regional website in the English language only, with a country selector at the top of each page that adds country-code parameters to each URL. The website is built in Magento 1.8 and has 1 store with multiple store views. There is no default store set in Magento, as I discussed with the developer. Content is the same for all countries and only the currency changes. In navigation there are URLs without parameters, but when we change store from any page it adds parameters to the URL for the same page; hence there are 7 URLs in total: 6 URLs for each page (with country parameters) and 1 master URL (without parameters), creating content duplicity. We have implemented hreflang tags on each page with URL parameters, but for the canonical we have implemented the master page URL as per navigation, without URL parameters. Example on this page. I think this is correct for the master page, but we should use URL parameters in the canonical tags for each country URL too, and there should be only 1 canonical tag on each country page URL. Currently all the country URLs have the master page canonical tag, as per the example. Please correct me if I am wrong, and **in this case what has to be done for the master page?** Google is indexing the pages without parameters too. We are also using GeoIP redirection for each store with country IP detection, and for the rest of the countries not listed on the website we redirect to the USA store. Earlier it was a 301 but we changed it to a 302. Hreflang tags are showing errors in SEMrush due to the redirection, but in GWT it's OK; for some pages it shows "no return tags" only. Should I use **x-default tags for hreflang and a country selector only on the home page, like this, or should I remove the redirection?** However, some websites like this one use redirection, but a header check tool doesn't show the redirection for them, while for our website it shows a 302 redirection. Sorry for the long post, but looking for your support, please.
International SEO | | spjain810 -
How to fix the duplicate content problem on different domains (.nl /.be) of your brand's websites in multiple countries?
Dear all, what is the best way to fix the duplicate content problem on the different domains (.nl / .be) of your brand's websites in multiple countries? What must I add to the code of my .nl domain's websites to avoid duplicate content and to keep the .nl website out of google.be, but still well-indexed in google.nl? What must I add to the code of my .be domain's websites to avoid duplicate content and to keep the .be website out of google.nl, but still well-indexed in google.be? Thanks in advance!
International SEO | | HMK-NL3 -
Backlinks that we didn't place, killing our SERP rank and PR
I am in need of advice regarding backlinks that we did not place and which are hurting our search engine results. How and why they got there I cannot explain, but they appeared recently and are damaging our SERP ranks.
For several years I have been a member of SEOmoz, and we have done our search engine optimization in house. I am the owner of a personal injury law firm, which is a competitive field in search engine optimization. Recently, in the spring, we updated our website, added significant content (over 100 additional pages), set up a better site structure, and completed a significant backlink campaign from white-hat sources. As a result, we were the strongest law firm in search engine results in the state of Arizona; the PageRank of our home page went from a 4 to a 6, and our next highest level of pages went from a 3 to a 6. This happened in a 10-week period. Our search engine results were fantastic, and we were getting a significant amount of business from our Places page and our organic results. That has almost completely dried up.
Approximately 6-8 weeks later, we started having some serious problems. Specifically, our search engine results decreased significantly and our PageRank dropped from a 6 to a 4. So we started using SEOmoz tools to see what the problem was, and when we created an Open Site Explorer report, there were approximately 1,000 different links from very shady websites now pointing to our home page. Some of these linking URLs prompt a download of video and other files; others are simply on junk sites. Obviously, some other person placed these links.
First and foremost I am interested in maintaining the integrity of our site, and if there is a way to remove these links and protect against this in the future, that is what I want. Secondly, if there is a way to find out who did this, I would like to know that also. What options and/or actions should be taken?
I am thinking that I may need to employ a professional/consultant. Will I have to transfer content to another domain? Your thoughts and help are appreciated. Thanks,
International SEO | | MFC0 -
Is duplicate content really an issue on different International Google engines?
i.e. Google.com vs. Google.co.uk. This relates to another question I have open on a similar issue. So if I open the same e-commerce site (virtually) on company.com and company.co.uk, does Google really view that as duplicate content? I would be inclined to think they have that figured out, but I haven't had much experience with international SEO...
International SEO | | BlinkWeb0 -
Internationally targetted subdomains and Duplicate content
A client has a site they'd like translated into French, not for the French market but for French-speaking countries. My research tells me the best way to implement this for this particular client is to create subfolders for each country. For ease of implementation I've decided against ccTLDs and subdomains. So for example… I'll create www.website.com/mr/ for Mauritania and in GWT set this to target Mauritania. Excellent so far. But then I need to build another subfolder for Morocco. I'll then create www.website.com/ma/ for Morocco and in GWT set this to target Morocco. Now the content in these two subfolders will be exactly the same, and I'm thinking about doing this for all French-speaking African countries. It would be nice to use www.website.com/fr/, but in GWT you can only set one target country. Duplicate content issues arise, and my fear of perturbing the almighty Google becomes a possibility. My research indicates that I should simply add a canonical back to the page I want indexed. But I want them both to be indexed, surely!? I therefore decided to share my situation with my fellow SEOs to see if I'm being stupid or missing something simple - both distinct possibilities!
International SEO | | eazytiger0