Do we have any risk or penalty for double canonicals?
-
Hi all,
We have double canonicals: page A points to page B, and page B points to page C. Will this be okay for Google, or do we definitely need to make it A to C and B to C?
Thanks
-
Yes! I read the example backward. I'm with you! All pages should point to C.
-
Hi vtmoz.
I think Steve made a typo, saying to point all back to A.
My opinion here is:
- Avoid these canonical chains at all costs. They are confusing to Google and may lead Googlebot to reduce the importance of your pages.
- There is no risk of any known penalty. Google will probably not tell you in Search Console that you have a penalty for chained canonicals.
- Point page A to C and page B to C.
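To see why flattening matters, here is a small Python sketch (not from Moz or Google, just an illustration) that takes a scraped map of URL → canonical target and rewrites every chained entry so it points directly at the end of its chain, i.e. A → C and B → C instead of A → B → C:

```python
# Hypothetical sketch: given a map of URL -> canonical target (as scraped
# from each page's rel="canonical" tag), flatten any chains so every page
# points directly at the final canonical URL.

def flatten_canonicals(canonical_map):
    """Return a new map where every URL points at the end of its chain."""
    def resolve(url, seen):
        target = canonical_map.get(url)
        if target is None or target == url or target in seen:
            return url  # self-canonical, no tag, or a loop: stop here
        seen.add(url)
        return resolve(target, seen)

    return {url: resolve(url, set()) for url in canonical_map}

chain = {
    "https://example.com/A": "https://example.com/B",  # A -> B (chained)
    "https://example.com/B": "https://example.com/C",  # B -> C
    "https://example.com/C": "https://example.com/C",  # C is self-canonical
}
print(flatten_canonicals(chain))  # A and B now both point straight at C
```

The `seen` set also guards against canonical loops (A → B → A), which are another thing Googlebot handles unpredictably.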
Hope it helps.
Best Luck.
GR.
-
I'd have them all pointing back to C so it's a little easier to manage long term. G just updated some of their docs related to canonical URL use cases with some great examples. From this page:
You can use a tag in the page header to indicate when a page is a duplicate of another page.
Suppose you want
https://example.com/dresses/green-dresses
to be the canonical URL, even though a variety of URLs can access this content. Indicate this URL as canonical with these steps:
- Mark all duplicate pages with a rel="canonical" link element: add a <link> element with the attribute rel="canonical" to the <head> section of duplicate pages, pointing to the canonical page.
- If the canonical page has a mobile variant, add a rel="alternate" link to it, pointing to the mobile version of the page.
- Add any hreflang or other redirects appropriate for the page.
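For reference, the two link elements Google describes look something like this in the <head> of a duplicate page (the m.example.com mobile URL is an assumption for illustration, not from the quoted doc):

```html
<!-- On each duplicate page, point at the canonical version -->
<link rel="canonical" href="https://example.com/dresses/green-dresses" />
<!-- If the canonical page has a mobile variant (hypothetical m. subdomain) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/dresses/green-dresses" />
```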
They don't touch on the chain of canonical URLs you suggest but I'd have them all pointing to C since it's a scalable change.
[edit: updated to match example in OP]
-
Related Questions
-
Page content is not very similar but the topic is the same: Will Google consider the rel canonical tags?
Hi Moz community, We have multiple pages from our own different sub-domains on the same topics. These pages even rank in the SERP for related keywords. Now we are planning to show only one of the pages in the SERP. Unfortunately, we cannot redirect. We are planning to use rel canonical tags, but the page content is not the same: only 20% is similar and 80% is different, though the context is the same. If we use rel canonicals, does Google accept this? If not, what should I do? Would making the header tags similar work? How does Google respond if the content does not match? Does it just ignore the tag, or is there any negative score? Thanks
Algorithm Updates | | vtmoz0 -
Any risks involved in redirecting low quality Infringement website?
Hi all, Recently we took over one of the websites (with a trademark infringement) that had been using our domain name in their domain. That website has no traffic or backlinks. Is there any risk involved in redirecting that website to our website? Thanks
Algorithm Updates | | vtmoz0 -
Puzzling Penalty Question - Need Expert Help
I'm turning to the Moz Community because we're completely stumped. I actually work at a digital agency, our specialism being SEO. We've dealt with Google penalties before and have always found it fairly easy to identify the source of the problem when someone comes to us with a sudden keyword/traffic drop. I'll briefly outline what we've experienced:
We took on a client looking for SEO a few months ago. They had an OK site, with a small but high quality and natural link profile, but very little organic visibility. The client is an IT consultancy based in London, so there's a lot of competition for their keywords. All technical issues on the site were addressed, pages were carefully keyword targeted (obviously not in a spammy way) and on-site content, such as services pages, which were quite thin, were enriched with more user focused content. Interesting, shareable content was starting to be created and some basic outreach work had started. Things were starting to pick up. The site started showing and growing for some very relevant keywords in Google, a good range and at different levels (mostly sitting around page 3-4) depending on competition. Local keywords, particularly, were doing well, with a good number sitting on page 1-2. The keywords were starting to deliver a gentle stream of relevant traffic and user behaviour on-site looked good.
Then, as of the 28th September 2015, it all went wrong. Our client's site virtually dropped from existence as far as Google was concerned. They literally lost all of their keywords. Our client even dropped hundreds of places for their own brand name. They also lost all rankings for super low competition, non-business terms they were ranking for. So, there's the problem. The keywords have not shown any sign of recovery at all yet and we're, understandably, panicking. The worst thing is that we can't identify what has caused this catastrophic drop. It looks like a Google penalty, but there's nothing we can find that would cause it.
- There are no messages or warnings in GWT.
- The link profile is small but high quality.
- When we started the content was a bit on the thin side, but this doesn't really look like a Panda penalty, and seems far too severe.
- The site is technically sound.
- There are no duplicate content issues or plagiarised content.
- The site is being indexed fine.
- Moz gives the site a spam score of 1 (out of 11, I think that's right).
- The site is on an OK server, which hasn't been blacklisted or anything.
We've tried everything we can to identify a problem. And that's where you guys come in. Any ideas? Has anyone seen anything similar around the same time? Unfortunately, we can't share our client's site name/URL, but feel free to ask any questions you want and we'll do our best to provide info.
Algorithm Updates | | MRSWebSolutions0 -
Canonical URLs being ignored?
Hi Guys, Has anybody noticed canonical URLs being ignored where they were previously obeyed? I have a site that is doing this at the moment and just wondered if this was being seen elsewhere and if anyone knows what the solution is? Thanks, Elias
Algorithm Updates | | A_Q0 -
URL Parameters Question - Exclude? Or Use a Canonical Tag?
I'm trying to figure something out, as I just finished my "new look" for an old website. It uses a custom-built shopping cart, and the system worked pretty well until about a year ago, when rankings went down. My primary traffic used to come from top-level brand pages. Each brand gets sorted by the shopping cart and a parameter extension is added, so customers can click Page 1, Page 2, Page 3, etc. So for example: http://www.xyz.com/brand.html , http://www.xyz.com/brand.html?page=1 , http://www.xyz.com/brand.html?page=2 and so on. The page= parameter is dynamic, therefore the page title, metas, etc. are the same; however, the products displayed are different. I don't want to exclude the page= parameter completely, as the products are different on each page and obviously I would want the products to be indexed. However, at the same time my concern is that having these parameters might be causing some confusion, and hence why I noticed a drop in Google rankings. I also want to note that with my market, it's not necessary to break these pages up to target more specific keywords. Maybe using something like this would be the appropriate measure?
Algorithm Updates | | Southbay_Carnivorous_Plants0 -
Should I use canonical tags on my site?
I'm trying to keep this a generic example, so apologies if this is too vague. On my main website, we've always had a duplicate content issue. The main focus of our site is breaking down to specific, brick and mortar locations. We have to duplicate the description of the product/service for every geographic location (this is a legal requirement). So for example, you might have the parent "product/service" page targeting the term, and then hundreds of sub-pages with "product/service San Francisco", "product/service Austin", etc. These pages have identical content except for the geographic location, which is dynamically swapped out. There is also additional useful content like a Google map of the area, local resources, etc.
As I said, this was always seen as an SEO issue. Specifically, you could see in the way that Googlebot would crawl pages and how PageRank flowed through the site that having hundreds of pages with identical copy and just the geographic location swapped out wasn't seen as good content. However, we still always received traffic and conversions for the long-tail geographic terms, so we left it.
Last year, with Panda, we noticed a drop in traffic and thought it was due to this duplicate issue, so I added canonical tags to all our geographic-specific product/service pages that pointed back to the parent page. That seemed to be received well by Google and traffic was back to normal in short order.
However, recently what I notice a LOT in our SERP pages is that if I type in a geographic-specific term, i.e. "product/service san francisco", our deep page with the canonical tag is what Google is ranking. Google inserts its own title tag on the SERP page and leaves the description blank, as it doesn't index the page due to the canonical tag on the page. Essentially what I think it is rewarding is the site architecture, which organizes the content to the specific geo in the URL: site.com/service/location/san-francisco. Other than that there is no reason for it to rank that page.
Sorry if this is lengthy, thanks for reading all of that! Essentially my question is, should I keep the canonical tags on the site or take them off since Google insists on ranking the page? If I am ranking already then the potential upside to doing that is ranking higher (we're usually in the 3-6 spot on the result page) and also higher CTR because we can get a description back on our resulting page. The counter argument is I'm already ranking so leave it and focus on other things. Appreciate your thoughts on this!
Algorithm Updates | | edu-SEO0 -
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention I did post this on one other forum, so I hope that is not completely against the rules here or anything. Just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question...
"Googlebot found an extremely high number of URLs on your site:" Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations I would love to hear them. First off, the site is very large and utilizes faceted navigation to help visitors sift through results. I have implemented rel=canonical for many months now to have each page URL that is created based on the faceted nav filters point back to the main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern obviously is wasting crawler time on all these pages when I am trying to do what they ask in these instances and tell them to ignore the page and find the content on page X. So at this point I am thinking about possibly using a robots.txt file to handle these, but wanted to see what others around here thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks in advance to those who take the time to respond.
Algorithm Updates | | PeteGregory0 -
Is this a possible Google penalty scenario?
In January we were banned from Google due to duplicate websites caused by a server configuration error by our previous webmaster. Around 100 of our previously inactive domain names were defaulted to the directory of our company website during a server migration, thus showing the exact same site 100 times... obviously Google was not game and banned us. At the end of February we were allowed back into the SERPs after fixing the issue and have since steadily regained long-tail keyword phrase rankings, but we are still missing our main keyword phrase in Google. This keyword phrase brings in the bulk of our best traffic, so obviously it's an issue. We've been unable to get above position 21 for this keyword, but in Yahoo, Bing, and Yandex (a Russian SE) we're in positions 3, 3, and 7 respectively. It seems to me there has to be a penalty in effect, as this keyword gets between 10 and 100 times as much traffic in Google as any of the ones we're ranked for. What do you think? EDIT: I should mention that in the 4-5 years prior to the ban we had been ranked between 15th and 4th in Google, 80% of the time on the first page.
Algorithm Updates | | ACann0