Canonical URLs - Fixed but still negatively impacted
-
I recently noticed that our canonical URLs were not set up correctly. The incorrect setup predates me, but it could have been in place for close to a year, maybe a bit more. Every canonical URL included a "sortby" parameter. I had our platform provider make the fix, and now everything is as it should be.
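For anyone finding this thread later, the before/after looks something like this (the URLs here are made-up placeholders, not our actual pages):

```html
<!-- Broken: the canonical itself carries the sort parameter, so every
     sort order declares itself as a distinct page -->
<link rel="canonical" href="http://www.example.com/widgets?sortby=price" />

<!-- Fixed: all parameterized variants point at the one clean URL -->
<link rel="canonical" href="http://www.example.com/widgets" />
```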
I do see issues caused by this in Google Webmaster Tools. For instance, the HTML suggestions report tells me that pages have duplicate title tags when in fact each is the same page with a variety of URL parameters appended. To me this just highlights that there is a problem and that we are still being negatively impacted by the previous implementation.
My question is: has anyone been in this situation? Is there any way to flush this out or push Google to take another look? Or is this a sit-and-be-patient situation?
I'm also slightly curious whether Google will at some point see that the canonical URLs were changed and throw up a red flag, even though they are finally the way they should be.
Any feedback is appreciated.
Thanks,
Dave -
In the past I have seen canonicals take up to 5-6 weeks to settle. My only other advice is to monitor the number of indexed pages you have in Google. If you know you started with 100+ and over the past three weeks it has dropped down to 50, then the change is slowly taking effect (once again, using the site: search). If you see the opposite, or you notice no change, then perhaps the tag is still incorrect or there is some other issue.
I can't promise that all of the parameterized URLs will become un-indexed, but the most important thing is that the base page ranks the highest when searching.
-
Hi Kyle
Thanks for the response. That is a good point regarding the site:www.... search, and in fact all of the results used the correct canonical URL, with the cached versions showing the same corrected format. The last time the sitemap was downloaded was yesterday, so maybe my concern shouldn't be that great. What I'm seeing in Webmaster Tools does include some of the older content with the parameters, but if the SERPs are showing updated versions then maybe that will be flushed out. I am just under the impression that if it's in Google Webmaster Tools then it's part of Google's overall point of view of your site.
The canonical URL fix has been in place for about 3 weeks.
-
First I would check to see whether the updates you made to the pages have been recognized by Google. You can do this simply by doing a "site:www.domain.com" search, then viewing the cached page. If you find that the changes have not been recognized, you can always resubmit a new XML sitemap in Webmaster Tools. In the past I have seen this help speed up the process.
How long ago did you make these updates?
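For what it's worth, the resubmitted sitemap only needs the clean canonical URLs; a minimal sketch (domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the clean canonical URLs; leave out every
       parameterized ("sortby", etc.) variant -->
  <url>
    <loc>http://www.example.com/widgets</loc>
    <lastmod>2013-08-01</lastmod>
  </url>
</urlset>
```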
Related Questions
-
One more question about rel=canonical
I'm still trying to wrap my head around rel=canonical and its importance. Thanks to the community, I've been able to understand most of it. Still, I have a couple of very specific questions: I share certain blog posts on the Huffington Post. Here's an example: http://www.huffingtonpost.ca/cedric-lizotte/munich-travel-guide_b_13438956.html - Of course I post these on my blog as well. Here: http://www.continentscondiments.com/things-munich-classics/ - Obviously the HuffPo has a huge DA, and I'll never match it. However, the original post is mine, on my blog, and not on the HuffPo. They won't - obviously - add a rel=canonical just for me and for the sake of it; they have a million other things to do. QUESTION: Should I add a rel=canonical to my own site pointing to the post on the HuffPost? What would be the advantage? Should I just leave this alone? I share blog posts on Go4TravelBlog too. Example: http://www.go4travelblog.com/dallmayr-restaurant-munich/ - but, once again, the original post is on one of my blogs. In this case, it's on another blog of mine: http://www.thefinediningblog.com/dallmayr-restaurant-in-munich/ QUESTION: Well, it's pretty much the same! Should I beg Go4TravelBlog to add a rel=canonical pointing to mine? If they refuse, what do I do? Would it be better to add a rel=canonical from my site to theirs, or do I fight it out and have a rel=canonical pointing to my own post? Why? Thanks a million for your help!
On-Page Optimization | cedriklizotte
-
Javascript(0) extension causing an excess of 404s
For some reason I am getting a duplicate version of my URLs with /javascript(0) at the end. These are creating an abundance of 404 errors. I know I am not supposed to block JS files, so what is the best way to block these? Ex: http://www.jasonfox.me/infographics/page/8/javascript(0) is a 404; http://www.jasonfox.me/infographics/page/8/ is not. Thank you.
On-Page Optimization | jasonfox.me
-
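One hedged way to handle the javascript(0) question above, assuming an Apache server (the pattern is inferred from the example URLs given):

```apache
# 301 any URL ending in /javascript(0) back to the real page instead
# of letting it 404; the parentheses must be escaped in the regex.
RedirectMatch 301 ^(.*)/javascript\(0\)$ $1/
```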
Is there a limit to the number of duplicate pages pointing to a rel='canonical' primary?
We have a situation on twiends where a number of our 'dead' user pages have generated links for us over the years. Our options are to 404 them, 301 them to the home page, or just serve back the home page with a canonical tag. We've been 404'ing them for years, but I understand that we lose all the link juice from doing this. Correct me if I'm wrong? Our next plan would be to 301 them to the home page. Probably the best solution, but our concern is that if a user page is only temporarily down (under review, etc.) it could be permanently removed from the index, or at least cached for a very long time. A final plan is to just serve back the home page on the old URL, with a canonical tag pointing to the home page URL. This is quick, retains most of the link juice, and allows the URL to become active again in future. The problem is that there could be 100,000's of these. Q1) Is it a problem to have 100,000 URLs pointing to a primary with a rel=canonical tag? (Problem for Google?) Q2) How long does it take a canonical duplicate page to become unique in the index again if the tag is removed? Will Google recrawl it and add it back into the index? Do we need to use WMT to speed this process up? Thanks
On-Page Optimization | dsumter
-
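For the dead-user-pages question above, the 301 option would look roughly like this on Apache (the /user/ path is a hypothetical placeholder, not the site's actual URL structure):

```apache
# Sketch only: permanently redirect a retired profile to the home page
# so its inbound link equity is consolidated rather than lost to a 404.
# In practice the rule would need to target *dead* profiles only, which
# usually means emitting the redirect from the application per page.
RedirectMatch 301 ^/user/some-retired-user$ /
```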
Google's mobile-friendly update. How significant is the impact for us?
Hi guys. Recently I got an email from Webmaster Tools saying our site is poorly optimised for mobile devices, and that it’s going to heavily affect rankings from April 21st. I’m worried, to say the least. We literally cannot afford a hit on traffic at the moment 😞 We rank well for niche terms like ‘customised diary’ and ‘personalised diary’. So question... Because we rank well for these very specific searches, will we still take a hit on rankings after the update? Won’t our high relevancy for those search terms be enough to keep us high in the results? Also, do you know if this change is specific to the user's device? E.g. someone on a mobile device will get mobile-friendly results, whilst users on a laptop will get different results altogether? I'm just trying to get a sense of how much this update will affect us. Any insights, suggestions, or thoughts would be greatly appreciated. Our site. Thanks in advance. This community is invaluable to us 🙂 Isaac - TOAD Diaries.
On-Page Optimization | isaac663
-
Putting content behind 'view more' buttons
Hi, I can't find an up-to-date answer to this, so I was wondering what people's thoughts are. Does putting content behind 'view more' CSS buttons affect how Google sees and ranks the data? The content isn't put behind 'view more' to trick Google. In fact, if you view the source of the page, it's all together; it's just so that products appear higher up the page. Does anyone have insight into this? Thanks in advance.
On-Page Optimization | Andy-Halliday
-
Is it redundant to include a redirect to my canonical domain (www) in my .htaccess file since I already have the correct rel="canonical" in my header?
I've been reading about the benefits of each practice, but haven't found anyone mentioning whether it's really necessary to do both. Personally I try to steer clear of .htaccess rewrites unless absolutely necessary, because I've read they can slow down a website.
On-Page Optimization | HOPdigital
-
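For the .htaccess question above, the usual mod_rewrite canonical-host rule looks like this (example.com is a placeholder). It isn't strictly redundant with rel="canonical": the 301 stops the non-www URLs from being crawled as separate pages at all, while the tag only hints at consolidation after the fact.

```apache
# Redirect every non-www request to the www host with a 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```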
Dates in URLs
I have an issue with duplicate content errors and duplicate page titles which is penalising my site. This has arisen because a number of URLs are suffixed by date(s) and have been spidered. In principle I do not want any URL with a suffixed date to be spidered. E.g.: www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm/06_07_13/13_07_13 http://www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm/20_07_13/27_07_13 Only this URL should be spidered: http://www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm I have over 10,000 of these duplicates and firstly wish to remove them from Google in bulk (not one by one), and secondly wish to amend my robots.txt file so the URLs are not spidered. I do not know the format for either. Can anyone help please?
On-Page Optimization | carbisbayhols
-
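For the date-suffix question above, a hedged robots.txt sketch (the wildcard pattern is inferred from the example URLs; Googlebot supports * in Disallow):

```text
User-agent: *
# Block any URL where extra path segments (the dates) follow ".htm".
# The clean page itself, which ends in .htm with no trailing slash,
# does not match this pattern and still gets crawled.
Disallow: /*.htm/
```

Note that blocking in robots.txt only stops future crawling; it doesn't by itself remove URLs that are already in the index.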
Can Sitemap Be Used to Manage Canonical URLs?
We have a duplicate content challenge that has likely contributed to us losing SERP positions, especially for generic keywords such as "audiobook," "audiobooks," "audio book," and "audio books." Our duplicate content exists on two levels. 1. The first level is at our web store, www.audiobooksonline.com. Audiobooks are sometimes published abridged, unabridged, on compact discs, or on MP3 CD by the same publisher. In this case we use the publisher's description of the story for each "flavor" = duplicate content. Can we use our sitemap to identify only one "flavor" so that a spider doesn't index the others? 2. The second level is that most online merchants of the same publisher's audiobook use the same description of the story = lots of duplicate content on the Web. Given that we have 11,000+ audiobook titles at our Web store, I expect Google sees us as having lots of duplicated (on the Web) content and devalues our site. Some of our competitors who rank very high for our generic keywords use the same publisher's description. Any suggestions on how we could make our individual audiobook title pages unique will be greatly appreciated.
On-Page Optimization | lbohen
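A sitemap can hint at preferred URLs, but rel="canonical" is the mechanism actually designed for the "flavor" problem above; a sketch with a made-up product path:

```html
<!-- On the abridged, MP3-CD, etc. edition pages, name one primary
     edition as canonical (this URL is a hypothetical placeholder) -->
<link rel="canonical" href="http://www.audiobooksonline.com/example-title-unabridged.html" />
```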