Canonical pagination content
-
Hello
We have a large ecommerce site, and as you know, ecommerce sites are prone to canonical issues. I have read various sources on how best to handle canonicals on an ecommerce site, but I am still not sure.
My concern is pagination on category product listing pages. Each paginated page lists different products, but the meta data is the same, so should I canonicalize, say, page 2 or 3 to the main category page, or keep them as they are and let those pages be indexed?
Another issue is filters: when I am on any page and filter by price or manufacturer, the resulting page is basically the same, so it seems to be a duplicate content issue. Should I canonicalize those filtered results to the category page only?
So basically, if I let Google crawl my pagination content and only canonicalize the pages coming from filtered searches, would that be best practice? And would parameter handling in Google Webmaster Tools be helpful in this scenario?
Please feel free to ask if you have any queries.
regards
Carl -
Google just announced some tags to better support pagination. They say that if you have a view-all option that doesn't take too long to load, searchers generally prefer it, so you can rel=canonical from your series pages to that view-all page. However, if you don't have a view-all page, you can add these nifty rel="next" and rel="prev" tags to let Google know your page is part of a paginated series, and where the next and previous pages are.
View all: http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html
next/prev: http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
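In markup, the two options look roughly like this (the URLs are placeholders for illustration, not from the site in question):

```html
<!-- Option 1: a fast-loading view-all page exists, so each paginated
     page canonicalizes to it. In the <head> of /category-page-2.html: -->
<link rel="canonical" href="http://www.example.com/category-view-all.html">

<!-- Option 2: no view-all page, so declare the series instead.
     In the <head> of /category-page-2.html: -->
<link rel="prev" href="http://www.example.com/category-page-1.html">
<link rel="next" href="http://www.example.com/category-page-3.html">
```

Note that page 1 of a series would carry only rel="next", and the last page only rel="prev".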
-
I checked your site, and I don't know whether you have already changed it, but it looks pretty good. I have dealt with much more hardcore cases: tons of products in each category, several filters that could be freely permutated, and pagination on top of all that. There were a lot of canonical issues there, so your case is an easy ride, believe me.
Here are a few tips, along with my reasons for suggesting them:
1) cutting back your navigation on deeper pages
I just quickly checked how many links are included in your site-wide navigation with Google Spreadsheet:
=ImportXML("http://www.cnmonline.co.uk/Dehumidifiers-c-778.html","//h6/a/@href")
It returned 142 links. Whoa, that's a lot. That many links are included on every one of your pages, and the navigation is placed BEFORE your content. I had this very same issue with a client; they were hesitant to change the navigation, but eventually it helped them a lot.
The suggested solution:
- wipe out the drop menu links from deeper pages
- only link to the big categories: "Air Treatment", "Bathroom", ... "Cleaning Products"
- in the category you are in, link to the subcategories (without any JavaScript/CSS drop menu; simply list them beneath the main category with a different background than dark blue). For example, if you are in the Bathroom category, your left navigation would look like:
- Air Treatment
- Bathroom
- Electric Showers
- Mirror Demister
- Bathroom Heaters
- Heated Towel Rails
- Catering Equipment
- ...
- Cleaning Products
This way you don't have to change a lot in your navigation, and it will make your interlinking more consistent. Furthermore, if a user wants to find another category, there is the search box, the main categories, and the breadcrumb. Which leads to the next suggestion:
2) Make the breadcrumb look like a breadcrumb, not like a tab.
This is just a personal feeling, but right now it looks more like a tab than a breadcrumb. A few things add up to that impression: the "item1 | item2 | item3" layout, the links not being underlined (so they don't look or feel like links), and the breadcrumb starting next to the left navigation instead of at the left side of the site.
Suggested solution:
- move your breadcrumb to the very left side of your site, above your navigation box; you can position it to start from the left where your navigation box starts (it looks like 15px of padding from the left side of the white background)
- the text can be smaller, but make the links underlined, to look like links
- replace the pipe ("|") character with a greater-than character (">"); that's much more breadcrumb-like
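Putting those points together, the breadcrumb markup could be as simple as this (the class name and URLs here are made up for illustration):

```html
<!-- Breadcrumb at the top-left of the content area, above the navigation box -->
<p class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/Bathroom-c-2.html">Bathroom</a> &gt;
  <a href="/Bathroom-Heaters-c-2320.html">Bathroom Heaters</a>
</p>
```

Paired with a CSS rule like `.breadcrumb a { text-decoration: underline; }`, the links look like links again.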
3) Make your pagination links followable, and give the paginated pages meta robots "follow,noindex"
At the moment you have nofollowed your pagination links, which results in lower indexation of your product pages than would otherwise be possible.
Eg:
- this is cached: http://webcache.googleusercontent.com/search?q=cache:www.cnmonline.co.uk/Bathroom-Products-c-2278.html&hl=en&strip=1
- but the 2nd page isn't: http://webcache.googleusercontent.com/search?q=cache%3Awww.cnmonline.co.uk%2FBathroom-Products-c-2278-p-2.html
- and, what's even worse but not surprising, this item on the second page isn't indexed: http://webcache.googleusercontent.com/search?q=cache%3Awww.cnmonline.co.uk%2FSavoy-Shawl-Collar-Bath-Robe-Box-of-5-pr-36295.html
Suggested solution:
- let the google bot follow your pagination links, remove the rel nofollow attribute from the links
- make the pagination pages meta robots "follow,noindex"
With this change the Googlebot can follow the links through to your product pages, but won't index the paginated pages themselves. This is awesome, since you don't have to hassle with unique titles and descriptions, and the paginated pages are just lists; they don't add any value or give any reason to be indexed.
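For the paginated pages, that suggestion boils down to markup like the following (a sketch; the URLs are just examples based on the pages mentioned above):

```html
<!-- In the <head> of a paginated list page, e.g. /Bathroom-Products-c-2278-p-2.html:
     let bots follow the product links, but keep this list page out of the index -->
<meta name="robots" content="follow,noindex">

<!-- And in the body, the pagination links themselves carry no rel="nofollow" -->
<a href="/Bathroom-Products-c-2278-p-3.html">3</a>
```

Page 1 of the category (the canonical listing page) keeps its normal, indexable meta robots.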
Of course, if you had a pagination issue with reviews, it would be a whole different story: each paginated page would be valuable, since it would list valuable user-generated content rather than essentially just linking to product pages. In that case, you might create unique titles and descriptions, at least by appending "page X".
4) Your filters aren't causing any duplication/canonical issue, since they work on an AJAX basis and don't create any new URLs.
So here you shouldn't change anything, but I guess that doesn't surprise you. You can always check this by using the 'cache:' operator in Google and selecting the text-only version, for example "cache:http://www.cnmonline.co.uk/Bathroom-Heaters-c-2320.html"; click the text-only version and you will see that Price Range and Manufacturer contain no links which Google could follow, so there is no canonical problem.
Hope this helps.
-
Is the best method, then, to canonicalize the paginated URLs, including all the URLs generated by price and sorting filters, to the view-all page? Would any other members like to share their opinions?
regards
Carl
-
View all! Of course... how did I not think of that before? Thank you.
-
Concerning Pagination,
I would create a "view all" page where all the products in the category are listed, then add a rel=canonical on each paginated page pointing to the "View All" page.
That can help with both your first question and the filters issue.