Best Practices for Pagination on an E-commerce Site
-
One of my e-commerce clients has a script enabled on their category pages that automatically displays more products as you scroll down. They use this instead of numbered pages (1, 2, etc.) and a "view all" option.
I'm trying to decide whether to insist that they change back to the traditional method of multiple pages with a view-all button, and then implement rel="next", rel="prev", etc.
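For anyone unfamiliar with that markup: the traditional setup puts something like the following in the `<head>` of each component page in the series (the URLs here are purely illustrative, not the client's actual site):

```html
<!-- In the <head> of page 2 of a paginated category (illustrative URLs) -->
<link rel="prev" href="https://example.com/widgets?page=1">
<link rel="next" href="https://example.com/widgets?page=3">
```

Page 1 would carry only a rel="next" link, and the last page only a rel="prev", so crawlers can see the pages as one series.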
I think the current auto method is disorienting for the user, but I can't figure out if it's the same for the spiders.
Does anyone have any experience with this, or thoughts?
Thanks!
-
Thanks Everett - I was coming to that conclusion, but really just needed some back-up. I appreciate your response!
Emily
-
Smallbox,
This reply assumes that the "auto scroll feature" you mention is a form of what is also known as "progressive loading," though progressive loading can also be used to load unseen files like scripts and tracking pixels. On-scroll progressive loading of page content is a standard that more and more websites are using, including Twitter and Google+, so web users are getting very used to it. Most don't even notice that it's happening. Thus, I don't think it's a problem.
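To Emily's question about spiders: one common way to keep on-scroll loading crawler-friendly (a rough sketch of the general technique, not necessarily how the client's script works) is progressive enhancement, where the page ships an ordinary paginated link that crawlers can follow, and a script intercepts it to load the next batch in place for users:

```html
<!-- A plain link crawlers can follow; the script upgrades it for users -->
<div id="products"><!-- product listings --></div>
<a id="more" href="/widgets?page=2">Next page</a>
<script>
  document.getElementById('more').addEventListener('click', function (e) {
    e.preventDefault();                        // keep the user on this page
    var link = e.currentTarget;
    fetch(link.href)                           // request the next page's HTML
      .then(function (r) { return r.text(); })
      .then(function (html) {
        // Append the next page's product markup, then point the
        // link at page 3 so the next scroll/click keeps going.
        document.getElementById('products').insertAdjacentHTML('beforeend', html);
        link.href = link.href.replace(/page=\d+/, 'page=3');
      });
  });
</script>
```

If the client's script works this way, spiders see normal pagination even though users see infinite scroll; if the extra products only ever arrive via script with no crawlable URL behind them, spiders may never see them at all.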
However, if you're talking hundreds or thousands of products, as opposed to dozens, you may want to provide an alternate viewing option, such as pagination. In that case, you would probably use the View All Canonical method discussed here, provided your "view all" pages load all of the above-the-fold content within about 3-4 seconds: http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html
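In markup terms, the View All Canonical method boils down to each component page in the series pointing its canonical at the view-all version (example.com is a placeholder):

```html
<!-- In the <head> of each component page (page 1, 2, 3, ...) -->
<link rel="canonical" href="https://example.com/widgets/view-all">
```

That consolidates the paginated pages' signals onto the view-all page, which is why its load time matters so much.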
-
Thanks! I appreciate your response and understand what you're saying.
In this case, all of the products on the page are relevant and in the same category. And I get that it's best to have only one actual page...I'm just not sure about this auto-scroll feature.
I think I will leave it alone for now, and just keep an eye on our metrics/analytics.
Thanks!
-
It's easier for the spiders because it's all on one page. If the pages are very long, they can be divided into sub-segments; all of our existing e-commerce sites use a search tab at the top that lets visitors jump straight to the section they want.
A couple of other important points: if the content is interesting, a customer scrolling through one long page never has to click to another page and start over. That can be valuable with the proper calls to action.
All the content being on one page can be a problem for "relevancy" if it combines multiple unrelated products, for the same reason you would create smaller ad groups within AdWords. If a page about building supplies has hammers and gas-powered equipment on the same page, you won't get the same "relevancy" value as you would from separate pages with separate content and separate sitemap designations (one page for hand tools such as hammers, another for gas-powered equipment such as construction heaters, etc.).
Hope this is helpful.