Please help me articulate why broken pagination is bad for SEO...
-
Hi fellow Mozzers.
I am in need of assistance. Pagination is and has been broken on the Website for which I do SEO in-house...and it's been broken for years.
Here is an example: http://www.ccisolutions.com/StoreFront/category/audio-technica
This category has 122 products, broken down to display 24 at a time across paginated results. However, you will notice that once you enter pagination, all of the URLs become this: http://www.ccisolutions.com/StoreFront/IAFDispatcher
Even if you hit "Previous" or "Next" or your browser back button, the URL stays: http://www.ccisolutions.com/StoreFront/IAFDispatcher
I have tried to explain to stakeholders that this is a lost opportunity. If a user or Google were to find that a particular paginated result contained a unique combination of products more relevant to a searcher's query than the main page in the series, Google couldn't send the searcher to that page because it doesn't have a unique URL. In addition, this non-unique URL is most likely bottle-necking the internal flow of page authority. This is not to mention that 38% of our traffic in Google Analytics is being reported as coming from this page...a problem because this page could be any one of several hundred on the site and we have no idea which one a visitor was actually looking at.
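For comparison, a crawlable implementation would give each page in the series its own URL and declare its place in the sequence. A hypothetical sketch (the ?page= parameter and URLs are illustrative, not our actual implementation):

```html
<!-- Page 2 of the series, at its own unique URL, e.g. -->
<!-- http://www.ccisolutions.com/StoreFront/category/audio-technica?page=2 -->
<head>
  <link rel="prev" href="http://www.ccisolutions.com/StoreFront/category/audio-technica?page=1">
  <link rel="next" href="http://www.ccisolutions.com/StoreFront/category/audio-technica?page=3">
</head>
```

With unique URLs like these, Google could index and rank any page in the series, analytics could attribute traffic to the exact page a visitor viewed, and internal link equity could flow through the series instead of pooling on one generic dispatcher URL.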
How do I articulate the magnitude of this problem for SEO? Is there a way I can easily put it in dollars and cents for a business person who really thinks SEOs are a bunch of snake oil salesmen in the first place?
Does anyone have any before and after case studies or quantifiable data that they would be willing to share with me (even privately) that can help me articulate better how important it is to address this problem. Even more, what can we hope to get out of fixing it? More traffic, more revenue, higher conversions?
Can anyone help me go to the mat with a solid argument as to why pagination should be addressed?
-
Thanks so much Gianluca for this thoughtful and valuable advice.
Yes, page load speed is definitely something that's been a concern. This is why we went back to 24 products displayed per page instead of 50 a few months ago. However, since then we've made some significant improvements in page load times and we think we can probably go up to 100 products per page and still be fairly fast. We will have to test.
On the up side, we only have 7 categories with more than 100 products, and only 24 with more than 50. The biggest problem we have affecting speed isn't so much the images. It's the fact that the website makes real-time pricing calls to our business back end for every product, every time the page loads. This may be a sticking point.
I have also thought about the canonical tag problem. Of course, it's a problem now too, but if the "View All" page just ends up getting that generic URL and no proper canonical tag...then we really are back to square one.
The possibility of no-indexing all of the categories that are related to paginated series is something that crossed my mind yesterday, so it's interesting that you mentioned that. While it would solve certain issues, wouldn't this be a problem in terms of having valuable content in Google? Granted, some of our category pages are purely there for navigation purposes, in which case, I suppose there's no harm in no-indexing them. However, with the roll-out of Hummingbird I began looking at our category pages as valuable opportunities for "topics" pages that could act as a hub for visitors searching for products or information around specific uses or brands.
Wouldn't there be a significant risk in losing valuable market share for key terms by removing so many category pages from Google's index?
If I am understanding your last suggestion you are saying to have the page default to "View All" and noindex everything else...You are right, not a great scenario, but you are also right in that this may be the only solution given management's steadfast stance on not wanting to pay to fix it.
Lots to think about, but your comment has been extremely helpful. Thanks again!
-
Dana,
just a few tips about the view all option.
While it surely is the best solution, even when real pagination exists, you should always remember a few things:
- a view all list with tens of snippets (photo + text + link) can be like a block of reinforced concrete for the PageSpeed of your site: imagine those listings with 100+ products.
In that case a view all page may not be the correct solution, because Googlebot may never get through all the code, giving up before it follows all the URLs present in the view all page.
-
in fact, the ideal would be a view all page that loads completely within 4 seconds
-
for that reason, if the only solution you have is a view all page, then you should seriously consider implementing lazy loading for the images, so that the written content (links included) gets priority in rendering and Google will see it all, while images are loaded only when needed (i.e., when a user scrolling down arrives at the image that must appear).
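A minimal sketch of that lazy loading idea (assuming browser support for the native loading attribute; older setups would need a JavaScript fallback, and the markup here is purely illustrative):

```html
<!-- Product text and links render immediately and stay crawlable; -->
<!-- the image is fetched only when the user scrolls near it -->
<div class="product">
  <a href="/StoreFront/category/audio-technica/some-product">Product name and description</a>
  <img src="product-thumb.jpg" loading="lazy" alt="Product photo" width="200" height="150">
</div>
```

The key point is that the links and text are in the initial HTML, so Google sees them all even if the images are deferred.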
Then there's a doubt - a big one: if the paginated list always has the URL http://www.ccisolutions.com/StoreFront/IAFDispatcher, how can you set its canonical to the view all of http://www.ccisolutions.com/StoreFront/category/audio-adapters-audio-connectors when it should also have as canonical http://www.ccisolutions.com/StoreFront/category/audio-technica?
Maybe the only solution you have is this:
-
forcing the view all URL to be the default one;
-
all the paginated pages (including the first page) are noindexed
Not really a wonderful solution, but - from what I understood about the stubbornness of your bosses - the only one. But one that must be executed properly in order to avoid worse issues.
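A sketch of that setup (hypothetical markup, assuming the category URL itself can be made to serve the full listing):

```html
<!-- The category URL serves the complete "view all" listing by default, e.g. -->
<!-- http://www.ccisolutions.com/StoreFront/category/audio-technica -->
<!-- and is the only indexable version. -->

<!-- Every paginated page in the series, including the first, carries: -->
<meta name="robots" content="noindex, follow">
```

Using "noindex, follow" rather than plain "noindex" keeps the paginated pages out of the index while still letting crawlers follow the product links on them.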
-
....is like hiring an astronaut, handing them a box of toothpicks and some gunpowder and saying you expect them to land on the moon
ha ha ha... that is really funny.
Thanks for the laugh.
-
Thanks so much EGOL. I always love your candor.
Believe me, when I went home last night to ponder solutions to this problem, everything you mentioned crossed my mind. It was a thoroughly frustrating conversation to have. It simply amazes me that Google can tell the world very clearly all the things that will help their sites do better in the SERPs, yet people continue to ignore all of that advice, do what they want (or whatever is "easy" or cheap), and then whine about why their sites aren't doing well.
Making the commitment to hire an in-house SEO without equipping them with good tools and refusing to take their advice is like hiring an astronaut, handing them a box of toothpicks and some gunpowder and saying you expect them to land on the moon.
-
Thanks so much Andy. Agreed on all points. I think I have convinced the powers that be that at the very least we should add a "View All" option. This would give both end-users and Google a useful means to access all of the products in a category at once, without having to resort to pagination if they didn't want to. It is something we can add fairly easily and at little to no cost. Since only 8 of our category pages have more than 100 products, and none go higher than 200, this seems like a very reasonable compromise, at least for now.
I very much appreciate you taking the time to respond
It was a frustrating day and a frustrating conversation to have to have.
-
I don't have an answer for you... but I will say that it would really bother me that I would have to jump through hoops with a pogo stick to get stakeholders to want to address this.
I'll skip my rant and get right to the analysis.....
What's going on? Are these stakeholders: A) dumb? B) lazy? C) short of resources? D) frying bigger fish?
If it is A or B then I am probably looking for another job before the company goes bankrupt.
If it is D then I might decide if I should resign and go into competition with them to cash in on the bonanza.
If it is C then you have a dilemma that could involve going to the stakeholders' boss, other creative solutions, or looking for a new job.
Really, you should not have to ask this question.
-
Hi Dana,
I can certainly understand your problem, and whilst I have no data to give you, you should certainly be looking at this not only as a lost opportunity from an SEO perspective, but also as an inability to report back just how well the site is converting traffic. Without this data, no site can see where changes can be made and where improvements will result in an increase in revenue.
I would also look at the fact that anything that is broken on a site might not be having an observable negative issue right now, but what happens with the next algorithm update? Will something be spotted at some point? Do you want to wait for Google to penalise the site before realising it should have been corrected?
Also, does it make for a poor user experience? If someone comes to the site and then bookmarks one of these pages, how are they going to get back again? Are they then likely to just navigate away because they didn't land where they intended?
I am sure there will be a loss in revenue from this - quantifying it will be difficult for an outsider though. There is no doubt that this should be resolved, and I would say ASAP as well.
-Andy