Google Pagination Changes
-
With Google recently coming out and saying they're essentially ignoring paginated pages, I'm reconsidering the link structure of our new, soon-to-launch ecommerce site (we're moving from an old site to a new one with an identical URL structure, minus a few 404s).
Currently our new site shows 20 products per page, but with this change by Google, any products on pages 2, 3 and so on will suffer, because Google treats each of those pages as an entirely separate page rather than as an extension of the first.
The way I see it, I have one option: show every product in each category on page 1.
I have lazy loading installed on our new website, so it only loads what the user can see on screen and loads more products as they scroll down. But how will Google interpret this? Will Google simply see all 50-300 products per category and give the site a bad page-load score because it doesn't know the lazy loading is in place? Or will it know and account for it?
Is there anything I'm missing?
-
It's likely that they will be valued a bit less, but the effects shouldn't be drastic. Even if you just had one massive page with all of the products on it, the ones at the top would likely get more juice anyway.
If it's a really big concern, think about a custom method of sorting your products.
-
Thank you very much for taking the time to respond so eloquently.
"If all the products are visible in the base, non-modified source code (right click the page, then click 'view source' - is the data there?) then there is a high likelihood that Google will see and crawl it."
I can confirm that each product does in fact appear in the source data, so as you say, Google will crawl it, which is somewhat of a relief.
Does this then mean that, regardless of which page the products appear on, Google will simply ignore the pagination and treat each product the same?
The thing I am trying to avoid is products on pages 2, 3 and so on being valued less.
-
This is a great technical SEO query!
What you have to understand is that whilst Google 'can' crawl JS, they often don't. They don't do it for just anyone, and even then they don't do it all of the time. Google's main mission is to 'index the web' - and on that account their index of the web's pages, whilst vast, is still far from complete.
Crawling JavaScript necessitates the use of a headless browser (if you were using Python to script such a thing, you'd be using the Selenium or Windmill modules). A browser must open (even if it does so invisibly) and 'run' the JavaScript, which creates more HTML - and that HTML can only be crawled **after** the script has executed.
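To picture what that involves, here's a minimal Python sketch using Selenium with headless Chrome (the URL and browser options are my own illustrative assumptions, not anything from this thread):

```python
# Minimal sketch of a headless-browser crawl: the browser opens invisibly,
# runs the page's JavaScript, and only then exposes the rendered HTML.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless=new")  # no visible browser window

driver = webdriver.Chrome(options=options)
try:
    driver.get("https://www.example.com/category/tvs")  # hypothetical category page
    rendered_html = driver.page_source  # HTML available only *after* script execution
finally:
    driver.quit()

print(len(rendered_html))
```

All of that browser start-up and script execution is where the extra time goes.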
On average this takes around 10x longer than scraping the basic, non-modified source code. Ask yourself: would Google take a 10x efficiency hit, on an already incomplete mission, for 'everyone' on the web? The answer is no (I see evidence of this every day across many client accounts).
Let's answer your question. If all the products are visible in the base, non-modified source code (right click the page, then click "view source" - is the data there?) then there is a high likelihood that Google will see and crawl it.
If the data (code) only exists when you right click and 'inspect element' - and not in "view source" - then the data only exists in the 'modified' source code (not the base source). In that scenario, Google would be extremely unlikely to crawl it, or at least unlikely to crawl it consistently. If it's a very important page on a very important site (Coca-Cola, M&S, Barclays, Santander) then Google may go further.
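If you'd rather check that programmatically than eyeball "view source", a quick sketch along these lines does the same job (the URL and product name are placeholders of my own, not from your site):

```python
# Does a known product name appear in the base, non-modified source code
# (what "view source" shows), before any JavaScript has run?
import requests

url = "https://www.example.com/category/tvs"   # hypothetical category page
product_name = "Example 55-inch TV"            # hypothetical product

base_html = requests.get(url, timeout=10).text

if product_name in base_html:
    print("In the base source - crawlable without JavaScript execution.")
else:
    print("Only in the modified source - Google would need to render the JS first.")
```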
For most of us, the best possible solution is to get the data we want crawled into the non-modified source code. This can be achieved by using JS only for the visual changes (but not the structure) or by adopting SSR (Server-Side Rendering).
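To give a rough idea of what SSR means in practice, here's a minimal Flask/Jinja sketch of my own (any server-side stack works the same way): the product list is written into the HTML on the server, so every product already sits in the base source code that Google fetches, and JavaScript such as lazy loading only has to decide how and when to display it.

```python
# Minimal server-side rendering sketch: product markup is generated on the
# server, so it exists in the base source code before any JavaScript runs.
from flask import Flask, render_template_string

app = Flask(__name__)

PRODUCTS = [  # in a real shop this would come from the product database
    {"name": "Example 55-inch TV", "url": "/products/example-55-tv"},
    {"name": "Example Soundbar", "url": "/products/example-soundbar"},
]

TEMPLATE = """
<ul class="product-grid">
  {% for p in products %}
    <li><a href="{{ p.url }}">{{ p.name }}</a></li>
  {% endfor %}
</ul>
"""

@app.route("/category/tvs")
def category_page():
    # Every product is in the HTML the server returns ("view source"),
    # so crawling it does not depend on JavaScript being executed.
    return render_template_string(TEMPLATE, products=PRODUCTS)

if __name__ == "__main__":
    app.run()
```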
Hope that helps
Related Questions
-
Site Migration - Pagination
Hi, We are migrating our website and an issue we are facing is how to handle paginated content in our categories. Our new website will have the same structure but with different URLs. Should we 301 redirect all the paginated content (if crawled by Google) to the URL of the main category? To put this into an example: Old URLs: www.example.com/technology/tvs (main category of TVs & also page 1), www.example.com/technology/tvs?v=0&page=2 (page 2 of TVs). New URLs: www.example.com/soundvision/tvs (main category of TVs & also page 1), www.example.com/soundvision/tvs?page=2 (page 2 of TVs). Should we redirect all of the old TV URLs (including the paginated ones) to www.example.com/soundvision/tvs? There is no rel=next/prev tag on our site and no canonicals. Also, there is a view-all-products page in each category, BUT it doesn't contain all the products (max. is 100 per page - yes, the view-all page is also paginated). The same view-all-products page (paginated) will exist in the new website also. I checked Google Search Console, and Google has decided to treat the first page, www.example.com/technology/tvs, as the canonical page. Also, all the organic traffic to our categories goes to these pages (main category page - 1st page). I would appreciate any thoughts on this.
Intermediate & Advanced SEO | HellasSITES
-
Can you index a Google doc?
We have updated and added completely new content to our state pages. Our old state content is sitting in our Google Drive. Can I make these docs public to get them indexed and provide a link back to our state pages? In theory it sounds like a great link building strategy... TIA!
Intermediate & Advanced SEO | LindsayE
-
Is Google able to see child pages in our AJAX pagination?
We upgraded our site to a new platform the first week of August. The product listing pages have a canonical issue: page 2 of the paginated series has a canonical pointing to page 1 of the series. Google lists this as a "mistake" and we're planning on implementing best practice (https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html). We want to implement rel=next/prev. The URLs are constructed using a hash fragment and a string of query parameters. You'll notice that these parameters are &parameter:value vs &parameter=value. /products#facet:&productBeginIndex:0&orderBy:&pageView:grid&minPrice:&maxPrice:&pageSize:& None of the URLs are included in any indexed URLs because the canonical is the page URL without the AJAX parameters, so these results are expected. Screaming Frog only finds the product links on page 1 and doesn't move to page 2. The link to page 2 is AJAX, and Screaming Frog only crawls AJAX if it's in Google's deprecated recommendations, as far as I know. The "facet" parameter is noted in Search Console, but the example URLs are for an unrelated URL that uses the "?facet=" format. None of the other parameters have been added by Google to the console. Other unrelated parameters from the new site are in the console. When using the Fetch as Google tool, Google ignores everything after the "#" and shows only the main URL. I tested to see if it was just pulling the canonical of the page for the test, but that was not the case. None of the "#facet" strings appear in the Moz crawl, and I don't think Google is reading the "productBeginIndex" to specify the start of page 2 and so on. One thought is to add the parameter in Search Console, remove the canonical, and test one category to see how Google treats the pages. Making the URLs SEO friendly (/page2.../page3) is a heavy lift. Any ideas how to diagnose/solve this issue?
Intermediate & Advanced SEO | Jason.Capshaw
-
My site shows a 503 error to Googlebot, but I can see the site fine. Not indexing in Google. Help
Hi, This site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk - looking into it, it seems to be giving a 503 error to the Google bot. I can see the site, I have checked the source code, checked robots, and did have a sitemap param. but removed it for testing. GWMT is showing 'unreachable' if I submit a sitemap or fetch. Any ideas on how to remove this error? Many thanks in advance
Intermediate & Advanced SEO | SolveWebMedia
-
Recent Algo Change
I was wondering if anybody can shed some light on any recent changes to the Google algorithm in Australia. A competitor, www.manwithavan.com.au, has always been number 1 for the most competitive search term in our industry, "removalists melbourne". However, in the last week, they have fallen out of the SERPs and are now (according to Moz) ranking outside the top 50. As far as I can tell, they have a really well optimized site with good structure, great text and updated content. They are very active within social media circles and have some really good external links. Can anybody tell me why they would have been hit so badly? The reason I ask is that I want to make sure we don't make the same mistake. Any feedback would be greatly appreciated.
Intermediate & Advanced SEO | RobSchofield
-
Limit on Google Removal Tool?
I'm dealing with thousands of duplicate URLs caused by the CMS... so I am using some automation to get through them. What is the daily limit? Weekly? Monthly? Any ideas? Thanks, Ben
Intermediate & Advanced SEO | bjs2010
-
Will changing Google Places address hurt rankings?
I have a client transferring ownership of their service business (photo booth rental). The current listed address will change, so my main concern is preserving the rankings during the transition. Should I change the Google Local listing to a new physical address, or change it to "serve a surrounding area"? It seems best to set as "serving a surrounding area", but I know Google is really weird about making local listing changes. I've seen and heard about countless listings falling completely off the map after being updated. Any advice appreciated.
Intermediate & Advanced SEO | Joes_Ideas
-
So What On My Site Is Breaking The Google Guidelines?
I have a site that I'm trying to rank for the keyword "Jigsaw Puzzles". I was originally ranked around #60 or thereabouts, and then all of a sudden my site stopped ranking for that keyword (my other keyword rankings stayed). Contacted Google via the site reconsideration request and got the general response... So I went through and deleted as many links as I could find that I thought Google may not have liked... heck, I even removed links that I don't think I should have, JUST so I could have this fixed. I responded with a list of all the links I removed and also any links that I've tried to remove but couldn't, for whatever reasons. They are STILL saying my website is breaking the Google guidelines... mainly around links. Can anyone take a peek at my site and see if there's anything on it that may be breaking the guidelines? (Because I can't.) Website in question: http://www.yourjigsawpuzzles.co.uk UPDATE: Just to let everyone know that after multiple reconsideration requests, this penalty has been removed. They stated it was a manual penalty. I tried removing numerous different types of links but they kept saying no, it's still breaking the rules. It wasn't until I removed some website directory links that they removed this manual penalty. Thought it would be interesting for some of you guys.
Intermediate & Advanced SEO | RichardTaylor