Best Way to Handle Pagination?
-
At the moment my blog is paginated like so:
/blogs > /blogs/page/2 > /blogs/page/3 etc
What are the benefits of paginating with dynamic URLs, as SEOmoz does here with /blog?page=3?
-
No, I meant use /blogs/ as the first page and /blogs/pX for the next pages, X being the page number. These pages are valid and are not 301'd, of course.
BUT /blogs/p1 is the same as /blogs/, so you should 301 it.
AND you must watch out for nonexistent pages in the pagination (/blogs/p10000, because you don't have 10,000 pages of paginated results; /blogs/p01 or /blogs/p02, because those pages should not exist).
-
I wouldn't be comfortable 301'ing those pages like you said. I want my users to be able to navigate through earlier posts rather than just being redirected to the blog homepage. Perhaps you mean a rel=canonical instead?
-
This won't make much difference; I usually use these URLs, though:
/blogs/
/blogs/p2
Remember to 301 /blogs/p1 to /blogs/, and to 404 pages with a page number that's too big (/blogs/p10000) or oddly formatted URLs (/blogs/p01).
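If you're on Apache, a rough sketch of those two rules in .htaccess might look like this (I'm assuming Apache 2.4 with mod_rewrite; too-big page numbers like /blogs/p10000 still have to 404 at the application level, since the server doesn't know how many pages actually exist):

```apache
# Sketch only: assumes the blog lives at /blogs/ and Apache 2.4 with mod_rewrite
RewriteEngine On

# /blogs/p1 duplicates /blogs/, so 301 it to the canonical first page
RewriteRule ^blogs/p1/?$ /blogs/ [R=301,L]

# Zero-padded page numbers like /blogs/p01 should not exist: return a 404
RewriteRule ^blogs/p0[0-9]*/?$ - [R=404]
```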
-
I was wondering the same thing and tried it in multiple ways. I got similar results SEO-wise, so I don't think it ultimately matters. What matters is how it looks to your users and having a proper sitemap.
-
Thanks Dan. I would robots.txt the pages, but I still want the links on those pages to pass PageRank, so I really need to noindex,follow them, and WordPress is being difficult about that!
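For reference, the tag I'm trying to get WordPress to put in the head of /page/2 and beyond is just the standard robots meta tag (shown here as plain HTML; how it actually gets output depends on the theme or SEO plugin):

```html
<!-- Keep paginated archive pages out of the index but let their links pass equity -->
<meta name="robots" content="noindex,follow">
```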
As for SEO differences, you're probably right. I think I just prefer the look of dynamic URLs to /dir/page.
-
I don't think it makes a difference either way. One advantage of your current pagination is that if you ever want to block those pages, you can block the /page/ directory in robots.txt (something like the snippet below) and that handles it. I'm not sure how SEOmoz would go about that with dynamic URLs. From an SEO perspective, it doesn't matter much either way.
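Something like this would do it, assuming the /blogs/page/X structure from the question (a sketch; adjust the path to wherever the blog actually lives):

```
# Hypothetical robots.txt entry blocking the pagination directory
User-agent: *
Disallow: /blogs/page/
```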
Related Questions
-
Dynamic Url best approach
Hi, we are currently in the process of making changes to our travel site whereby, if someone does a search, that search can be stored, and the user can take the URL, paste it into their browser, and find the search again. The URL will be dynamic for every search, so in order to stop duplicate content I wanted to ask what the best approach to creating the URLs would be. An example of the URL is: package-search/holidays/hotelFilters/?depart=LGW&arrival=BJV&sdate=20150812&edate=20150819&adult=2&child=0&infant=0&fsearch=first&directf=false&nights=7&tsdate=&rooms=1&r1a=2&r1c=0&r1i=0&&dest=3&desid=1&rating=&htype=all&btype=all&filter=no&page=1 I wanted to know if people have experience with something like this and what would be the best option for SEO. Will we need to create the URL with a # (as I've read this stops Google crawling after the #)? Block the folder in robots.txt? And are there any other areas I should be aware of in order to stop duplicate content and 404 pages once the URL/holiday search is no longer valid? Thanks, E
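On the "block the folder in robots.txt" option, a minimal sketch, assuming all of the stored-search URLs live under /package-search/ as in the example above:

```
# Sketch: keep crawlers out of the dynamic search-results folder
User-agent: *
Disallow: /package-search/
```

Bear in mind robots.txt only stops crawling; URLs that are already indexed may also need a noindex or canonical before they drop out.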
Technical SEO | | Direct_Ram0 -
What is the best way to deal with https?
Currently, the site I am working on is using HTTPS throughout the website. The non-HTTPS pages are redirected through a 301 redirect to the HTTPS versions; this happens for all pages. Is this the best strategy going forward? If not, what changes would you suggest?
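For context, a site-wide 301 of that kind is often implemented with a single rule; a sketch, assuming Apache with mod_rewrite (the actual server setup isn't stated in the question):

```apache
# Sketch: force HTTPS on every request with one 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```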
Technical SEO | | adarsh880 -
What's best practice for blog meta titles?
I have the option of placing meta titles on the actual blog posts, or on the blog category on my site. Should I have separate meta titles for each blog post, or bundle them under a category and try to drive traffic to the category? Can anyone help with best practice?
Technical SEO | | Lubeman0 -
How to handle lots of URL parameters
Howdy mozzers I'm hoping you can lend some advice. I'm dealing with a site now with loads of URL parameters. It's a vehicle dealership group which hosts its entire inventory from multiple locations on one page, sorted by parameters. Example inventory URL: www.dealership.com/car-inventory.asp?pa=&ns=10&so=m&sor=DESC&ma=&mod=&mt=&yr=&bs=&pr=&t=used&ln= Where pa (page no.); ns (number of vehicles shown); so (sort by condition); sor (sort order); ma (make); mod (model); yr (year); bs (body style); pr (price range); t (type - new, used, etc.); ln (location no.). As you can imagine this generates a gazillion URLs (or slightly less). Any thoughts on best canonicalization options? Thanks as always
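One canonicalization option, if the filtered and sorted views aren't meant to rank on their own, is to point every parameterized variation at the base inventory page; a sketch using the example domain from the question:

```html
<!-- Hypothetical canonical placed on every filtered/sorted inventory view -->
<link rel="canonical" href="http://www.dealership.com/car-inventory.asp">
```

Whether that's the right call depends on whether any of the filtered views (by make or location, say) should be indexable landing pages in their own right.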
Technical SEO | | jamesm5i0 -
What is the best URL designed for a product page?
Should a product page URL include the category name and subcategory name in it? Most ecommerce platforms, it seems, are designed to have the category and sub-category names included in the URL, followed by the product name. If that is the case, and the same product is listed in more than 1 category and sub-category, will that product have 2 unique URLs and as a result be treated as 2 different product pages by Google? And since it is the same product in two places on the site, won't Google treat those 2 pages as having duplicate content? So is it best not to have the category and sub-category names in the URL of a product page? And lastly, is there a preferred character limit that a URL should stay under? Thanks!
Technical SEO | | gallreddy0 -
Is there an easier way from the server to prevent duplicate page content?
I know that using either 301 or 302 will fix the problem of duplicate page content. My question would be; is there an easier way of preventing duplicate page content when it's an issue with the URL. For example: URL: http://example.com URL: http://www.example.com My guess would be like it says here, that it's a setting issue with the server. If anyone has some pointers on how to prevent this from occurring, it would be greatly appreciated.
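For what it's worth, on Apache this is usually handled with a single 301 rewrite at the server level; a sketch, assuming www is the preferred version and using the example.com placeholder from the question:

```apache
# Sketch: 301 the bare domain to the www version (assumes Apache with mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```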
Technical SEO | | brianhughes2 -
Include pagination in sitemap.xml?
Curious on people's thoughts around this. Since restructuring our site we have seen a massive uplift in pages indexed and organic traffic with our pagination, but we haven't yet included a sitemap.xml. It's an ancient site that never had one. Given that Google seems to be loving us right now, do we even need a sitemap.xml, aside from the analytical benefits in WM Tools? Would you include pagination URLs (don't worry, we have no duplicate content) in the sitemap.xml? Cheers.
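For reference, including the paginated URLs is just a matter of listing them alongside everything else; a minimal sketch (the paths are placeholders, not the actual site structure):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/blog/</loc></url>
  <url><loc>http://www.example.com/blog/page/2/</loc></url>
  <url><loc>http://www.example.com/blog/page/3/</loc></url>
</urlset>
```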
Technical SEO | | sichristie0 -
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again, for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you: add the meta noindex,follow tag to each URL you want de-indexed, use GWT to help remove the pages, and wait for Google to crawl again? If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt: DISALLOW */beerbottles/ Or add this line: DISALLOW: /beerbottles/ "To add the * or not to add the *, that is the question" Thanks! Dave
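For reference on the wildcard question: a plain Disallow: /beerbottles/ only matches URLs whose path starts with /beerbottles/, so it wouldn't catch /beers/brandofbeer/beerbottles/1. To block the nested directory you'd want the wildcard form, which Googlebot supports; a sketch:

```
# Sketch: block anything under a /beerbottles/ segment wherever it appears in the path
User-agent: Googlebot
Disallow: /*/beerbottles/
```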
Technical SEO | | goodnewscowboy0