Pagination and links per page issue.
-
Hi all,
I have a listings based website that just doesn't seem to want to pass rank to the inner pages.
See here for an example:
http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK
I know that there are far too many links on this page and I am working on reducing the number by altering my grid classes to output fewer links.
The page also displays a number of links to other page numbers for these results. My script adds the string " - Page2" to the end of the title, description and URL when the user clicks on page two of these results.
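For illustration only (the exact markup is assumed from the description above, not taken from the live site), page two's head might then look something like:

```html
<!-- Hypothetical <head> fragment for page 2 of the hotels listing;
     the " - Page2" suffix pattern is assumed from the post above -->
<title>Hotels For Sale in the UK - Page2</title>
<meta name="description" content="Hotels for sale in the UK - Page2">
```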
My question is:
Would an excessive number (200+) of links on a page result in less PR being passed through this page (by looking spammy)?
And would using rel canonical on page numbers greater than 1 result in better trust/ranking?
Thanks in advance.
-
I believe 100 is the commonly cited limit for links on a page, and yes, the more links there are, the less PR is passed to each linked page.
But 100 links on the home page means you can have 100 child pages with 100 links on each, which puts 10,000 pages only two clicks from the home page.
As for rel=canonical: is page 2 unique? Then yes, treat it just as you would any other page.
I assume you are aware of flat link structure; if not, I think this page, though old, is a must-read:
http://www.webworkshop.net/pagerank.html
It's a long read, but very informative.
-
Rel canonical doesn't tell engines not to crawl the page (a robots.txt disallow does that), but rather tells the engine not to index the page in place of the first page of your paged results. This helps reduce duplicate content penalties and consolidates your PA onto a single URL. I could be wrong, but I imagine you would also prefer organic users to land on the first page of results rather than, say, page 6.
The result is that engines will still crawl your paginated pages and find your listings (and index those), but will only index the first page of the listing series.
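In markup terms, the consolidating approach described here would put a canonical tag on each paginated URL pointing back at the first page. A minimal sketch, assuming a hypothetical "-Page2" URL pattern:

```html
<!-- Placed in the <head> of the hypothetical page-2 URL;
     tells engines to index the first page of the series in its place -->
<link rel="canonical" href="http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK">
```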
I hope that helps to clarify things!
Andrew
-
I'm sorry, what I meant was that you should make the pagination pages canonical to themselves, like for Page 2...
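In other words, each pagination page would carry a self-referencing canonical rather than one pointing at page 1. A sketch, with the page-2 URL being an assumed pattern rather than the site's real one:

```html
<!-- In the <head> of page 2: the canonical points at page 2 itself -->
<link rel="canonical" href="http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK-Page2">
```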
-
But would rel canonical make listings on page 2 and above unindexable?
Listings are added chronologically on my site, and I still want crawlers to be able to reach adverts created years ago. These listings could be on page 100. Surely rel canonical tells engines not to crawl the page, as it is not the canonical version?
-
200 links on a page isn't that bad. Once you get to 250+ I would rethink the architecture.
Yes, you should use rel canonical on your pagination pages.
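For what it's worth, Google's guidance at the time also described marking up a paginated series with rel="prev" and rel="next" alongside a self-referencing canonical. A hedged sketch, with the page URLs assumed rather than taken from the site:

```html
<!-- Hypothetical <head> fragment for page 2 of a paginated series -->
<link rel="canonical" href="http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK-Page2">
<link rel="prev" href="http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK">
<link rel="next" href="http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK-Page3">
```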
A good way to pass ranking between deep pages like this is to have a section at the bottom that offers similar listings in the area. This way you are giving the bots multiple ways to find each listing, rather than just from one page/category. Do it like this - http://www.estatesgazette.com/propertylink/advert/kensingtonrooms_hotel-_131_137_cromwell_road_london_sw7_4du-3264453.htm. They have a "More Properties from this Advertiser" section.