How many internal links on one page?
-
I have seen Matt Cutts' video about links per page, and I know that too many links "may" harm the flow of link juice. But what should e-commerce sites do?
We have category pages with a few thousand products in each of them. Doesn't linking to every one of them dilute the PageRank flow?
We could use pagination, but doesn't that hurt the user experience when a visitor has to click 10 pages deep to reach a product? And won't Googlebot refresh that information less frequently because it sits at the deepest level of our site?
Our goal now is to make all our products load like Facebook's infinite-scroll feed. We know that Google doesn't execute Ajax to discover more links, so robots and all the users that don't have JavaScript would see the paginated results instead. Is this a good way to present all the products and links?
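For illustration, here is a rough sketch of what we have in mind (all URLs, IDs, and markup below are hypothetical): plain HTML pagination that crawlers and non-JavaScript users can follow, progressively enhanced into infinite scroll when JavaScript is available.

```html
<!-- Rough sketch of the idea; all URLs, IDs, and markup are hypothetical. -->
<ul id="products">
  <li><a href="/category/product-1">Product 1</a></li>
  <li><a href="/category/product-2">Product 2</a></li>
</ul>

<!-- Plain HTML pagination: crawlers and non-JS users follow this link. -->
<nav>
  <a id="next-page" href="/category?page=2">Next page</a>
</nav>

<script>
  // With JavaScript, fetch the next page in the background and append its
  // products, turning the crawlable pagination into infinite scroll.
  var loading = false;
  window.addEventListener('scroll', function () {
    var next = document.getElementById('next-page');
    if (!next || loading) return;
    // Only load once the visitor nears the bottom of the page.
    if (window.innerHeight + window.scrollY < document.body.offsetHeight - 200) return;
    loading = true;
    fetch(next.href)
      .then(function (res) { return res.text(); })
      .then(function (html) {
        var doc = new DOMParser().parseFromString(html, 'text/html');
        // Append the next page's products to the current list.
        doc.querySelectorAll('#products li').forEach(function (li) {
          document.getElementById('products').appendChild(document.importNode(li, true));
        });
        // Advance the link to the following page, or drop it on the last page.
        var newNext = doc.getElementById('next-page');
        if (newNext) { next.href = newNext.href; } else { next.remove(); }
        loading = false;
      });
  });
</script>
```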
-
But rel=canonical is offered only so search engines can learn whether a page is a duplicate or the original.
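For example, a duplicate page (say, a printer-friendly version; the URL here is made up) would point search engines at the original like this:

```html
<!-- Placed in the <head> of the duplicate page; URL is hypothetical. -->
<link rel="canonical" href="https://www.example.com/category/product-1">
```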
Is there anyone else who could share their experience with this topic?
-
There is always a chance it could be seen as cloaking, but I think this is a legitimate use case. You aren't blatantly trying to manipulate the search engines for your own gain; you are trying to help users first and please the search engines second, so I think it should work fine.
-
We thought about using rel=next/prev and rel=canonical. But wouldn't paginated HTML pages that claim all the same products live on the main category page (where Googlebot can't expand the list to see them) be lying to Google?
It would be like saying all my products are on a page where bots can't see them. Isn't that called cloaking?
-
I would suggest using pagination. If you let visitors sort by best selling, most reviewed, newest, lowest price, etc., that would offset the potential downside of having multiple pages if you do decide to paginate.
However, if you are planning to use Ajax to show the paginated results, that could be a good solution too, but the pagination navigation would still have to exist in regular HTML so the search engines could spider it.
I'd also recommend using rel=canonical together with rel=prev/rel=next if you do use pagination.
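For reference, the head of page 2 in a paginated category would look something like this (the domain and URL structure are placeholders). Note that each paginated page canonicalizes to itself rather than to page 1:

```html
<!-- Hypothetical <head> markup for page 2 of a paginated category. -->
<link rel="canonical" href="https://www.example.com/category?page=2">
<link rel="prev" href="https://www.example.com/category?page=1">
<link rel="next" href="https://www.example.com/category?page=3">
```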
Scott O.
P.S. If you feel I have been helpful, please consider marking this as "Good Answer".