Link Juice - Lots of Pages
-
I have a site, PricesPrices.com, where I'm steadily building inbound links and PageRank. I have about 4,600 pages on the site, most of which are baby products in the baby gear sector. There are many outdated items that aren't really my focus, but they do pop up in long-tail search queries from time to time.
My question is a pretty basic one. Theoretically, if a site has, say, 28/100 link juice, then as you go deeper and deeper into the site, the link juice is divided more and more. Is this really true, or just a concept?
My thought is to hide many of the products that I don't really need to focus on, thereby passing more link juice to the products that remain. But I also don't want to do that if it won't necessarily make the remaining pages rank higher or receive more link juice. I also have to keep in mind the merchandising aspect of the site and provide a good user experience: if I only have 300 products on the site, there will be a ton of unhappy people who can't find the products they are looking for.
Any thoughts and/or pointers on funneling that PageRank down into my site would be much appreciated. Thanks!
-
For one, do not use rel=nofollow on any internal links. Matt Cutts has said this in the plainest of language. Don't do it. PageRank sculpting like this does not work.
What you describe is true. If your home page has a certain amount of link juice, then the more pages your site has and the more links each page carries, the less link juice any single page receives, at least from your own internal linking. You can always increase the PageRank of an individual page with external links.
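The division described above is just the classic PageRank model: each page splits its rank evenly among its outlinks on every iteration. Here's a minimal sketch of that power iteration on a hypothetical four-page site (the page names and link structure are made up for illustration; 0.85 is the commonly cited damping factor):

```python
# Minimal PageRank power iteration on a toy site graph.
# Shows how a page's rank is split evenly among its outlinks,
# so adding more links per page dilutes what each target receives.

DAMPING = 0.85
ITERATIONS = 50

# Adjacency: page -> pages it links to (hypothetical structure).
links = {
    "home":    ["cat_a", "cat_b", "cat_c"],
    "cat_a":   ["home", "product"],
    "cat_b":   ["home"],
    "cat_c":   ["home"],
    "product": ["home"],
}

pages = list(links)
n = len(pages)
rank = {p: 1.0 / n for p in pages}  # start with rank spread evenly

for _ in range(ITERATIONS):
    new_rank = {p: (1 - DAMPING) / n for p in pages}
    for page, outlinks in links.items():
        share = DAMPING * rank[page] / len(outlinks)  # split evenly per outlink
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for page, r in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:8s} {r:.4f}")
```

Note that "product" ends up with less rank than the category pages because the home page's rank is split three ways before any of it reaches the product level, which is exactly the dilution effect the question describes.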
At the end of the day, the way you try to funnel PageRank around your site is almost entirely meaningless. Unless you have a ton of PageRank to begin with, you will put in a lot of work redoing your architecture for nothing. Think of your users first and what makes the best user experience; that matters most.
From there, simply get some external links to your most important pages. That's what I would recommend for this.