Does PR/link juice get transferred from directory -> subdirectory?
-
If website.com links to website.com/utilities, does website.com/utilities/business.html benefit in PageRank from this?
The reason I ask is, I have the option to have:
1) website.com/utilities/business.html
or
2) website.com/utilities-business
My boss would like to keep the first example and link directory --> subdirectory --> file.
I prefer the second example because its potential traffic is a source of revenue, and I want it to get link power from the homepage directly.
So, unless link juice is transferred horizontally across subdirectories, 2) would be better SEO-wise. Is this correct?
Thanks a lot.
-
Hi Jorge,
Good question! If I understand correctly, you have a choice between two file structures - is that right? Using option 1, is there anything that would actually live on website.com/utilities/, or is it simply a URL directory with no actual webpage?
If this is the case, both option 1 and 2 would pass the same amount of link juice, assuming you linked directly to each target. That said, it's usually desirable to have:
- A flat architecture, meaning you keep your structure as flat as possible, using the fewest directories you can.
- Shorter URLs, which are generally correlated with better rankings and click-through rates.
For this reason, if you are linking directly to each target, I would choose the second option if possible, although the difference it makes probably isn't that great.
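To make the "assuming you linked directly to each target" point concrete, here is a toy power-iteration PageRank sketch comparing the two structures. The page names and graphs are hypothetical, and this is only an illustrative model of how link equity flows through links, not how Google actually computes rankings:

```python
# Toy PageRank via power iteration (damping factor 0.85).
# Option 1: home -> /utilities/ -> /utilities/business.html
# Option 2: home links to both pages directly.

def pagerank(links, damping=0.85, iters=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:  # split this page's rank across its outlinks
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Option 1: juice flows home -> directory -> file
option1 = {"home": ["utilities"], "utilities": ["business"], "business": []}
# Option 2: the revenue page is linked straight from the homepage
option2 = {"home": ["utilities", "business"], "utilities": [], "business": []}

r1 = pagerank(option1)
r2 = pagerank(option2)
print(f"Option 1, business page rank: {r1['business']:.3f}")
print(f"Option 2, business page rank: {r2['business']:.3f}")
```

Note that in this simplified model the outcome depends heavily on how many outlinks each page has, since every page splits its rank among the pages it links to. That is exactly why the internal-linking choice matters more than the URL pattern itself.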
Hope this helps! Best of luck with your SEO!
-
Hi Jorge,
Option 1 would be good if you have a lot of sub-pages that can be classified under the "utilities" section. This would also help the search engines index and classify your site's pages better, benefiting you on the SEO side.
Option 2 would be recommended if you don't have a lot of content that can be classified under "utilities"
Think about the user first and what would benefit them. To directly answer your question, in my opinion, the closer a page is linked to the home page, the better it would be from an SEO perspective.