Does page "depth" matter
-
Would it have a negative effect on SEO to have a link from the home page to this page...
http://www.website.com/page1deep/page2deep
rather than to this page
http://www.website.com/page1deep
I'm hoping that made some sense. If not I'll try to clarify.
Thanks,
Mark
-
I had a quick scan through the article, and it looks like they are talking about it from a usability angle; I am talking about it from a link equity ("link juice") angle. PageRank passes only about 85% through a link, so if the home page passes a PR of 1 through each of its links, a page one click away gets only 0.85, two clicks away 0.72, three clicks 0.61, and four clicks 0.52.
It gets a bit more complicated when the pages link back to the home page, though they don't pass back as much either.
I have a simple explanation here:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
And here is a calculator that you can try to see how it works out.
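To make the arithmetic concrete, here is a minimal Python sketch. The link graph, page names and iteration count are hypothetical, and the 0.85 damping factor is simply the figure from the original PageRank paper; real search engines use far more complex models, so treat the numbers as illustrative only.

```python
# Rough version of the figures quoted above: if each link passes on roughly
# 85% of a page's PageRank, a page n clicks from the home page receives about
# 0.85**n of whatever the home page could pass along.
for clicks in range(1, 5):
    print(f"{clicks} click(s) away: {0.85 ** clicks:.2f}")
# 1 click(s) away: 0.85
# 2 click(s) away: 0.72
# 3 click(s) away: 0.61
# 4 click(s) away: 0.52

# When pages also link back to the home page, equity circulates, which is why
# PageRank is normally computed iteratively. A toy version over a hypothetical
# four-page site:
def simple_pagerank(links, iterations=50, damping=0.85):
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)  # split equity among outgoing links
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

site = {
    "home":  ["page1"],
    "page1": ["page2", "home"],
    "page2": ["page3", "home"],
    "page3": ["home"],
}
for page, score in simple_pagerank(site).items():
    print(f"{page}: {score:.3f}")  # rank shrinks with each extra click from home
```

In this toy run, each page deeper in the chain ends up with less PageRank than the one before it, even though every page links back to the home page.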
-
"if page domain.com/rootpage.htm takes 4 clicks to get to it from the home page then you have a problem."
I've always believed it's best to keep the clicks down, but I recently read some research that shows many clicks are not necessarily a problem: http://www.uie.com/articles/three_click_rule/ - though the research is from 2003 and I'd say people have become more impatient and expectant of more instant results since then, it's still interesting and proves that 3 clicks doesn't have to be a rule.
There's also a useful article about "the scent of information" here: http://searchengineland.com/seo-and-the-scent-of-information-26206
-
Thank you Geoff, Casey and Alan.
Great answers and exactly what I needed to know.
-
It's not how many folders deep your page is, but how many clicks from the home page it takes to reach it.
If the page domain.com/deep/deep/deep/deeppage.htm is linked from the home page, that's fine.
If the page domain.com/rootpage.htm takes 4 clicks to reach from the home page, then you have a problem.
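A minimal sketch of the distinction, using a hypothetical link graph: click depth is just the breadth-first distance from the home page, and it is independent of how many folders appear in the URL.

```python
from collections import deque

def click_depths(links, start="/"):
    """Breadth-first search: minimum number of clicks from the home page to each URL."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        for target in links.get(url, []):
            if target not in depths:
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

# Hypothetical site: a deeply nested URL linked straight from the home page,
# and a root-level URL buried four clicks down a chain of hub pages.
site = {
    "/": ["/deep/deep/deep/deeppage.htm", "/hub"],
    "/hub": ["/hub/a"],
    "/hub/a": ["/hub/a/b"],
    "/hub/a/b": ["/rootpage.htm"],
}

for url, clicks in click_depths(site).items():
    folders = url.strip("/").count("/")
    print(f"{url}: {clicks} click(s) from home, {folders} folder(s) deep")
```

Here the deeply nested URL is only one click from the home page, while the root-level URL is four clicks away - and it's the click count, not the folder count, that causes the problem described above.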
-
I might have misunderstood the question, but what really matters is how you reach the content, not necessarily the URL structure (relevance required; no spam, please).
Given the "freshness" aspect, i.e. recent links and social shares will improve the chances of that particular page appearing in the results (assuming the other SEO boxes are ticked), the "extra" folders will not matter.
-
Hi Mark,
This will not hurt your SEO at all. Here is a video from Matt Cutts explaining the issue: http://www.youtube.com/watch?v=l_A1iRY6XTM
Hope that helps!
Related Questions
-
Hreflang Errors 404 vs "Page Not Found"
For websites whose catalogs (PDPs) differ between regions, which hreflang error causes the least harm? Obviously the best solution is to only have hreflang for shared products, but that takes more work to implement. So when no identical product exists... 1. Hreflang points to a 404 or 410 error. 2. Hreflang points to a 200-status "Page Not Found" page. This obviously has the additional issue of needing to point back to 100+ URLs. I want to avoid having Google decide to ignore all hreflang due to errors, as many correct URLs will exist. Any thoughts?
On-Page Optimization | rigelcable
-
Will it upset Google if I aggregate product page reviews up into a product category page?
We have reviews on our product pages and we are considering averaging those reviews and putting the averages on specific category pages so that the average product ratings are displayed in search results. Each averaged category review would cover only the products within its category, and all reviews are from users of the site, no 3rd-party reviews. For example, averaging the reviews from all of our box product pages and listing that average review on the boxes category page. My question is, will this be doing anything wrong in the eyes of Google, and if so, how? -Derick
On-Page Optimization | Deluxe
-
Noindex child pages (whose content is included on parent pages)?
I'm sorry if there have been questions close to this before... I've been using WordPress less like a blogging platform and more like a CMS for years now. For content management purposes we organize a lot of content around Parent/Child page (and custom-post-type) relationships; the Child pages are included as tabbed content on the Parent page. Should I be noindexing these child pages, since their content is already on the site, in full, on their Parent pages (i.e. duplicate content)? Or does it not matter, since the crawlers may not go to all of the tabbed content? None of the pages have shown up in Moz's "High Priority Issues" as duplicate content, but it still seems like I'm making the Parent pages suffer needlessly... Anything obvious I'm not taking into consideration? By the by, this is my first post here @ Moz, which I'm loving; this site and the forums are such a great resource! Anyways, thanks in advance!
On-Page Optimization | rsigg
-
Is there an SEO penalty for multiple links on the same page going to the same destination page?
Hi, just a quick note; I hope you are able to assist. To cut a long story short, on the page below, http://www.bookbluemountains.com.au/ -> Features Specials & Packages (middle column), we have 3 links per special going to the same page:
1. The header is linked.
2. The image is linked - currently with a nofollow.
3. 'More info' under the description paragraph is linked too - currently with a nofollow.
The two arguments are as follows:
1. The reason we do not follow all 3 links is to reduce the number of links, which may appear spammy to Google.
2. Counter argument: the point above has some validity. However, using nofollow basically tells the search engines that the webmaster "does not trust or doesn't take responsibility" for what is behind the link, something you don't want to do within your own website. There is no penalty as such for having too many links; the search engines will generally not worry after a certain number, and nothing that would concern this business. I would suggest changing the nofollow links a.s.a.p.
Could you please advise your thoughts. Many thanks, Dave Upton [long signature removed by staff]
On-Page Optimization | daveupton
-
Creating New Pages Versus Improving Existing Pages
What are some things to consider or things to evaluate when deciding whether you should focus resources on creating new pages (to cover more related topics) versus improving existing pages (adding more useful information, etc.)?
On-Page Optimization | SparkplugDigital
-
What does the "base href" meta tag do? For SEO and webdesign?
I have encountered the "base href" tag on one of my sites. The tag is on every page and always points to the home URL.
On-Page Optimization | jmansd
-
Does Code Order Matter?
I read/was told that it was a good idea to order your HTML to show the most important content first. So, on many sites I had put my global navigation div, for instance, below my main content div. Does this still apply? And does wise use of HTML5 mean this is no longer necessary (e.g. use of the "nav" section tag to indicate this section is about navigation)? In the same vein, how does Google know that my sidebar nav is my sidebar nav (which your site seems to say is probably given less weight than top nav), and how does it know my top nav is my top nav? Maybe a daft question, but when someone asked me yesterday I realised I didn't know! (Phew - at last I have asked a short question!)
On-Page Optimization | PeterMurray
-
Page speed tools
Working on reducing page load time, since that is one of the ranking factors that Google uses. I've been using the Page Speed Firefox plugin (requires Firebug), which is free. I'm pretty happy with it, but wondering if others have pointers to good tools for this task. Thanks...
On-Page Optimization | scanlin