What is the optimal URL structure for internal pages?
-
Is it more SEO-friendly to have an internal page URL structure that reads like www.smithlawfirm.com/personal-injury/car-accidents or www.smithlawfirm.com/personal-injury-car-accidents? The former structure has the benefit of showing Google all the sub-categories under personal injury; the latter has the benefit of a flatter structure. Thanks
-
Thanks, I had read this post before adding my question.
-
A directory-based URL structure can also be helpful for analysis when you're looking at your analytics data. See this post at LunaMetrics for reasons why you might want a directory structure: http://www.lunametrics.com/blog/2010/09/22/designing-google-analytics-friendly-site/
-
A category-wise structure is always the better option.
If your site is small you can keep all pages in the root folder, but as your site grows it becomes very difficult to manage every page in the root, and Google will treat all pages as being at the same level.
Go for the first option.
-
I agree with John and oznappies.
A logical category system is very useful for your site admins, users and SEO. There are many benefits to a simple /personal-injury/car-accidents design.
Sites need to be balanced. A structure that is overly deep with categories is not desired, but an overly flat structure where every page is a child of the home page isn't going to provide the best user experience either.
-
It shouldn't matter either way... though I would strongly advise that you organize your content appropriately; the directory structure will then build itself. Slashes or dashes? I would build it with the slashes.
Keep it human-readable and you'll be in good shape.
-
Since you will most likely have more than one form of personal injury, it makes more sense from a site architecture point of view to use a category/type model, i.e. personal-injury/car-accidents. There probably isn't any ranking difference, except that you could have a personal-injury landing page that links to the injury types and gains link juice in its own right.
-
Rafi, I handle a few law firms that are involved in PI. I will give you an example of a site that performs very well: Actos-Lawsuit.org. If you look at our URL structure you will see two things: hyphens and a flat structure. I am assuming you are asking about the firm site. Obviously, you don't want more than three steps to any page. Within that context, I still believe the flatter the better. To borrow something someone else said a few months back regarding hyphens: look at the URL of the page you are viewing right now and what do you see? My suggestion is yes to keeping it flat and absolutely yes to hyphens. Hope this helps.
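Whichever of the two structures you go with, a practical follow-up is making sure only one version of each URL resolves. A minimal .htaccess sketch (assuming Apache with mod_alias; the paths are the hypothetical examples from the question) that 301s the flat form to the nested form:

```apache
# Illustrative only: 301 the flat form of the URL to the chosen nested form
# so only a single version of each page gets crawled and indexed.
# Swap the arguments if you decide the flat, hyphenated form is canonical.
Redirect 301 /personal-injury-car-accidents /personal-injury/car-accidents
```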
-
Related Questions
-
Do pages with low PA affect DA?
I was looking into raising my firm's Domain Authority and I had a thought: I was wondering if our very low PA pages are bringing the overall DA down? Our homepage's PA is currently 43 and the DA is 32. We have quite a few pages, and I am currently trimming a little fat in the deep pages, so I was wondering what the result would be in terms of DA. I am also starting an aggressive initiative to blog more and to try to attract links through guest posting and HARO (Help A Reporter Out). I understand that many people will say DA is not a metric you should necessarily build around, but while I am fighting for rankings in a very competitive vertical, I assume a higher DA is better, no? From everything I have read on Moz over the years, DA is the metric that tries to encompass a multitude of factors, similar to the way something more complex like the Google algorithm does. So I assumed finding small gains in DA could be beneficial to the site's rankings and traffic. I tried to go into detail and be specific here because I know how many bad questions get asked daily. Thanks to anyone and everyone for the help; I thoroughly appreciate the Moz community.
Web Design | BossArrighi
-
Internal links, new pages & Domain Authority
I have two questions regarding Domain Authority:
1. Is it possible that a drop in Domain Authority may have been caused by adding a blog and blog posts? In other words, would adding pages/posts dilute the site's authority? And will it catch back up with itself, or will that require inbound links to those new pages? (Oops! That was three questions in one.)
2. Would it be detrimental to have internal links pointing from blog posts without authority to my home page, and could that have contributed to a drop in Domain Authority?
Thanks!
Web Design | gfiedel
-
Is it better to redirect a URL or set up a landing page for a new site?
Hi, One of our clients has a new website but is still getting quite a lot of traffic to her old site, which has a page authority of 30 on the home page and about 20 external backlinks. It's on a different hosting package, so a different C block, but I was wondering if anyone could advise whether it would be better to simply redirect this page to the new site or to set up a landing page on this domain saying something like "The site has moved, you can now find us here...". Any advice would be much appreciated. Thanks
Web Design | Will_Craig
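If the redirect route wins out, a minimal sketch of what the old domain's .htaccess could look like (assuming Apache with mod_rewrite; the target domain below is a hypothetical placeholder for the client's new site):

```apache
# Hypothetical sketch for the OLD domain: 301 every request to the new site
# so the existing backlinks pass their equity along. Drop the $1 and point
# everything at the home page if the new site doesn't mirror the old paths.
RewriteEngine On
RewriteRule ^(.*)$ https://www.new-site.example/$1 [R=301,L]
```
-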
Too Many On Page Links, rel="nofollow" and rel="external"
Hi, Though similar to other questions on here, I haven't found any other examples of sites in the same position as mine. It's an e-commerce site for mobile phones that has product pages for each phone we sell. Each tariff that is available on each phone links through to the checkout/transfer page on the respective mobile phone network. Therefore, when the networks offer 62 different tariffs on a single phone, we automatically start with 62 on-page links, which quickly tips us over the 100-link threshold. Currently we mark these up as rel="external", but I'm wondering if there isn't a better way to help the situation and prevent us being penalised for having too many links on the page, so:
Can/should we mark these up as rel="nofollow" instead of, or as well as, rel="external"?
Is it inherently a problem from a technical SEO point of view?
Does anyone have any similar experiences or examples that might help myself or others?
As always, any help or advice would be much appreciated 🙂
Web Design | Tinhat
-
Duplicate Page Content: mysite.com and mysite.com/index.html (Moz Dashboard)
According to the Moz dashboard, my site shows duplicate page content for mysite.com and mysite.com/index.html. What can I do about that? I want to redirect mysite.com/index.html to mysite.com; how can I do that using the .htaccess file?
Web Design | innofidelity
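Not an answer from the thread, just a minimal sketch of the kind of rule the question is asking about (assuming Apache with mod_rewrite enabled and index.html as the directory index):

```apache
# 301 any direct request for .../index.html back to the bare directory URL,
# e.g. /index.html -> / and /about/index.html -> /about/
RewriteEngine On
# Match the original client request only, not Apache's internal
# DirectoryIndex subrequest, so the redirect can't loop.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /(.*/)?index\.html[\ ?]
RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]
```
-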
404 page not found after site migration
Hi, A question from our developer. We have an issue in Google Webmaster Tools. A few months ago we killed off one of our e-commerce sites and set up another to replace it. The new site uses different software on a different domain. I set up a mass 301 redirect that would redirect any URLs to the new domain, so domain-one.com/product would redirect to domain-two.com/product. As it turns out, the new site doesn’t use the same URLs for products as the old one did, so I deleted the mass 301 redirect. We’re getting a lot of URLs showing up as 404 not found in Webmaster tools. These URLs used to exist on the old site and be linked to from the old sitemap. Even URLs that are showing up as 404 recently say that they are linked to in the old sitemap. The old sitemap no longer exists and has been returning a 404 error for some time now. Normally I would set up 301 redirects for each one and mark them as fixed, but there are almost quarter of a million URLs that are returning 404 errors, and rising. I’m sure there are some genuine problems that need sorting out in that list, but I just can’t see them under the mass of errors for pages that have been redirected from the old site. Because of this, I’m reluctant to set up a robots file that disallows all of the 404 URLs. The old site is no longer in the index. Searching google for site:domain-one.com returns no results. Ideally, I’d like anything that was linked from the old sitemap to be removed from webmaster tools and for Google to stop attempting to crawl those pages. Thanks in advance.
Web Design | PASSLtd
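Not the poster's actual configuration, but a sketch of the approach the question describes, assuming Apache on the old domain (the product paths below are hypothetical): map the old URLs that have a real equivalent first, then catch everything else rather than letting it 404.

```apache
RewriteEngine On
# Hypothetical one-to-one mappings for old URLs with a known new equivalent
RewriteRule ^old-product-url$ https://domain-two.com/new-product-url [R=301,L]
# Catch-all: anything without a mapping is sent to the new home page
RewriteRule ^ https://domain-two.com/ [R=301,L]
```
Bear in mind that Google may treat a blanket redirect to the home page as a soft 404, so the per-URL mappings are the part that carries the real value.
-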
Flag page elements to not be loaded by Instapaper and co.
Does anybody know if there is a way to mark certain elements (especially navigation menus) so that Instapaper and co don't pull them? I'm looking for a quick solution (ideally CSS-based), nothing fancy like parsing the user agent; that would be plan B. I've added role="navigation", id="navigation" and class="navigation" to the nav elements in the hope that it would work. It seems like it does not; sometimes the elements are present in the page generated by Instapaper, sometimes not. Thank you for any replies and have a great day! Jan
Web Design | jmueller
-
How not to get penalized by having a Single Page Interface (SPI)?
Guys, I run a real estate website where my clients pay me to advertise their properties. The thing is, from the beginning I had this idea about a user interface that would remain entirely on the same page. On my site the user can filter the properties in the left panel, and the listings (4 properties at a time) are refreshed on the right side, where there is pagination. When the user clicks on one property ad, the ad is loaded by Ajax below the search panel on the same page; there's a "back up" button that the user clicks to go back to the search panel and click on another property. People are loving our implementation and the user experience, so I simply can't let go of this UI "innovation" just for SEO, because it really is something that makes us stand out from our competitors. My question, then, is: how do I avoid being penalized in SEO by having this Single Page Interface, given that in the eyes of Google users might not be browsing my site deep enough?
Web Design | pqdbr