Single Folder vs Root
-
I'm working on a multi-state attorney website and I'm going back and forth on URLs. I thought I'd see what the community thinks.
lawsite.com/los-angeles/car-accident-lawyer vs. lawsite.com/los-angeles-car-accident-lawyer
I should note this site will have over a dozen city locations, each with different practice areas.
-
My Friend,
I think that is fine. I would do that.
I wish you all the best in your project!
-
Don't overthink it, really. I'm working on the same structure right now with positive results, i.e. /subject/another-subject/. It helps if you link all the independent pages from /subject/ as well, including a dropdown menu on /subject/ listing all the /another-subject/ pages.
-
Agreed, thanks!
-
Thanks for the great reply. Yes, quite a few practice areas. So it sounds like I should go the city folder route.
Follow-up question: do you think I should do /westchester-attorney/slip-and-fall-accident-lawyer, or am I getting a little spammy?
-
I recommend Joseph's approach. It has several benefits: manageability, scalability, and SEO. You can address all the practice areas available in specific locations, and rank the firm more strongly in each location through keyword relevance.
-
Hello Friend,
Good question.
Are they only doing car accident cases? I assume that they are doing more.
Using a folder for the city allows you to create a city hub page that links out to the different practice pages for that city, and those pages should all link back to support the hub. See how they did it:
https://mirmanlawyers.com/westchester/ (tier 2, pillar page, hub page)
https://mirmanlawyers.com/westchester/car-accident-lawyer/
https://mirmanlawyers.com/westchester/slip-and-fall-accident-lawyer/
If you only have one practice to focus on, I suggest you go with lawsite.com/los-angeles-car-accident-lawyer, but if you have many practices, I would go with lawsite.com/los-angeles/car-accident-lawyer and create a valuable sub-page for each practice in each location.
I wish you the best of luck with your project!
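One nice property of the folder approach is how cleanly it scales: each new city adds one hub page plus one sub-page per practice. Here's a minimal sketch of how the paths multiply (the city and practice slugs are just illustrative examples borrowed from the thread, not a real site map):

```python
from itertools import product

# Illustrative slugs only - swap in the real cities and practice areas.
cities = ["los-angeles", "westchester"]
practices = ["car-accident-lawyer", "slip-and-fall-accident-lawyer"]

# Each city gets a hub page...
hub_pages = [f"/{city}/" for city in cities]

# ...plus one sub-page per practice, nested under the city folder.
practice_pages = [
    f"/{city}/{practice}/" for city, practice in product(cities, practices)
]

for path in hub_pages + practice_pages:
    print(path)
```

With a dozen cities and a handful of practices, that's dozens of pages generated from two short lists, each hub linking down to its practice pages and each practice page linking back up, which is exactly the hub-and-spoke pattern described above.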