Single Folder vs Root
-
I'm working on a multi-state attorney website and I'm going back and forth on URLs. I thought I'd see what the community thinks.
lawsite.com/los-angeles/car-accident-lawyer vs. lawsite.com/los-angeles-car-accident-lawyer
I should note this site will have over a dozen city locations, with different practices.
-
My Friend,
I think that is fine. I would do that.
I wish you all the best in your project!
-
Don't overthink it, really. I'm working on that too right now with positive effects, e.g. /subject/another-subject/. It would be good if you link all the independent pages from /subject/ as well, including a dropdown menu on /subject/ with all the /another-subject/ pages.
-
Agreed, thanks!
-
Thanks for the great reply. Yes, quite a few practice areas. So it sounds like I should go the city folder route.
Follow-up question: do you think I should do /westchester-attorney/slip-and-fall-accident-lawyer, or am I getting a little spammy?
-
I recommend Joseph's approach. There are many benefits to it: manageability, scalability, and SEO. You can address all the practice areas available in specific locations, as well as rank the firm more strongly in each location through keyword relevance.
-
Hello Friend,
Good question.
Are they only doing car accident cases? I assume that they are doing more.
Doing a folder for the city will allow you to create a hub city page that links out to the different practices for that city, and they should all link back to support the hub page. See how they did it:
https://mirmanlawyers.com/westchester/ (tier 2, pillar page, hub page)
https://mirmanlawyers.com/westchester/car-accident-lawyer/
https://mirmanlawyers.com/westchester/slip-and-fall-accident-lawyer/
If you only have one practice to focus on, I suggest you go for lawsite.com/los-angeles-car-accident-lawyer, but if you have many practices, I would go for lawsite.com/los-angeles/car-accident-lawyer and create a valuable sub-page for each practice and each location.
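To make the hub-and-spoke structure concrete, here is a minimal sketch in Python of how the folder-style URLs relate. The city and practice slugs are placeholders, not the actual site's inventory:

```python
# Sketch: folder-style URLs for a city hub-and-spoke structure.
# City and practice slugs below are illustrative placeholders.
cities = ["los-angeles", "westchester"]
practices = ["car-accident-lawyer", "slip-and-fall-accident-lawyer"]

def hub_url(city):
    """Hub (pillar) page for a city, e.g. /los-angeles/."""
    return f"/{city}/"

def practice_url(city, practice):
    """Practice sub-page that lives under, and links back to, its city hub."""
    return f"/{city}/{practice}/"

# Map each city hub to the practice pages it should link out to.
site_map = {hub_url(c): [practice_url(c, p) for p in practices] for c in cities}
```

Because each practice URL sits inside its city folder, the hub-to-spoke and spoke-to-hub internal links fall out of the structure naturally.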
I wish you the best of luck with your project!
Related Questions
-
Onsite Videos- Multiple on a Single Page- How to Optimize?
I have a specific page which needs multiple videos: a primary video of the client (YouTube video) and two secondary videos with patient testimonials (Wistia videos). Here is the actual page: https://www.johnbarrasdds.com/houston-tmj-dentist/ My understanding is Google only values the first video on a page. Is this accurate? And either way, what is the best practice for how to post the second group of videos and gain SEO value? Thanks!
Intermediate & Advanced SEO | mgordon
Location Pages On Website vs Landing pages
We have been having a terrible time in the local search results for 20+ locations. I have Places set up and all, but we decided to create location pages on our sites for each location: a brief description and content optimized for our main service. The path would be something like .com/location/example. One option that has come up in question is to create landing pages / "mini websites" that would probably be location-example.url.com. I believe that the latter option, mini sites for each location, would be a bad idea, as those kinds of tactics were spammy in the past. What are your thoughts and resources, so I can convince my team of the best practice?
Ecommerce: A product in multiple categories with a canonical to create a 'cluster' in one primary category vs. a single listing at root level with dynamic breadcrumb.
OK, bear with me on this… I am working on some pretty large ecommerce websites (50,000+ products) where it is appropriate for some individual products to be placed within multiple categories / sub-categories. For example, a Red Polo T-shirt could be placed within:
Men's > T-shirts > Red T-shirts
Men's > T-shirts > Polo T-shirts
Men's > Sale > T-shirts
Etc. We're getting great organic results for our general T-shirt page (for example) by clustering creative content within its structure: Top 10 tips on wearing a t-shirt (obviously not, but you get the idea). My instinct tells me to replicate this with products too. So, of all the locations mentioned above, make sure all polo shirts (no matter what colour) have a canonical set within Men's > T-shirts > Polo T-shirts. The presumption is that this will help build the authority of the Polo T-shirts page; this obviously presumes "Polo Shirts" gets more search volume than "Red T-shirts". My hesitation with this option is that it is very difficult to manage, particularly with a large inventory. And, from experience, taking the time and being meticulous when it comes to SEO is the only way to achieve success. From an administration point of view, it is a lot easier to have all product URLs at the root level and develop a dynamic breadcrumb trail, so all roads can lead to that one instance of the product. There's no need for canonicals; no need for ecommerce managers to remember which primary category to assign product types to; keeping everything at root level also means there's no reason to worry about redirects if products move from sub-category to sub-category, etc. What do you think is the best approach? Do 1000s of canonicals and redirects look 'messy' to a search engine over time? Any thoughts and insights greatly received.
Intermediate & Advanced SEO | AbsoluteDesign
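If you go the canonical route, the selection logic can be as simple as a lookup from product type to its single primary category. A hedged sketch, with made-up category paths and product slugs:

```python
# Sketch: one canonical category per product type, no matter which
# category path the visitor browsed through. Paths are illustrative.
PRIMARY_CATEGORY = {
    "polo-tshirt": "/mens/t-shirts/polo-t-shirts/",
    "red-tshirt": "/mens/t-shirts/red-t-shirts/",
}

def canonical_url(product_type, product_slug):
    """Canonical product URL under the product type's primary category."""
    return PRIMARY_CATEGORY[product_type] + product_slug + "/"

# The same product reached via /mens/sale/t-shirts/ still canonicalises here:
tag = f'<link rel="canonical" href="{canonical_url("polo-tshirt", "red-polo")}">'
```

Centralising the mapping in one table is what keeps this manageable at 50,000+ products: ecommerce managers edit data, not templates.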
Is 301 redirecting your index page to the root '/' safe to do or do you end up in an endless loop?
Hi, I need to tidy up my home page a little. I have some links to our index.html page, but I just want them to go to the root '/', so I thought I could 301 redirect it. However, is this safe to do? I'm getting duplicate page notifications in my analytics reporting tools about the home page and need a quick way to fix this issue. Many thanks in advance, David
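One way to see why this redirect doesn't loop: the 301 only fires for the literal /index.html path, while '/' is resolved to the index file internally, with no further redirect issued. A toy sketch of that request handling (not any particular server's configuration):

```python
# Sketch: why 301-redirecting /index.html to / does not create a loop.
# The external redirect fires only for the literal /index.html path;
# "/" is mapped to the index file internally, without another redirect.
def handle(path):
    """Return (status, location_or_body) for a request path."""
    if path == "/index.html":
        return (301, "/")           # external redirect, one hop
    if path == "/":
        return (200, "index.html")  # internal mapping, no redirect
    return (404, None)

status, target = handle("/index.html")   # client receives 301 -> "/"
follow_status, body = handle(target)     # client follows once and stops at 200
```

The one thing to avoid is redirecting '/' back to /index.html at the same time; that pair of rules is what produces an endless loop.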
Easiest way to disavow single links on an on-going basis?
We frequently get random super-sketchy looking blogs linking to us with no author or contact information. I believe we are being targeted by a competitor setting up garbage links to us. I am hoping to use the Google disavow links tool to deal with this, but is it:
Safe to use, or does it flag us as link spammers?
Possible to use on an ongoing basis for single links (as they come in, as opposed to a bunch of backlogged links)?
Thanks!
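For reference, Google's disavow file is plain text: one `domain:example.com` entry (or a full URL for a single link) per line, with `#` starting a comment. A small sketch of maintaining it incrementally; the helper function and domains are made up for illustration:

```python
# Sketch: maintaining a disavow file incrementally as sketchy links appear.
# Disavow format: "domain:example.com" to disavow a whole domain, or a
# full URL for a single link; lines starting with "#" are comments.
def add_entries(existing_lines, new_domains):
    """Append new domain: entries, skipping any already present."""
    current = set(line.strip() for line in existing_lines)
    out = list(existing_lines)
    for d in new_domains:
        entry = f"domain:{d}"
        if entry not in current:
            out.append(entry)
    return out

lines = ["# links disavowed 2015-03", "domain:spammy-blog.example"]
lines = add_entries(lines, ["spammy-blog.example", "garbage-links.example"])
```

Note that each upload replaces the previous disavow file entirely, so keeping one cumulative file like this and re-uploading it is the ongoing-basis workflow.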
Sub Domains vs. Persistent URLs
I've always been under the assumption that when building a micro-site it was better to use a true path (e.g. yourcompany.com/microsite) URL as opposed to a subdomain (microsite.yourcompany.com) from an SEO perspective. Can you still generate significant SEO gains from a subdomain if you were forced to use it, provided the primary domain (e.g. yourcompany.com) had a lot of link clout/authority? Meaning, if I had to go the subdomain route, would it be the end of the world?
One Way Links vs Two Way Links
Hi, I was speaking to a client today and got asked how damaging two-way links are, i.e. domaina.com links to domainb.com and domainb.com links back to domaina.com. I need a nice, simple layman's explanation of if/how damaging they are compared to one-way links. And please don't answer with "you lose link juice", as I have a hard enough job explaining link juice… I am explaining things to a non-techie! Thank you!!
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us...
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and that b) by preventing Google from crawling the full depth of our search results (i.e. pages > 1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
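As a side note, you can sanity-check which result URLs a given set of robots.txt rules actually blocks with Python's standard-library parser. The rules below are illustrative of the pagination/sort blocks described above; note the stdlib parser does plain prefix matching, not wildcards, which is why the Disallow paths are written as literal prefixes:

```python
# Sketch: checking robots.txt rules against sample result URLs.
# The rules are illustrative; urllib.robotparser matches by prefix only.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search?page=
Disallow: /search?sort=
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paginated variant is blocked; the base result page stays crawlable.
blocked = not parser.can_fetch("*", "https://example.com/search?page=2")
allowed = parser.can_fetch("*", "https://example.com/search?q=shoes")
```

Running the live robots.txt through a check like this before and after any change makes it easy to confirm you are only excluding the deep/parameter pages you intend to.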