Single Folder vs Root
-
I'm working on a multi-state attorney website and I'm going back and forth on URLs. I thought I'd see what the community thinks.
lawsite.com/los-angeles/car-accident-lawyer vs. lawsite.com/los-angeles-car-accident-lawyer
I should note this site will have over a dozen city locations, with different practices.
-
My Friend,
I think that is fine. I would do that.
I wish you all the best in your project!
-
Don't overthink it, really. I'm working on the same structure right now with positive effects, i.e. /subject/another-subject/. It would be good to link all the independent pages from /subject/ as well, including a dropdown menu on /subject/ listing all the /another-subject/ pages.
-
Agreed, thanks!
-
Thanks for the great reply. Yes, quite a few practice areas. So it sounds like I should go the city folder route.
Follow-up question: do you think I should do /westchester-attorney/slip-and-fall-accident-lawyer, or am I getting a little spammy?
-
I recommend Joseph's approach. There are many benefits to it: manageability, scalability, and SEO. You can address all the practice areas available in specific locations, as well as rank the firm more strongly in each location through keyword relevance.
-
Hello Friend,
Good question.
Are they only doing car accident cases? I assume that they are doing more.
Doing a folder for the city will allow you to create a hub city page that links out to the different practices for that city, and they should all link back to support the hub page. See how they did it:
https://mirmanlawyers.com/westchester/ (tier 2, pillar page, hub page)
https://mirmanlawyers.com/westchester/car-accident-lawyer/
https://mirmanlawyers.com/westchester/slip-and-fall-accident-lawyer/
If you only have one practice to focus on, I suggest you go for lawsite.com/los-angeles-car-accident-lawyer, but if you have many practices, I would go for lawsite.com/los-angeles/car-accident-lawyer and create a valuable sub-page for each practice and each location.
I wish you the best of luck with your project!
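-
To make the folder structure concrete, here is a minimal sketch of the hub-and-sub-page URL scheme Joseph describes. The city and practice slugs come from the examples above; the code itself is just an illustration, not anyone's actual implementation:

```python
# One hub (pillar) page per city; one practice sub-page under each hub.
CITIES = ["los-angeles", "westchester"]
PRACTICES = ["car-accident-lawyer", "slip-and-fall-accident-lawyer"]

def build_urls(domain):
    """Map each city hub URL to the practice sub-pages it links out to."""
    site = {}
    for city in CITIES:
        hub = f"https://{domain}/{city}/"
        site[hub] = [f"{hub}{practice}/" for practice in PRACTICES]
    return site

for hub, subpages in build_urls("lawsite.com").items():
    print(hub)              # tier-2 hub page for the city
    for url in subpages:
        print("   " + url)  # practice sub-page; should link back to the hub
```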
Related Questions
-
URL Too Long vs. 301 Redirect
We have a small number of content pages where the URL paths were set up before we started looking really hard at SEO. The paths are longer than recommended (but not super crazy, IMHO) and some of the pages get a decent amount of traffic. Moz suggests updating the URLs to make them shorter, but I wonder if anyone has experience with the tradeoffs here. Is it better to mark those issues to be ignored and just use good URLs going forward, or would you suggest updating the URLs to something shorter and implementing a 301 redirect?
Intermediate & Advanced SEO | russell_ms
-
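For reference, a minimal sketch of the update-plus-301 option from the question above, assuming a Flask app (the paths are hypothetical placeholders): the old long URL answers with a permanent redirect to its shortened replacement, so existing links and traffic follow along.

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping of old long paths to their shortened replacements.
SHORTENED = {
    "/resources/articles/2015/06/a-very-long-descriptive-page-name":
        "/resources/descriptive-page",
}

@app.route("/resources/articles/<path:rest>")
def legacy(rest):
    new = SHORTENED.get(f"/resources/articles/{rest}")
    if new is None:
        return "Not found", 404
    # 301 signals a permanent move, so search engines transfer the old
    # URL's signals to the new one.
    return redirect(new, code=301)
```
-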
Crawl and Indexation Error - Googlebot can't/doesn't access specific folders on microsites
Hi, my first time posting here. I am just looking for some feedback on an indexation issue we have with a client, and on possible next steps or items I may have overlooked. To give some background, our client operates a website for the core brand and also a number of microsites based on specific business units, so you have corewebsite.com along with bu1.corewebsite.com, bu2.corewebsite.com, and so on. The content structure isn't ideal, as each microsite follows a structure of bu1.corewebsite.com/bu1/home.aspx, bu2.corewebsite.com/bu2/home.aspx and so on. In addition, each microsite has duplicate folders from the other microsites: bu1.corewebsite.com has the indexable folder bu1.corewebsite.com/bu1/home.aspx but also bu1.corewebsite.com/bu2/home.aspx, and likewise bu2.corewebsite.com has bu2.corewebsite.com/bu2/home.aspx but also bu2.corewebsite.com/bu1/home.aspx. There are 5 different business units, so you have this duplicate content scenario across all the microsites. This situation is being addressed in the medium-term development roadmap and will be rectified in the next iteration of the site, but that is still a ways out.
The issue: about 6 weeks ago we noticed a drop-off in search rankings for two of our microsites (bu1.corewebsite.com and bu2.corewebsite.com); over a period of 2-3 weeks pretty much all our terms dropped out of the rankings and search visibility dropped to essentially 0. I can see that pages from the websites are still indexed, but oddly it is the duplicate content pages: bu1.corewebsite.com/bu3/home.aspx and bu1.corewebsite.com/bu4/home.aspx are still indexed, and similarly on the bu2.corewebsite.com microsite, bu2.corewebsite.com/bu3/home.aspx and bu2.corewebsite.com/bu4/home.aspx are indexed. But no pages from the BU1 or BU2 content directories seem to be indexed under their own microsites. Logging into Webmaster Tools, I can see the error "Google couldn't crawl your site because we were unable to access your site's robots.txt file." This was a bit odd, as there was no robots.txt in the root directory, but I got some weird results when I checked the BU1/BU2 microsites in technicalseo.com's robots.txt tool. Also, because there is a redirect from bu1.corewebsite.com/ to bu1.corewebsite.com/bu4.aspx, I thought maybe there could be something there, so we removed the redirect and added a basic robots.txt to the root directory for both microsites. After this we saw a small pickup in site visibility: a few terms popped into our Moz campaign rankings, but they dropped out again pretty quickly, and the error message in GSC persisted.
Steps taken so far after that:
1. In Google Search Console, I confirmed there are no manual actions against the microsites.
2. Confirmed there are no instances of noindex on any of the pages for BU1/BU2.
3. A number of the main links from the root domain to microsites BU1/BU2 have a rel="noopener noreferrer" attribute, but we looked into this and found it has no impact on indexation.
4. Looking into this issue, we saw some people had similar problems when using Cloudflare, but our client doesn't use this service.
5. Using a response/redirect header checker tool, we noticed a timeout when trying to mimic Googlebot accessing the site.
6. Following on from point 5, we got hold of a week of server logs from the client. I can see Googlebot successfully pinging the site and not getting 500 response codes from the server, but I couldn't see any instance of it trying to index microsite BU1/BU2 content.
So it seems to me that the issue could be something server-side, but I'm at a bit of a loss for next steps. Any advice at all is much appreciated!
Intermediate & Advanced SEO | ImpericMedia
-
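A rough way to reproduce the timeout described in the question above: request robots.txt and a sample page with both a normal browser User-Agent and a Googlebot User-Agent, then compare. The URLs are placeholders based on the question. Note this only catches user-agent-based filtering; a firewall blocking by IP range would pass this test and still block the real Googlebot.

```python
import requests

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

# Placeholder URLs modeled on the question; swap in the real microsites.
URLS = [
    "https://bu1.corewebsite.com/robots.txt",
    "https://bu1.corewebsite.com/bu1/home.aspx",
]

for url in URLS:
    for label, ua in (("browser", BROWSER_UA), ("googlebot", GOOGLEBOT_UA)):
        try:
            resp = requests.get(url, headers={"User-Agent": ua}, timeout=10)
            print(f"{label:10s} {url} -> HTTP {resp.status_code}")
        except requests.exceptions.RequestException as exc:
            # A timeout only on the Googlebot UA suggests server-side
            # user-agent filtering (e.g. WAF/firewall rules).
            print(f"{label:10s} {url} -> FAILED ({exc})")
```
-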
Ranking Page - Category vs. Blog Post - What is best for CTR?
Hi, I am not sure whether I should rank with a category page or create a new post. Let me explain... If I google 'basic SEO' I see an article from Rand with Authorship markup. That's cool, so I go straight to that result because I know there might be some good insight. BUT: 'basic SEO' is also a category at Moz, and it is not ranking. On the other hand, if I google 'advanced SEO', then the Moz category for 'advanced SEO' is ranking, but there is no authorship image, so users are much less likely to click on that result. Now, I want to rank for a very important keyword for me (a content keyword, not transactional). For that, I have a category called 'yoga exercises'. But should I rather create a post about them, only to increase CTR due to Google Authorship? I read in the Google guidelines that Authorship on homepages and category pages is not appreciated. Hope you have some insights that can help me out.
Intermediate & Advanced SEO | soralsokal
-
TLDs vs ccTLDs?
(I was trying to get this question answered in another thread, but someone marked it as "answered" and no more responses came.) So the question is about best practices on TLDs vs. ccTLDs. I have a .com TLD with DA 39 which redirects to the localized ccTLDs .co.id and .com.sg, which have DA 17. All link building has been done for the .com TLD. In terms of content, it sometimes overlaps, as the same content shows up on both ccTLDs. What is best practice here? It doesn't look like my ccTLDs are getting any juice from the TLD. Should I just combine my ccTLDs into my TLD as subdomains? Will I see any benefits? Thanks, V
Intermediate & Advanced SEO | venkatraman
-
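One note on the duplicate-content side of the question above: if the ccTLDs stay separate, hreflang annotations are the standard way to tell search engines the pages are regional variants rather than duplicates. Below is a minimal sketch that emits the link tags each variant page would carry; the hostname pattern mirrors the question, but example.com, the /pricing path, and the language codes are hypothetical stand-ins.

```python
# Each regional variant of a page should list ALL variants, itself included.
VARIANTS = {
    "en": "https://www.example.com/pricing",       # global .com page
    "en-id": "https://www.example.co.id/pricing",  # Indonesia
    "en-sg": "https://www.example.com.sg/pricing", # Singapore
}

def hreflang_tags(variants):
    lines = [f'<link rel="alternate" hreflang="{code}" href="{url}" />'
             for code, url in variants.items()]
    # x-default tells Google which version to show everywhere else.
    lines.append(f'<link rel="alternate" hreflang="x-default" '
                 f'href="{variants["en"]}" />')
    return "\n".join(lines)

print(hreflang_tags(VARIANTS))
```
-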
Wordpress - Dynamic pages vs static pages
Hi, our site has over 48,000 indexed links, with a good mix of pages, posts, and dynamic pages. For the purposes of SEO and the recent talk of "fresh content": would it be better to keep dynamic pages as they are, or manually create static pages/subpages? The one noticeable downside with dynamic pages is that they aren't picked up by any sitemap plugins; you need to manually create a separate sitemap just for these dynamic links. Any thoughts?
Intermediate & Advanced SEO | danialniazi
-
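For the sitemap gap mentioned in the question above, a minimal sketch of the separate-sitemap approach: generate a sitemap.xml for the dynamic URLs from whatever source produces them, then submit it in Search Console alongside the plugin-generated one. The URLs here are hypothetical; in practice they would come from a database query.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, outfile="sitemap-dynamic.xml"):
    """Write a standard sitemap.xml for the given list of URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(urlset).write(outfile, encoding="utf-8",
                                 xml_declaration=True)

# Hypothetical dynamic URLs; replace with the real dynamic-page source.
build_sitemap([
    "https://example.com/search/red-widgets/",
    "https://example.com/search/blue-widgets/",
])
```
-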
Root domain authority or page authority - which matters more
When does one matter more than the other? Any help would be greatly appreciated. Thanks! Matthew
Intermediate & Advanced SEO | Mrupp44
-
Should I redirect all my subdomains to a single unique subdomain to eliminate duplicate content?
Hi there! I've been working on http://duproprio.com for a couple of years now. In the early stages of the website, we put in place a subdomain wildcard that allowed us to create URLs like this on the fly: http://{some-city}.duproprio.com. This instantly brought us a lot of success in terms of traffic, the cities being great search keywords. But now business has grown, and as we all know, duplicate content is the devil, so I've been playing with the idea of killing (redirecting) all those URLs to their equivalent on the root domain: http://some-city.duproprio.com/some-listing-1234 would redirect to the equivalent page at http://duproprio.com/some-listing-1234. Even if my redirections are 301 permanent, there will be some juice lost for each redirected link that actually points to my old subdomains. This would also mean redirecting http://www.duproprio.com to http://duproprio.com, which is probably the part I'm most anxious about, since the incoming links are almost 50/50 between those 2 subdomains... Bringing everything back into a single subdomain is the thing to do in order to get all my SEO juice together; this part is obvious... But what can I do to make sure that I don't end up actually losing traffic instead of gaining authority? Can you help me get the confidence I need to make this move without risking tons of traffic? Thanks a big lot!
Intermediate & Advanced SEO | DuProprio.com
-
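As a rough illustration of the consolidation described in the question above, assuming a single Flask app receives traffic for every subdomain (the canonical hostname comes from the question; everything else is hypothetical): every request on a non-canonical host, including www, gets a single 301 to the same path on the bare domain, so each old URL maps 1:1 to its new equivalent.

```python
from flask import Flask, redirect, request

app = Flask(__name__)
CANONICAL_HOST = "duproprio.com"

@app.before_request
def consolidate_hosts():
    host = request.host.split(":")[0]  # strip any port
    if host != CANONICAL_HOST:
        # Swap only the host; path and query string are preserved,
        # and 301 marks the move as permanent.
        target = request.url.replace(f"//{host}", f"//{CANONICAL_HOST}", 1)
        return redirect(target, code=301)
```
-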
Skip root page for brand-name domain and just forward to keyword URL?
SEOmoz community, I wanted to get your ideas around what I would consider an unorthodox but potentially effective approach. I am currently internationalizing a brand-name domain by building up different ccTLD domains and assigning them to local server resources. In order to push the brand of the project, all international ccTLDs share the same root domain name; however, the main keyword for each ccTLD is different. My question is essentially whether it would make sense to configure the root landing page of each international project as http://www.<brandname>.<ccTLD>/<market-keyword> rather than the typical approach http://www.<brandname>.<ccTLD>. This would allow me to get the market-specific keyword into the landing page URL. The root domain would have a 301 redirect to the keyword landing page. Does anyone have experience with this approach? Does a true root domain URL get some sort of SEO bonus that would not justify the above approach, or will the URL with the keyword in the path have higher SEO power? Looking forward to your responses, even if it's just projections/SEO gut feeling. Thanks, /Thomas
Intermediate & Advanced SEO | tomypro
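As a rough illustration of the setup Thomas describes, assuming one small Flask app per ccTLD (the keyword slug is a hypothetical placeholder): the root URL holds no content of its own and permanently redirects to the keyword landing page, which is where the content lives and the signals accumulate.

```python
from flask import Flask, redirect

app = Flask(__name__)
MARKET_KEYWORD = "market-keyword"  # hypothetical; differs per ccTLD

@app.route("/")
def root():
    # The root itself is empty; it points permanently at the keyword URL.
    return redirect(f"/{MARKET_KEYWORD}/", code=301)

@app.route(f"/{MARKET_KEYWORD}/")
def landing():
    return "Keyword landing page for this market"
```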