Single Folder vs Root
-
I'm working on a multi-state attorney website and I'm going back and forth on URLs. I thought I'd see what the community thinks.
lawsite.com/los-angeles/car-accident-lawyer vs. lawsite.com/los-angeles-car-accident-lawyer
I should note this site will have over a dozen city locations, each with different practice areas.
-
My Friend,
I think that is fine. I would do that.
I wish you all the best in your project!
-
Don't overthink it, really. I'm working on the same thing right now with positive effects, i.e. /subject/another-subject/. It would be good if you link all the independent pages from /subject/ as well, including a dropdown menu on /subject/ with all the /another-subject/ pages.
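As a rough illustration of that internal-linking idea, here's a minimal sketch; the /subject/ and /another-subject/ slugs below are hypothetical placeholders, not from any real site:

```python
# Minimal sketch of generating a hub page's dropdown links to its sub-pages.
# The slugs are illustrative placeholders.
SUBJECTS = {
    "subject": ["another-subject", "yet-another-subject"],
}

def dropdown_links(subject: str, children: list[str]) -> list[str]:
    # The hub (/subject/) links to every child page; each child page
    # should also link back to the hub to support it.
    return [
        f'<a href="/{subject}/{child}/">{child.replace("-", " ").title()}</a>'
        for child in children
    ]

for subject, children in SUBJECTS.items():
    print("\n".join(dropdown_links(subject, children)))
```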
-
Agreed, thanks!
-
Thanks for the great reply. Yes, quite a few practice areas. So it sounds like I should go with the city-folder route.
Follow-up question: do you think I should do /westchester-attorney/slip-and-fall-accident-lawyer, or am I getting a little spammy?
-
I recommend Joseph's approach. It has many benefits: manageability, scalability, and SEO. You can address all the practice areas available in specific locations, and rank the firm more strongly in each location through keyword relevance.
-
Hello Friend,
Good question.
Are they only doing car accident cases? I assume that they are doing more.
Doing a folder for the city will allow you to create a city hub page that links out to the different practice pages for that city, and those pages should all link back to support the hub page. See how this firm did it:
https://mirmanlawyers.com/westchester/ (tier 2, pillar page, hub page)
https://mirmanlawyers.com/westchester/car-accident-lawyer/
https://mirmanlawyers.com/westchester/slip-and-fall-accident-lawyer/
If you only have one practice to focus on, I suggest you go with lawsite.com/los-angeles-car-accident-lawyer, but if you have many practices, I would go with lawsite.com/los-angeles/car-accident-lawyer and create a valuable sub-page for each practice and each location.
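To make the two schemes concrete, here's a minimal sketch of how both URL styles could be generated from the same city/practice data; the city and practice slugs are just the placeholders from this thread, not a real implementation:

```python
# Minimal sketch comparing the two URL schemes discussed above.
# City and practice slugs are illustrative placeholders.
CITIES = ["los-angeles", "westchester"]
PRACTICES = ["car-accident-lawyer", "slip-and-fall-accident-lawyer"]

def flat_urls(cities, practices):
    """Flat scheme: one root-level URL per city + practice combination."""
    return [f"/{city}-{practice}/" for city in cities for practice in practices]

def hub_and_spoke(cities, practices):
    """Folder scheme: a hub page per city with practice sub-pages under it."""
    return {
        f"/{city}/": [f"/{city}/{practice}/" for practice in practices]
        for city in cities
    }

if __name__ == "__main__":
    print(flat_urls(CITIES, PRACTICES))
    for hub, spokes in hub_and_spoke(CITIES, PRACTICES).items():
        # Each spoke should also link back to its hub to strengthen it.
        print(hub, "->", spokes)
```

With a dozen-plus cities and several practices, both schemes produce the same number of pages, but the folder scheme gives each city a natural hub to collect and pass internal links.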
I wish you the best of luck with your project!