Subdomain question for law firm in Indiana, Michigan, and New Mexico.
-
Hi Gang,
Our law firm has offices in the states of Indiana, Michigan, and New Mexico. Each state is governed by unique laws, and each state has its own "flavor," etc.
We currently are set up with the main site as:
http://www.2keller.com (Indiana)
Subdomains as:
http://michigan.2keller.com (Michigan)
http://newmexico.2keller.com (New Mexico)
My client questions this strategy from time to time, and I want to see if anyone can offer some reassurance, or a perspective I haven't already considered.
Our reason for setting up the sites in this manner is to ensure that each site speaks to state-specific practice areas (for instance, New Mexico handles nursing home abuse, whereas the other states don't) and state-specific ethics law (for instance, in some states you can advertise your dollar-amount recoveries, and in others you can't). There are so many differences between the states that the content would seem to warrant it.
Local citations and listings are another reason these sites are set up in such a fashion. The firm is a member of several local state directories and memberships, and by having these links go directly to the subdomain they reference, I can see this being another advantage.
Also, inside each state there are separate pages set up for specific cities. We geo-target major cities in each state, and trying to do all of this under one domain for 3 different states would seemingly get very confusing, very quickly.
I had thought of setting up the various state pages as folders on the main domain, but again, there is too much state-specific info for that to seem like a logical approach. Granted, the linking and content creation would be easier for one site, but I don't think we can accomplish this cleanly with the offices in such different locales.
I guess I'm wondering if there are some things I'm overlooking here?
Thanks guys/gals!
-
Crazy, I have quite a bit of experience with this exact scenario: law firms using geo subdomains to target specific areas.
Here are my findings and suggestions based on actual results and experience:
- SEO on domain.com benefits atlanta.domain.com. This is a fact. If Starbucks decided to create subdomains tomorrow for every location, their subdomains would benefit from 91 DA. That's how Findlaw, lawyers.com and all those guys get first-page placement with high DA and low PA.
- Digital Diameter is right, subdomains are more effective and directories are more efficient. UNLESS you have a really good multi-site CMS. Then you can be equally efficient and more effective.
I hope this answers your question, if you want some help or have any other questions, PM me.
-
Much appreciated... Could you take a look at the reply I sent to Mike above and offer your thoughts?
-
Thanks, Mike. I agree with your reply, but I suppose my main concern is more about whether our site becomes too convoluted as we begin geo-targeting states and the major cities within them. It would seem to be an organizational nightmare, making sure that users get the experience they expect when visiting the site. Users in New Mexico don't care about Indiana law and copy, and vice versa. There are so many topics related to specific states, and there's so much content, that I worry about it becoming haphazard when restricted to one domain. Thoughts?
-
Subdomains (more effective):
In short the benefit is that Google will see each subdomain as a locally focused, independent site.
However, this is also the disadvantage of subdomains.
While they are more likely to be seen as locally focused, each subdomain has to be managed and provided with its own unique content and links, so it can quickly become much more effort.
Folders (more efficient):
Folders offer much more synergy, as they are seen as a single site, but they are also seen as less locally / independently targeted than subdomains.
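To make the two structures concrete, here's a small Python sketch (my own illustration using the hostnames from this thread, not something any poster prescribed) that rewrites a subdomain-style URL into the equivalent folder-style URL:

```python
from urllib.parse import urlparse

ROOT = "www.2keller.com"  # the firm's main site, per the thread

def subdomain_to_folder(url):
    """Rewrite e.g. http://michigan.2keller.com/car-accidents
    into http://www.2keller.com/michigan/car-accidents."""
    parts = urlparse(url)
    host = parts.hostname or ""
    # If we're on a state subdomain, pull the state label off the front.
    if host.endswith(".2keller.com") and host != ROOT:
        state = host[: -len(".2keller.com")]
        path = parts.path if parts.path != "/" else ""
        return "http://%s/%s%s" % (ROOT, state, path)
    return url  # already on the main domain; leave it alone

print(subdomain_to_folder("http://michigan.2keller.com/car-accidents"))
# http://www.2keller.com/michigan/car-accidents
```

The same mapping could drive 301 redirects if the firm ever consolidated the subdomains into folders.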
-
Randal,
I think in this instance, first and foremost, let's talk about URL structure. From an organic search perspective, structuring URLs this way (http://michigan.2keller.com) will hinder any positive SEO you do on your main URL. Google would view your current URL structure as individual domains, so none of the SEO work done on 2keller.com will transfer to the other domains. How the URL is structured should not have any effect on how you add the content. We deal with national clients with multiple locations all the time. The way you want to structure this is http://www.2keller.com/Michigan or http://www.2keller.com/newmexico. This would allow your team to do the search marketing work only once and would add efficiencies to your workflow.
I know your main concern is the amount of state-specific content. You can still create the pages the exact same way as before from a content perspective. Just have a solid internal linking structure on 2keller.com guiding people to the proper, relevant pages, or you could use geo-targeting, letting the site recognize the visitor's IP address and auto-direct them to the right area. Hope this helps. Let us know if you have any questions.
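A minimal sketch of that IP-based routing idea (the state lookup here is a stand-in dict with illustrative IPs; a real site would call a GeoIP service, which nobody in the thread names):

```python
# Map a visitor's detected state to the right section of the one-domain site.
STATE_SECTIONS = {
    "IN": "http://www.2keller.com/",           # Indiana: the main site
    "MI": "http://www.2keller.com/michigan",
    "NM": "http://www.2keller.com/newmexico",
}

# Stand-in for a real GeoIP database; these IPs are documentation examples.
FAKE_GEO_DB = {"203.0.113.7": "MI", "198.51.100.2": "NM"}

def landing_url(ip):
    state = FAKE_GEO_DB.get(ip)  # swap in a real GeoIP lookup here
    # Unknown or out-of-state visitors fall back to the main site.
    return STATE_SECTIONS.get(state, STATE_SECTIONS["IN"])

print(landing_url("203.0.113.7"))  # http://www.2keller.com/michigan
```

In practice you'd issue a redirect (or just highlight the right section) rather than hard-forcing users there, so visitors can still reach the other states' pages.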