Subdomain question for law firm in Indiana, Michigan, and New Mexico.
-
Hi Gang,
Our law firm has offices in the states of Indiana, Michigan, and New Mexico. Each state is governed by unique laws, and each state has its own "flavor," etc.
We currently are set up with the main site as:
http://www.2keller.com (Indiana)
Subdomains as:
http://michigan.2keller.com (Michigan)
http://newmexico.2keller.com (New Mexico)
My client questions this strategy from time to time, and I want to see if anyone can offer some reassurance I haven't thought of.
Our reason for setting up the sites in this manner is to ensure that each site speaks to state-specific practice areas (for instance, New Mexico does nursing home abuse, whereas the other states don't) and state-specific ethics law (for instance, in some states you can advertise your dollar-amount recoveries, and in others you can't). There are so many differences between the states that the content would seem to warrant it.
Local citations and listings are another reason these sites are set up in such a fashion. The firm is a member of several local state directories and memberships, and by having these links go directly to the subdomain they reference, I can see this being another advantage.
Also, inside each state there are separate pages set up for specific cities. We geo-target major cities in each state, and trying to do all of this under one domain for 3 different states would seemingly get very confusing, very quickly.
I had thought of setting up the various state pages through folders on the main domain, but again, there is too much state-specific info to make this seem like a logical approach. Granted, the linking and content creation would be easier for one site, but I don't think we can accomplish this in a clean way with the offices being in such different locales.
I guess I'm wondering if there are some things I'm overlooking here?
Thanks guys/gals!
-
Crazy, I have quite a bit of experience with this exact scenario: law firms using geo subdomains to target specific areas.
Here are my findings and suggestions based on actual results and experience:
- SEO on domain.com benefits atlanta.domain.com. This is a fact. If Starbucks decided to create subdomains tomorrow for every location, those subdomains would benefit from its 91 DA. That's how FindLaw, Lawyers.com, and all those guys get first-page placement with high DA and low PA.
- Digital Diameter is right: subdomains are more effective and directories are more efficient, UNLESS you have a really good multi-site CMS. Then you can be equally efficient and more effective.
I hope this answers your question. If you want some help or have any other questions, PM me.
-
Much appreciated... Can you see the reply I sent to Mike above and offer your thoughts?
-
Thanks, Mike. I agree with your reply, but I suppose my main concern is more about whether our site becomes too convoluted as we begin geo-targeting states and the major cities within them. It would seem to be an organizational nightmare, making sure that users get the experience they expect when visiting the site. Users in New Mexico don't care about Indiana law and copy, and vice versa. There are so many state-specific topics, and there's so much content, that I worry about it becoming haphazard when restricted to one domain. Thoughts?
-
Subdomains (more effective):
In short, the benefit is that Google will see each subdomain as a locally focused, independent site.
However, this is also the disadvantage of subdomains.
While they are more likely to be seen as locally focused, each subdomain has to be managed and provided with unique content and links, so it can quickly become much more effort.
Folders (more efficient):
Folders offer much more synergy because they are seen as a single site, but they are also seen as less locally focused and independently targeted than subdomains.
-
Randal,
I think in this instance, first and foremost, let's talk about URL structure. From an organic search perspective, structuring URLs this way (http://michigan.2keller.com) will hinder any positive SEO you do on your main URL. Google would view your current URL structure as individual domains, so none of the SEO strategy done on 2keller.com will transfer to the other domains. How the URL is structured should not have any effect on how you add the content. We deal with national clients with multiple locations all the time. The way to structure this is http://www.2keller.com/Michigan or http://www.2keller.com/newmexico. That would allow your team to do the search marketing work only once and would add efficiencies to your workflow.
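If you do consolidate, each legacy subdomain should 301-redirect to its new folder so the existing links and citations pass their value to the main domain. Here is a minimal sketch of that redirect layer, assuming (hypothetically) a Node/Express server in TypeScript fronts the site; the hostname-to-folder map just mirrors the URLs from this thread:

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Map each legacy state subdomain to its new folder on the main domain.
const subdomainToFolder: Record<string, string> = {
  "michigan.2keller.com": "/michigan",
  "newmexico.2keller.com": "/newmexico",
};

// Permanent (301) redirects tell search engines the move is final, so
// link equity from citations and directories follows to the new folders.
app.use((req: Request, res: Response, next: NextFunction) => {
  const folder = subdomainToFolder[req.hostname];
  if (folder) {
    return res.redirect(301, `https://www.2keller.com${folder}${req.path}`);
  }
  next();
});

app.listen(3000); // port is illustrative
```

The same mapping can be expressed in Apache or Nginx config; the important part is that the redirects are 301s and preserve the path.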
I know your main concern is the amount of state-specific content. You can still create the pages in exactly the same way as before from a content perspective. Just have a solid internal linking structure on 2keller.com guiding people to the proper, relevant pages, or you could use geo-targeting, allowing the site to recognize a visitor's IP address and auto-direct them to the right area (a rough sketch follows). Hope this helps. Let us know if you have any questions.
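The IP-based auto-direct could look something like the sketch below. This assumes the geoip-lite npm package as a stand-in for whatever geolocation service you'd actually use, and the region-to-folder map is illustrative. Note the redirect is a temporary 302, since the geo hop shouldn't be treated as permanent by crawlers:

```typescript
import express, { Request, Response, NextFunction } from "express";
import geoip from "geoip-lite"; // assumed geolocation dependency

const app = express();

// US region codes mapped to the state sections discussed above.
const regionToFolder: Record<string, string> = {
  MI: "/michigan",
  NM: "/newmexico",
};

// On the homepage, suggest the state section matching the visitor's IP.
app.get("/", (req: Request, res: Response, next: NextFunction) => {
  const geo = geoip.lookup(req.ip ?? "");
  const folder = geo && geo.country === "US" ? regionToFolder[geo.region] : undefined;
  if (folder) {
    return res.redirect(302, folder);
  }
  next(); // Indiana visitors and everyone else see the main site
});

app.listen(3000); // port is illustrative
```

One design caveat: auto-redirecting by IP can frustrate travelers and confuse crawlers, so many sites show a banner suggesting the local section instead of forcing the redirect.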