SEO for Franchises - Subdomains or Folders?
-
Wondering if there has ever been any recent consensus on the best SEO strategy for a franchise.
I feel it is safe to assume that just having one corporate website with a "store locator" that only brings up the address, phone, and hours of a location is not optimal. Yes, the important thing is to get a Google Places for Business listing for each location so you can come up in the 3-pack and regular Maps results. But rankings in the 3-pack are largely determined by the site's authority and relevance to the specific search term used, in addition to the proximity of the business to the searcher's physical location.
Apparently it is widely believed that domain authority does not transfer from www.mycorporatedomain.com to somecity.mycorporatedomain.com.
And of course we also know there is a potential for a duplicate content penalty, so you can't just duplicate your main site for a number of locations and change the address and phone number on the contact page.
If the products and/or services are identical for each location, then it's somewhat ridiculous to try to rewrite many sections of the website, since the information is no different regardless of the location.
It seems in general more people are advocates of putting location pages or micro-sites in a subfolder of the corporate domain so that it can benefit from the domain's authority.
HOWEVER, it is also widely known that the home page (root URL) of any domain carries more weight in the eyes of Google.
So let's assume the best strategy is to create a micro-site where phone and address is different anywhere they appear and the contact page is customized to that location, and the "Meet The Staff" page is customized to that location. The site uses the same style 'template' if you will as the main site.
Let's also assume you can build a custom home page that has some different content, but still shares the same look and some of the same information as the main site. But let's say between the different phone, address, and maybe some different images and 20% of the content rewritten a bit, Google doesn't view it as dupe content.
So would the best strategy then be to have the location home page at somecity.mycorporatedomain.com and, for the product and service pages that are identical to the main site, use a rel=canonical pointing to the main site? Or do you make the "home page" for the local business a subfolder of the main site?
So I guess what it boils down to is whether domain authority has more of an effect than having a unique home page on a subdomain.
What about this? Say the only things different on the local site are the contact details (phone/address) in the header and/or footer of every page, the contact form page, and the meet-the-staff page. All other content is identical to the corporate site, including the home page. I think in that case you need a script to serve the pages dynamically: a PHP script that detects the subfolder name to determine the location, replaces the phone and address, and serves different contact and staff pages. You could have a vanity domain mycity.mycorporatedomain.com that does a 301 redirect to the subfolder home page. (This is all of course assuming the subfolder method is the way to go.)
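The dynamic-serving idea can be sketched quickly (here in Python rather than PHP for brevity; the location data, template, and URL scheme are all made up for illustration):

```python
# Sketch of dynamic location serving: one set of page templates, with the
# subfolder name selecting which location's phone/address gets injected.
# Location data and domain are placeholders, not a real implementation.
LOCATIONS = {
    "springfield": {"phone": "555-0100", "address": "1 Main St, Springfield"},
    "shelbyville": {"phone": "555-0199", "address": "9 Elm Ave, Shelbyville"},
}

TEMPLATE = "<p>Call us at {phone} or visit {address}.</p>"

def render_location_page(path):
    """Return (html, canonical_url) for a URL like /springfield/services/."""
    segments = [s for s in path.strip("/").split("/") if s]
    city = segments[0] if segments and segments[0] in LOCATIONS else None
    if city is None:
        # No known location in the URL: serve the corporate defaults.
        return TEMPLATE.format(phone="555-0000", address="HQ"), None
    html = TEMPLATE.format(**LOCATIONS[city])
    # Service pages that duplicate the corporate site would get a
    # rel=canonical pointing back at the main version, per the post.
    rest = "/".join(segments[1:])
    canonical = f"https://www.mycorporatedomain.com/{rest}" if rest else None
    return html, canonical
```

A real deployment would pull the location table from a database and plug this into the web server's routing, but the subfolder-to-NAP lookup is the whole trick.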
-
Hey There SEOJaz,
Since you mentioned the word 'optimal', my own description of that would be:
-
A single website representing the brand
-
A store locator linking to an excellent set of landing pages representing the physical locations of the brand. These pages feature unique, compelling content and good CTAs.
-
A submenu or sitemap somewhere on the site linking to these landing pages to ensure they are indexable.
And that's it.
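For that last point, something as small as a generated XML sitemap listing every location landing page would do. A minimal sketch (the domain and city slugs are placeholders):

```python
# Minimal sketch: emit one XML sitemap entry per location landing page so
# the store-locator pages stay discoverable and indexable. The URL
# structure /locations/<city>/ is an assumption for illustration.
from xml.sax.saxutils import escape

def location_sitemap(base_url, cities):
    urls = "\n".join(
        f"  <url><loc>{escape(base_url)}/locations/{escape(c)}/</loc></url>"
        for c in cities
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n</urlset>"
    )
```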
When we start throwing subdomains, 20% different content and micro-sites into the game, what you typically wind up with is confusion, mistakes, content of less than sterling quality and marketing efforts having to be spread too thin across an ecosystem of properties instead of being poured into a single, very strong, branded website.
What I've described is optimal, but I'm not sure I've answered your questions...
-
Related Questions
-
What is the SEO effect of schema subtype deprecation? Do I really have to update the subtype if there isn't a suitable alternative?
Could someone please elaborate on the SEO effect of schema subtype deprecation? Does it even matter? The Local business properties section of developers.google.com says to: Define each local business location as a LocalBusiness type. Use the most specific LocalBusiness sub-type possible; for example, Restaurant, DaySpa, HealthClub, and so on. Unfortunately, the ProfessionalService page of schema.org states that ProfessionalService has been deprecated and many of my clients don't fit anywhere else (or if they do it's not a LocalBusiness subtype). I find it inconvenient to have to modify my different clients' JSON-LD from LocalBusiness to ProfessionalService back to LocalBusiness. I'm not saying this happens every day but how does one keep up with it all? I'm really trying to take advantage of the numerous types, attributes, etc., in structured data but I feel the more I implement, the harder it will be to update later (true of many things, of course). I do feel this is important and that a better workflow could be the answer. If you have something that works for you, please let us know. If you think it's not important tell us why not? (Why Google is wrong) I understand there is always a better use of our time, but I'd like to limit the discussion to solving this Google/Schema.org deprecation issue specifically.
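One way to soften the maintenance pain described here is to keep each client's subtype in one place and fall back to the generic LocalBusiness type whenever the subtype is deprecated or unsupported. A rough sketch (the supported-subtype set and client data are illustrative, not Google's actual list):

```python
# Sketch: generate LocalBusiness JSON-LD with a single point of control
# for the schema.org subtype. If a subtype is deprecated (removed from
# the supported set), every client's markup silently falls back to the
# generic LocalBusiness type instead of needing per-client edits.
import json

SUPPORTED_SUBTYPES = {"Restaurant", "DaySpa", "HealthClub"}  # assumed list

def local_business_jsonld(name, subtype, address, phone):
    schema_type = subtype if subtype in SUPPORTED_SUBTYPES else "LocalBusiness"
    return json.dumps({
        "@context": "https://schema.org",
        "@type": schema_type,
        "name": name,
        "address": address,
        "telephone": phone,
    })
```

When schema.org deprecates a subtype, updating the one set (ideally fetched from schema.org's published type list rather than hard-coded) updates every client's markup at once.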
Local Website Optimization | bulletproofsearch
What is the effect of CloudFlare CDN on page load speeds, hosting IP location and the ultimate SEO effect?
Will using a CDN like CloudFlare.com confuse search engines in terms of the location (IP address) where the site is actually physically hosted, especially since CloudFlare distributes the site's content all around the globe? I understand that if customers are mostly in a particular city, it makes sense to host on an IP address in that city for better rankings, all else being equal. I have a number of city-based sites, but does having multiple hosting plans in multiple cities/countries (to be close to customers) suddenly become ridiculous with a CDN? In other words, should I just reduce it down to one hosting plan anywhere and use the CDN to distribute it? I am really struggling with this concept, trying to understand whether I should consolidate all my hosting plans into one, or get rid of CloudFlare entirely (can it cause latency in some cases?) and create even more locally-based hosting plans (for example under site5.com, which offers many city hosting plans). I really hope you can help me somehow or point me to an expert who can clarify this confusing conundrum. Of course my overall goal is to have:
1. lowest page load times
2. best UX
3. best rankings
I do realise that other factors are more important for rankings (great content, links, etc.), but assuming that is already in place and every other factor is equal, how can I fine-tune the hosting to achieve the goals above? Many thanks!
Mark
SEO Client not rankings in Google
Hello, I have a client that has continued to be problematic for my team and me. They have fair-to-middling rankings in Yahoo and Bing, but none in Google. I realize that these are three separate search engines, each with its own criteria, but this client is the only one experiencing this problem. There is no significant duplicate content that I can find, and no restrictions in the robots.txt file. There seems to be no reason why all my tools say this client has no presence at all in Google, especially when the client gains most of their traffic through Google. Can anyone assist me in finding out what is going wrong? Client website for reference: http://www.volvethosp.com/ Best, BeyondIndigo
Local Website Optimization | BeyondIndigo
SEO geolocation vs subdirectories vs local search vs traffic
My dear community and friends of Moz, today I have a very interesting question for you all. Although I have my own opinion, and I'm sure many of you will think the same way, I want to share the following dilemma with you. I have just joined a company as Online Marketing Manager and I have to quickly take a decision about site structure. The company's site has just gone through a big structural change. They used to have their information divided by country, with one subdirectory per country: www.site.com/ar/news, www.site.com/us/news. They have just removed the country subdirectory and started using geolocation. So if you go to www.site.com/news, although the content is going to be the same for each country (it's a Latin American site; all the countries speak the same language except Brazil), the navigation links will take you to different pages according to the country where you are located. They believe that having fewer subdirectories means PA or PR is going to be higher for each page due to less link juice leaking. My guess is that if you want a strong organic traffic presence you should either (a) get a TLD for the country you want to target, or (b) have a subdirectory or subdomain for each country on your site. I don't know what local signal a page could be giving to Google if the URL and HTML don't change between countries. We cannot use schema or rich formats either... So, again, I would suggest going back to the previous structure. On the other hand, I've been taking a look at sensacine.com, and although their site targets only Spain,
they have very good rankings for big-volume keywords across Latin America. So I just want to quantify this change, since I will be sending the designers and developers a lot of work.
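If the site does go back to country subdirectories, hreflang alternates are the standard way to hand Google a per-country signal for otherwise-identical pages at different URLs. A small sketch (the domain and country-to-locale mapping are placeholders):

```python
# Sketch: emit hreflang <link> tags for a subdirectory-per-country
# structure (www.site.com/ar/..., www.site.com/us/...). The locale codes
# and domain here are illustrative assumptions.
def hreflang_tags(base_url, page_path, country_langs):
    """country_langs maps subdirectory -> locale, e.g. {"ar": "es-AR"}."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" '
        f'href="{base_url}/{cc}/{page_path}" />'
        for cc, lang in sorted(country_langs.items())
    ]
    # x-default covers visitors who match none of the listed locales.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" '
        f'href="{base_url}/{page_path}" />'
    )
    return "\n".join(tags)
```

Each country page would carry the full set of tags, which is exactly the kind of per-country signal a single geolocated URL cannot provide.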
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol & strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics. Say we have a restoration service franchise with over 40 franchises we perform SEO for. They are all over the US. Every franchise has its own local website, e.g. restorationcompanylosangeles.com. Every franchise purchases territories in which it wants to rank; some service over 100 cities. Most franchises also have PPC campaigns. As part of our strategy we incorporate the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages' which include 5 high-reach branch-preference locations (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We are using major aggregators to distribute our local citations for our branch offices. We make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on newly developing branches to create their Google listings (My Business & G+). We use local business schema markup for all pages. Our content protocol encompasses all the needed onsite optimization tactics: meta, titles, schema, placement of keywords, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, the caller's location, etc. We are testing CallRail to start monitoring the landing pages and keywords that are generating our leads.
Parts that I want to change: Some of the local sites have over 100 pages targeted at 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words, all talking about the services we provide, although our writers (4 of them) manipulate them so that they aren't duplicate pages. They add about 100 words about the city location; this is the only unique variable. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month. Traffic to the local sites is very scarce. The content protocol/strategy is only tested based on ranking! We have a tool that monitors ranking on all domains. This does not account for mobile, local, or user-based preference searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic behind this is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location; if you are not seen, you will not get traffic or leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time). They will check ranking every day for about a week to determine whether that experiment was a success or not.
What I need: An internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content. Unique content for 'power pages': I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up. Blog content for non-'power' locations. A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. A deeper dig into call metrics and their sources. Now I am at a roadblock because I cannot develop valid content-experiment parameters based on ranking. I know that A/B testing requires testing two pages that are the same except for one variable.
We'd either noindex these or canonicalize them; both work against testing ranking for the same term. Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
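On the first question, a crude internal duplicate-content check is easy to sketch: compare pairs of city pages by Jaccard similarity over word shingles. This is an illustration rather than a production tool, and any pass/fail threshold would be a judgment call:

```python
# Sketch of an internal duplicate-content analyzer: score two pages by
# Jaccard similarity over word 5-gram shingles. Near-identical city
# pages that differ only in ~100 localized words will score close to 1.
def shingles(text, n=5):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0
```

Running this across every pair of 'water damage + city' pages and reporting pairs above, say, 0.8 would put a number on how duplicated the 300-plus monthly pages really are.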
Local Website Optimization | MilestoneSEO_LA
Local SEO question
Hi, I was wondering whether there are any specific rules for local SEO for a service company that provides a service in a variety of cities but has only one physical location. For example, is it OK to target the other cities in title tags, or would this be frowned upon? Regards
Local Website Optimization | TheZenAgency
How can I fully take advantage of press coverage to aid my SEO efforts?
I run the digital marketing for a local start-up that's ranking for groups of semi-related keywords. We've been around for about 6 months in beta and officially launched a few days ago. We're starting to get some coverage in local media, and I've tried my best to ensure that links to our site are included with a good range of keywords. What else can I do to fully take advantage of the press coverage that will be coming our way?
Local Website Optimization | NgEF
Same blog, multiple languages. Got SEO concerns.
Hi, My company runs a small blog in Swedish. Most of the visitors are our customers/prospects. We write about generic concepts related to our business and the occasional company news story. However, I have quite a few ideas for articles that could be interesting to a lot of people, and I'm tempted to write those in English for better exposure. I would love it if that exposure could boost my company's authority. How should I go about this? Can I somehow tell search engines that a certain part or page of the site is in another language? Should I translate our entire site into English and post the English posts in a separate blog feed? Any insight is welcome. Thanks in advance!
Local Website Optimization | Mest