Need sitemap opinion on large franchise network with thousands of subdomains
-
Working on a large franchise network with thousands of subdomains. There is the primary corporate domain, which basically directs traffic to store locators and then to individual locations. The stores sell essentially the same products with some variation in pricing, so there are lots of pages with the same product descriptions.
Different content
- All the subdomains have their location and address info in the header, footer, and geo meta tags on every page.
- Page titles customized with franchise store id numbers.
Duplicate content
- Product description blocks.
Franchisee domains will likely have the ability to add their own content in the future, but as of right now most of the content, short of the blocks on the pages, is duplicated.
Likely limitations -- Adding the city to page titles will likely be problematic, as there could be multiple franchises in the same city.
Ideally it would be nice if users could search for a store or product and have the centers closest to them returned.
We can turn on sitemaps on all the subdomains and try to submit them to the search engines. Looking for insight on whether to submit all these sites or to just focus on the main domain, which has a lot less content.
-
Ideally, yes: you would have a separate XML sitemap file and Google Search Console profile, as well as separate (and combined) Google Analytics views, for each of the thousands of subdomains.
If that is not possible, at the very least you should have a sitemap that includes every indexable page on the primary site, as well as an XML sitemap on each subdomain, linked to from that subdomain's own robots.txt file.
According to the XML sitemaps protocol, an XML sitemap file cannot contain URLs from different domains, and this includes subdomains: each sitemap must stick to the URLs of a single host. See sitemaps explained for more info.
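For example, each subdomain would carry its own robots.txt pointing at its own sitemap, with only that subdomain's URLs inside it (the store subdomain below is a hypothetical placeholder, not one of your real hostnames):

```
# https://store1042.example-franchise.com/robots.txt
Sitemap: https://store1042.example-franchise.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://store1042.example-franchise.com/</loc></url>
  <url><loc>https://store1042.example-franchise.com/products/</loc></url>
</urlset>
```

Search engines pick up the `Sitemap:` directive on their own when they fetch robots.txt, which spares you a manual submission per subdomain.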
As for the title tag issue, I don't see why you couldn't have a street name AND city -- or neighborhood and city -- so that the titles would all be unique while still including the city (e.g., "Store #1042 - Main St, Springfield").
-
Would I have to validate all xx thousand subdomains in Webmaster Tools first?
-
Sounds like a very tricky one!
You could look at using a sitemap index file (index-sitemap.xml) to list all of your separate subdomain sitemaps; that way you'll only need to submit one single sitemap. Note that Google will only honor cross-host sitemap references like this if you have verified ownership of all the sites involved (e.g., in Search Console).
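With thousands of subdomains you would generate that index file rather than write it by hand. A minimal sketch (the subdomain labels and root domain below are placeholders, not the poster's actual network):

```python
# Sketch: build a sitemap index that references one sitemap per subdomain.
# Domain names are hypothetical placeholders.
from xml.sax.saxutils import escape

def build_sitemap_index(subdomains, root_domain, scheme="https"):
    """Return sitemap-index XML pointing at each subdomain's own sitemap.xml."""
    entries = "\n".join(
        "  <sitemap><loc>{}</loc></sitemap>".format(
            escape("{}://{}.{}/sitemap.xml".format(scheme, sub, root_domain))
        )
        for sub in subdomains
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries
        + "\n</sitemapindex>\n"
    )

# Feed it the full subdomain list exported from the franchise database.
print(build_sitemap_index(["store1001", "store1002"], "example-franchise.com"))
```

You would submit only the resulting index file in Search Console, while each referenced sitemap.xml stays on its own subdomain.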
I would recommend looking at Screaming Frog to help you generate these.
Good luck!
-
There is a main site with the store locator that points to individual franchise locations; however, nothing can be purchased from that site.
The franchise is very locally oriented, which is why every site has a subdomain. Every franchisee can set their own pricing and product options.
-
Hey There!
This does sound like a complex scenario - even a bit of a messy one. Ideally, what the brand would have done here was build a single website with a single store locator taking users to an appropriate landing page based on the city or ZIP they type in. The single website would feature a single product menu, accessible to all users regardless of city, removing any risk of creating duplicate product description pages - something along the lines of how REI.com handles their web presence (you might like to show that to the client).
Instead of taking this approach, am I right in understanding that your client got into this thousands-of-subdomains predicament in order to give each franchisee access to their own site in a specific city, without allowing them access to the entire website? Or was it for some other reason?