Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
How to Handle Franchise Duplicate Content
-
My agency handles digital marketing for about 80 Window World stores, each with separate sites. For the most part, the content across all of these sites is exactly the same, though we have slowly but surely been working on getting new, unique content up on some of the top pages over the past year. These pages include resource pages and specific product pages. I'm trying to figure out the best temporary solution as we go through this process. Previously, we tried to keep the pages we knew were duplicates out of the index, but some pages have still managed to slip through the cracks during redesigns.
- Would canonicals be the route to go? (do keep in mind that there isn't necessarily one "original version," so there isn't a clear answer as to which page/site all the duplicated pages should point to)
- Should we just continue to use robots.txt/noindex for all duplicate pages for now?
- Any other recommendations?
Thanks in advance!
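For concreteness, a cross-domain canonical is a single line in the `<head>` of each duplicate page, pointing at whichever copy you designate as the primary version (the domains and path below are placeholders):

```html
<!-- On a duplicate page, e.g. on store2's site (placeholder domains/path) -->
<!-- Tells search engines which copy you consider the primary version -->
<link rel="canonical" href="https://store1.example.com/replacement-windows/" />
```

As the first bullet notes, though, this only works if you can commit to one "original version" per page, which is exactly the sticking point here.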
-
It sounds like you are already doing as well as you can - since there's no clear canonical page, noindexing the duplicate pages would probably be the way to go. Don't panic if you see some duplicate pages still sneak into the index after you've noindexed them; this is common and it's unlikely that Google will see this as a Panda-worthy problem on your part.
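One caveat on the robots.txt/noindex combination mentioned in the question: the two directives can work against each other. If a page is disallowed in robots.txt, crawlers can't fetch it and will never see its noindex tag, so the URL can linger in the index. To keep a duplicate page out of the index, leave it crawlable and rely on the meta tag (or an equivalent X-Robots-Tag HTTP header), roughly:

```html
<!-- In the <head> of each duplicate page. The page must NOT be blocked
     in robots.txt, or crawlers will never see this directive. -->
<meta name="robots" content="noindex, follow" />
```

"noindex, follow" is a common choice here so that links on the page can still be crawled even while the page itself stays out of the index.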
The one drawback to noindexing the pages is that once unique content is up on them and they are ready to be indexed, it may take a while for Google to get the message that they are supposed to be indexed now. I've seen it take anywhere from an hour to a week for a page to appear in the index. One thing you can do in the meantime is make sure each site is accruing some good links - not an easy task with 80 websites, I know, but the higher authority will help out once the unique content is ready to go. Sounds like a herculean task - good luck!
-
Solid insight, but unfortunately we do have the 80 websites because the owners of the stores manage each one separately. Some stores offer different products or services than others and are completely separate entities. Each store owner we work with is an individual client; we do not work with corporate. Plus, since we don't do marketing for ALL stores in the entire franchise, just a large chunk of them, one big site just wouldn't work. It's also really not possible for us to make all these store owners write their own content for the entire site.
We really appreciate your thoughts on this and totally agree with your logic, but unfortunately we would not be able to implement either solution. Right now, we just need some kind of band-aid solution to use as we work through rewriting the most important pages on the sites (probably either de-indexing the duplicates or some kind of canonical strategy).
Thanks!
-
Hey There!
Important question ... why does the company have 80 websites? Are they being individually managed by the owner of each store, or are they all in the control of the central company?
If the latter, what you are describing is a strong illustration of the typical advice that it is generally better to build one powerhouse website for your brand than a large number of thin, weak, duplicative sites.
If this company were my client, I would be earnestly urging them to consolidate everything into a single site. If they are currently investing in maintaining 80 websites, there's reason to hope they've got the funding to develop a strong, unique landing page for each of the 80 locations on their main corporate website, and to redirect the old sites to the central one. Check out how REI.com surfaces unique pages for all of their locations. It's inspiring how they've made each page unique. If your client could take a similar approach, they'd be on a better road for the future.
You would, of course, need to update all citations to point to the landing pages once you had developed them.
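If consolidation did happen, retiring each old store site is typically handled with a permanent (301) redirect at the server level. A minimal sketch for Apache with mod_rewrite, using placeholder domains and paths:

```apache
# .htaccess on an old store site (placeholder domains/paths)
RewriteEngine On
# Send every URL on this site to its city's landing page on the main site
RewriteRule ^ https://www.windowworld.example/locations/springfield/ [R=301,L]
```

Mapping each old site's key URLs to the most relevant new landing page, rather than pointing everything at the homepage, preserves more of the old sites' equity.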
If, however, the 80 websites are being controlled by 80 different franchise location managers, what needs to be developed here is a policy that prevents these managers from taking the content of the corporation. If they want to each run a separate website, they need to take on the responsibility of creating their own content. And, of course, the corporate website needs to be sure it doesn't have internal duplicate content and is not taking content from its franchise managers, either. 80 separate websites should = 80 totally separate efforts. That's a lot to have going on, pointing back to the preferred method of consolidation wherever possible.
Hope this helps!