Best practice for franchise sites with duplicated content
-
I know duplicate content is a touchy subject, but I work with multiple franchise groups and each franchisee wants their own site; however, almost all of the sites use the same content. I want to make sure that Google sees each of these sites as unique and does not penalize them for the following issues:
All sites are hosted on the same server and therefore share the same IP address.
All sites use generally the same content across their product pages (which are very, very important pages): templated content approved by corporate.
Almost all sites have the same design (a few of the groups we work with have multiple design options).
Any suggestions would be greatly appreciated.
Thanks Again
Aaron
-
I fully agree. We have notified them all and let them know it's in their best interest to modify the content throughout the site. Unfortunately, most of them don't, and the copy remains templated.
Thanks for your answers
-
If the search is for a company product or service, you can gain a small advantage by creating a local listing for each franchisee. Beyond that, rewriting the content is the only option, as far as I know.
-
Maybe part of the literature describing your program could include the point that, to be really effective, the franchisees will have to write their own content. It all depends on your business model: whether you want to make them aware that they have 100 to 5,000 competitors from your company alone.
-
I fully agree with you EGOL "There is another problem - maybe bigger than Google's desire for unique content."
We give each franchisee the opportunity to expand on the content and make it their own; however, I would say 90% of them don't make any changes.
I don't think that either the franchisees or corporate would want to pay what it would cost to have our copywriters write unique copy for each site (50-100+ products/services per site or franchisee).
-
I wish we could redo the strategy but we aren't talking about small franchises here. We are talking franchises anywhere from 100 stores all the way up to 5,000 stores.
The products/services they offer are described very well, and unfortunately the only things we add to each product page are a few location identifiers and the company name.
I don't want to use the canonical solution because each site has to be seen as a stand-alone site.
-
Each Franchise has their own domain.
Each product/service has a single description, and each franchisee has to use the same corporate-approved logo.
All images are named the same thing, so it could matter.
I like your suggestions, though; you're going the same route we have taken in the past.
-
Information about Google using OCR: use this link to see an example of how Google extracted and highlighted "wrigley swim" from this newspaper scan.
Google can determine the color of an image: image files are just bytes, and Google can extract the colors from them. If you go into Image Search, there is an option to limit the results by color. Some of that is done via context (such as words in the file name or words near the image); however, some is done by extracting data from the image file itself.
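As a toy illustration of extracting color from image data: once you have decoded pixel values (a real JPEG or PNG would need decoding first, which this sketch skips), finding a dominant color is simple bucketing. The pixel values below are synthetic:

```python
from collections import Counter

def coarse_color(r, g, b):
    """Map an RGB triple to a rough color bucket."""
    if r > 200 and g > 200 and b > 200:
        return "white"
    if r < 60 and g < 60 and b < 60:
        return "black"
    # Otherwise, name the strongest channel.
    channel = max(("red", r), ("green", g), ("blue", b), key=lambda t: t[1])
    return channel[0]

def dominant_color(pixels):
    """Return the most common coarse color in a list of RGB triples."""
    counts = Counter(coarse_color(r, g, b) for r, g, b in pixels)
    return counts.most_common(1)[0][0]

# A tiny synthetic "image": mostly red pixels with a couple of stray blues.
pixels = [(220, 30, 40)] * 8 + [(10, 20, 200)] * 2
print(dominant_color(pixels))  # red
```

The point is only that color is recoverable from the file's data, with no human labeling involved.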
-
Here we are all giving advice based on our own knowledge, so I personally think Google cannot read images or tell what a specific image relates to. If I'm wrong (and I hope I'm not), can I get more details, EGOL?
Thanks.
-
...Google cannot read images or colors...
Are you willing to bet a month's pay on that?
-
I want to make sure that Google sees each one of these sites as unique sites...
I don't think there is an inexpensive way to get this done and have high-quality results. If you want unique content, you've got to pay the price... but you could consider:
Hire several writers to reauthor the content; this will cost a lot less than writing from scratch.
Get an article-spinner program; that will be cheap, but you will probably not like the results.
Make an enthusiastic sales pitch to each franchisee, with incentives to write their own content.
...templated content approved by corporate...
There is another problem - maybe bigger than Google's desire for unique content.
Good luck.
-
You may want to rethink your strategy of franchising both the product and the content. If the content is the same, the only way to eliminate the duplicate content problem is to point to one of the pages as the canonical version, and that would significantly hurt the performance of the other sites.
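If the canonical route were taken, each duplicate page would point at the master version via a link rel="canonical" tag in its head. A minimal standard-library sketch for auditing which pages already declare one (the markup and URL here are hypothetical):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of any <link rel="canonical"> tag encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# Sample page source, made up for illustration.
page = """<html><head>
<link rel="canonical" href="https://www.example-franchise.com/products/widget">
</head><body>...</body></html>"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)
```

Run against each franchisee's product URLs, this would show at a glance which sites consolidate to a canonical version and which compete as standalone duplicates.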
-
I suggest you let each franchisee use their own sort of domain and logo, but add the franchise branding [your logo]:
1. Their own domain.
2. Their own product description, even if it's the same product (maybe add your logo to make sure people recognize the brand).
3. Design does not matter (URLs, titles, descriptions, content, etc. count), as Google cannot read images or colors.
Hope it helps.