Multiple city/region websites - duplicate content?
-
We're about to launch a second site for a neighbouring city, where we're going to set up a marketing campaign to target sales in that city (we'll also have a separate office there).
It will be under the same company name but a different domain name, and we're going to do our best to rewrite the text content as much as possible. We want to avoid Google seeing this as a duplicate site in any way, but what about:
- the business name
- the toll free number (which we would like to have same on both sites)
- the graphics/image files (which we would like to have the same on both sites)
- site structure, coding styles, other "forensic" items
- anything I might not be thinking of...
How are we best to proceed with this? What about cross-linking the sites?
-
Thanks for coming back to this thread with more info. I am always eager to hear about local business decision making because there are numerous approaches. I sincerely hope this works out well for you. It's a big effort to branch out. I like this reason you've listed: "Additionally, we use the city name in anchor text in backlinking." Sounds good to me. And that's good that you've got the local area code phone numbers. So important! Good luck, Web Design Barrie! Miriam
-
We have a couple of reasons for doing multiple sites. The primary reason is that we feel that we will have better optimization for that city, as the current site has been optimized for the city that it is in (the keyword being the city). I feel that if we were to optimize the on-page content for the main site for another city that we would be diluting that city's keyword. Additionally, we use the city name in anchor text in backlinking.
The second reason is more of a business strategy reason, as we want the second office to be a separate entity open for sale as a JV opportunity for a managing partner, or possibly even franchise.
Both sites have a local telephone number.
-
Hi Web Design Barrie,
This is an interesting scenario. Most typically, when a business opens a second office that essentially offers the same services in a neighboring city, the new content/new location will simply be added to the existing site. A new landing page for the new city will be developed, the new NAP will be added to the footer, and new content will be added over time featuring work in the new city. A new Google Place page and other new local business index listings will be created for the new physical location. So long as you aren't dealing with a ton of branches of the business, this process works just fine.
You are going a different route, building a completely different website for the new branch. So, yes, because you are offering the same services at a second location, you are thinking correctly about the need to rewrite all content so it's not a duplicate of site #1.
If your businesses are local, you shouldn't be using an 800 number. If you have to, put it in an image. Local search hangs, in part, on your local area code phone numbers, and these should be totally distinct for the 2 different offices.
Don't worry about the code. Millions of sites share the same code.
The images are also not a big worry...but if the two offices are so identical that you can use identical imagery for them, it again makes me wonder about the decision to build a separate site. What about shots of your unique building, staff, and other features at location #2?
Finally, the big thing you're going to want to attend to is that all Local SEO is handled expertly as you roll out the new business. What you're trying to avoid is Google becoming confused about the 2 businesses and merging their data (see again my point about the phone numbers). Make sure good local SEO hooks are part of both sites, consider using hCard or Schema, and take care to be error free when you create your Google Place Page and other listings.
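To illustrate the Schema option mentioned above, each office's site could mark up its own NAP with schema.org LocalBusiness microdata along these lines. The business name, address, and phone number below are placeholders, not the asker's actual details; the key point is that each site carries its own distinct, machine-readable NAP:

```html
<!-- Hypothetical footer NAP for office #2, marked up with schema.org microdata.
     All names, addresses, and numbers are placeholders. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Acme Web Design</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example St</span>,
    <span itemprop="addressLocality">Barrie</span>,
    <span itemprop="addressRegion">ON</span>
  </div>
  <!-- Distinct local-area-code number for this office, not the shared toll-free line -->
  <span itemprop="telephone">705-555-0100</span>
</div>
```

Office #1's site would carry the same structure but with its own city, street address, and local phone number, which helps Google keep the two locations' data from merging.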
It should be fine if these things go well. Out of my own curiosity as a Local SEO...I would really like to know about the process that led to you deciding to build a second site. I can think of certain instances in which this would be the best choice, but more often than not, going with a single site makes sense. If you have the time, I'd like to know how you decided to go this other route.
Hope my response is helpful!
Miriam
-
Cross-linking the sites - absolutely; they relate to the same industry, so that's fine.
As for all the other items, I wouldn't worry too much.
Think of it as if you'd bought a website template from a template library - chances are, many other sites are using the same template. The same goes for frameworks and complete systems like WordPress.
The company name is definitely not going to be a problem, and neither is the telephone number.
We have a few websites promoting our business - some target local traffic and some worldwide.
The business name (including in the content) is the same on all of them, and so are the telephone numbers. I think you're pretty safe as long as you've got the content rewritten and the meta tags at least somewhat different between the sites.
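To illustrate that last point about differentiating the meta tags, the two sites' head sections might look something like this. The city names, business name, and wording are placeholders, not the asker's actual sites:

```html
<!-- Hypothetical head tags for site #1 (original city) -->
<title>Web Design in Barrie | Acme Web Design</title>
<meta name="description" content="Custom website design and SEO for Barrie businesses, from our downtown Barrie office.">

<!-- Hypothetical head tags for site #2 (neighbouring city) -->
<title>Web Design in Orillia | Acme Web Design</title>
<meta name="description" content="Custom website design and SEO for Orillia businesses, from our local Orillia office.">
```

Same brand on both, but each site's title and description target its own city keyword, which matches the asker's goal of not diluting either city's optimization.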