Multiple sites, multiple locations: similar/duplicate content
-
I am working with a business that wants to rank in local searches around the country for the same service. They have websites such as OURSITE-chicago.com and OURSITE-seattle.com. All of these sites sell the same services, with small variations from state to state due to different legal standards. The current strategy is to put up similar "local" websites with all the same content.
So the bottom line is that we have a few different sites with the same content. The business wants to go national and is planning a different website for each location. In my opinion the duplicate content is a real problem. Unfortunately, the nature of the service means there aren't many ways to say the same thing on 50 sites without duplicating content, and rewriting the content for each state is a daunting task when each site has 70+ pages.
So, from an SEO standpoint we have considered:
-
Using the canonical tag on all but the central site... I think this would hurt all of the websites' SERPs because none will have unique content.
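(For reference, that would mean each city site carrying something like the tag below in the page head, pointing at the matching page on the central site; the URLs here are placeholders.)

```html
<!-- On a hypothetical page at OURSITE-chicago.com/services.html -->
<!-- Tells search engines to treat the central site's copy as the authoritative version -->
<link rel="canonical" href="http://www.OURSITE.com/services.html" />
```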
-
Having a central site with directories, e.g. OURSITE.com/chicago -- but this creates a problem because we need to link back to the relevant content on the main site and ALSO keep the unique "Chicago" content easily accessible to Chicago users while Seattle users can still reach their Seattle data. The best way we could think of to do this was a frame with a universal menu and a unique state-based menu... also not a good option, because frames will hurt SEO as well.
-
Rewriting all the same content 50 times.
You can see why none of these are desirable options. But I know that plenty of websites have "state maps" on their main site. Is there a way to accomplish this in a way that doesn't make our copywriter want to kill us?
-
Without knowing the terms or the type of content, it is hard to say what would work best. I do know that if you rank well for the topic in general, you will rank well for localized terms using localized landing pages, as long as each page has unique enough content.
So if you rank well for "Blue Widget Company," and you set up a section of the site for "Locations" or "Service Areas" and then build out a "Dallas Blue Widgets" page that talks about your location in Dallas, your contact info, the staff who serve Dallas, and maybe a short story about what you have done for people in Dallas, you should do relatively well. From there you can link back to your generic "Blue Widget" page.
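A rough sketch of what such a local landing page might contain (all names, addresses, and URLs below are made up for illustration):

```html
<!-- Hypothetical page at bluewidgetcompany.com/service-areas/dallas/ -->
<h1>Blue Widgets in Dallas, TX</h1>

<p>Our Dallas office at 123 Example St. serves the whole DFW area.
   Call us at (555) 555-0100 or visit Monday through Friday.</p>

<h2>Your Dallas Team</h2>
<p>Short bios of the staff who actually serve Dallas customers.</p>

<h2>Recent Work in Dallas</h2>
<p>A brief, locally specific story about a project completed for a Dallas client.</p>

<!-- Link back to the generic service page so the local pages reinforce the main content -->
<p>Learn more about <a href="/blue-widgets/">our Blue Widget services</a>.</p>
```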
Obviously the success of that strategy will depend on how competitive each market is for that term, and probably on how unique your regional content is as well.
If there is anything to be learned from the Panda update, it is that serving the user with genuinely local content is key. So providing "Service Areas" and then creating somewhat unique content for each market should be a good way to go. This also helps you consolidate all of your links.
I would rather spend time writing 50 unique pieces of content than try to get good links for EACH site.
Just my thoughts anyway.
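One practical note on the frames idea from the question: a shared menu doesn't require frames. A server-side include (sketched here as an Apache SSI with made-up paths) lets every service-area directory pull in the universal navigation plus its own local menu as ordinary, crawlable HTML:

```html
<!-- Hypothetical page at OURSITE.com/chicago/widgets.html -->

<!-- Shared, site-wide navigation maintained in a single file -->
<!--#include virtual="/includes/universal-menu.html" -->

<!-- Chicago-only navigation, maintained separately for that service area -->
<!--#include virtual="/chicago/includes/chicago-menu.html" -->

<div id="content">
  <!-- Unique Chicago content for this page -->
</div>
```

Any include or templating mechanism the CMS already provides works just as well; the point is that the shared navigation ends up in the page itself rather than in a frameset.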
EDIT: This way is also MUCH easier to track in analytics and helps you consolidate all of your tracking, so for efficiency and flexibility I say this route wins in the long run.
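To illustrate the consolidation point: with everything on one domain, a single analytics property covers every location directory. Here is a sketch using the standard asynchronous Google Analytics snippet, with a placeholder account ID:

```html
<script type="text/javascript">
  // One property (placeholder ID) tracks the whole domain, so /chicago/,
  // /seattle/, etc. can be compared side by side in a single set of reports.
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-Y']);
  _gaq.push(['_trackPageview']);

  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
             '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```

Filtering reports by the /chicago/ or /seattle/ path then gives per-market numbers without maintaining 50 separate accounts.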