Approach for an established site looking to serve different content to regions in a single country/lang
-
Hi guys,
I have an established site that currently serves the same content to all regions - west and east - in a single country with the same language.
We are now looking to vary the content across west and east regions - not dramatically, but the products offered will be slightly different.
From what I gather, modifying the URL is best suited to targeting different countries, so it feels like overkill for regions within the same country. I'm also unlikely to have much unique content outside of the varied products, so I'm mindful of duplicate/similar content, though I know I can use canonical tags to address that.
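For context, a canonical tag is just one line in the page head; a minimal sketch (with a hypothetical URL) looks like this:

```html
<!-- On a near-duplicate regional variant, pointing at the preferred version -->
<link rel="canonical" href="https://www.example.com/products/widgets" />
```

One trade-off to keep in mind: a page consolidated this way generally won't rank in its own right, so it suits variants you don't need indexed separately.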
I have a fairly modern CMS that can target content based on region, but I'm mindful of upsetting Google by showing users different content from what the bot encounters, assuming cloaking is still a concern.
So, three questions from an SEO perspective -
-
Do I really need to focus on changing my URL structure, especially as I'm already established in a competitive market, or will I do more harm than good? Is the region in the URL a strong signal?
-
If I should make some changes to the URL and/or metadata, what are the best bang-for-buck changes you would make?
-
How does Google Local fit into this? Is it a separate process via Webmaster Tools, or does it align with the changes above?
Cheers!!!
Jez
-
-
Thanks Rob, clear answer, much appreciated
Jez
-
Hello Jez,
This is a fairly straightforward process:
-
Changing your URL structure is unlikely to do you much good, especially as you are already established in your industry. You could test it and see whether rankings improve, but chances are it will be a lot of fruitless time and effort for little (if any) gain. Region can be a strong signal for brick-and-mortar businesses from a local perspective (at the city/locality level), but it is unlikely to matter when you are serving entire regions of a country - Google would likely treat the regional targeting as irrelevant outside any specific location you choose.
-
The URL is a "hard" ranking factor (a technical SEO signal), but it probably isn't what you should be focusing on here. Metadata and tags are "soft" ranking factors (they shape user experience and can feed positive user-metric signals back to Google), so you might put some time into these if you feel they could use work. For example, if you are targeting the western region, you might say so in your titles and meta tags, so potential customers in the west are more likely to click through while eastern customers are not. That should help your bounce rate, indirectly supporting your ranking potential.
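As a rough sketch of what that region-specific metadata might look like (the titles, descriptions, and business name here are hypothetical):

```html
<!-- West-region product page: title and description name the region explicitly -->
<title>Widgets for the Western Region | Example Co</title>
<meta name="description" content="Browse the widget range we offer in the Western region, with regional delivery options." />
```

The same template would carry "Eastern" wording on the east-region pages, so each snippet in the search results tells users which product range they will actually see before they click.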
-
Google Local is normally reserved for businesses operating in a given geographic area - for example, a business serving a single city. Improving your Local profile for local customers is a good idea, but given the scope of your project, it is unlikely to be worth diverting much of your time or resources to it.
Overall, region in the URL is still a ranking factor, but it is trending toward less importance over time, especially as "exact match" ranking declines. As I said above, making alterations and testing the effects will tell you for sure which direction to move, but it carries risk. I would focus on maintaining quality (non-duplicate) content for the two regions on the same domain - that will be more profitable for you in the long run than chasing exact matches in your URLs or altering your site structure.
Cheers!
Rob