SEO issues with IP-based content delivery
-
Hi,
I have two websites, say website A and website B. Website A is set up for the UK audience and website B is set up for the US audience. Both websites sell the same products, with some products and offers not available in one country or the other. Website A can't be accessed if you are in the US; similarly, website B can't be accessed if you are in the UK. This was a decision made by the client a long time ago, as they don't want to offer promotions etc. in the US and therefore don't want the US audience to be able to purchase items from the UK site.
Now the problem is that both websites have the same descriptions for the common products they sell. Search engine spiders tend to enter a site from a variety of different IP addresses/locations, so while a UK visitor will not be able to access the US version of the site and vice versa, a crawler can. I have the following options:
1. Write different product descriptions for the US website to keep both the US and UK versions of the site in the Google index for the foreseeable future. But this is going to be a time-consuming and expensive option, as there are several hundred products common to both sites.
2. Use a single website to target both the US and UK audiences and make the promotions available only to the UK audience. There is one issue here: website A's address ends with '.co.uk' while website B has a different name ending in '.com', so website A can't be used for the US audience. Website A is also older and more authoritative than the newer website B, and is pretty popular among the UK audience under its .co.uk address, so website B can't be used to target the UK audience.
3. You tell me
-
Just a thought to add to what everyone else said: make sure you go into Google Webmaster Tools and tell Google which country you want each site to rank in. I have had odd instances where a site with a .co.uk extension would still rank in the US for terms even though I didn't want it to. So I advise you to set them.
Have a nice day.
-
Normally I would have said keep only one site, but given what you've said, you need to differentiate the sites substantially, not just the wording. Brand them differently, because the whole reason the customer wants them separate is that the audiences are not the same. I'm from Germany, and I understand the difference between being pitched something in Germany versus in the United States, where I am now, and I notice my own behaviour patterns: in Germany I'm far more likely to buy something from a .de site, as I know there will be no issues, while in the United States I am far more likely to purchase something with the .com TLD, as I know I (hopefully) will not have problems. Differentiate the sites as much as you can; it sounds like the US site should be rewritten.
I hope I've helped you,
Thomas Von Zickell
-
You have the option to show different content depending on the user's location. If you use PHP on your site you can use PHP's GeoIP functions.
You can get your site personalised by country here: http://www.maxmind.com/en/geolocation_landing
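To illustrate the idea, here is a minimal sketch in Python (the thread mentions PHP's GeoIP functions; the same approach works in any language). The `COUNTRY_BY_IP` table and the sample IPs are hypothetical stand-ins for a real GeoIP lookup such as MaxMind's database:

```python
# A minimal sketch of serving location-specific content.
# COUNTRY_BY_IP stands in for a real GeoIP lookup (e.g. MaxMind's
# database linked above); the sample IPs are illustrative only.
COUNTRY_BY_IP = {
    "203.0.113.10": "GB",  # documentation-range example IPs
    "198.51.100.7": "US",
}

PROMOS = {
    "GB": "UK-only promotion: free delivery this week!",
    "US": None,  # no promotions are offered to the US audience
}

def content_for(ip):
    """Return the promo block for this visitor, or the plain page."""
    country = COUNTRY_BY_IP.get(ip, "US")  # unknown IPs default to US
    promo = PROMOS.get(country)
    return promo if promo else "Standard product page"
```

The key point is that the page content, not the URL, changes per visitor, which is why crawler handling (discussed later in this thread) matters.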
-
Great response, and I agree with Big. Large e-commerce sites with thousands of SKUs that often have only tiny differences from one product to the next are bound to have many similar product descriptions. Yes, Amazon is a perfect example. Google is smart enough to know whether what you are doing with your two sites is "an attempt to get 2 bites of the cherry." It's pretty clear that you are trying to serve the most appropriate content to the most appropriate audience. Content management would be easier with everything on one site, but with the history of these sites, it's probably best to keep them as they are. Now, if you had two domains in either the US or UK that had all the same identical product pages, that would be an issue.
-
Well, what I would do is very simple: have it all on one site and block US IPs from the UK content and the other way around.
So you'd have two flags on your site, US and UK, and if a user from the UK tries to go to the US site you show a message saying "this option is not available in your location" (I am not a copywriter, so use your own words), or just hide the prices for IPs from a different country.
Hope that helps.
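The hide-the-prices idea above could be sketched like this (Python; the `country_of` helper and its sample IPs are hypothetical stand-ins for a real GeoIP service):

```python
def country_of(ip):
    """Stand-in for a real GeoIP lookup; sample IPs are illustrative."""
    sample = {"203.0.113.10": "GB", "198.51.100.7": "US"}
    return sample.get(ip, "US")

def render_product(ip, site_country, price):
    """Show a notice instead of a price to out-of-market visitors."""
    if country_of(ip) != site_country:
        return "This option is not available in your location"
    return "Price: " + price
```

So a UK visitor on the UK pages sees the price, while the same visitor on the US pages sees the unavailable notice.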
-
Oh yeah!
Thanks Keri, I really didn't check the date.
-
Hi Khem,
This question is over a year old. Your best bet is to start a new thread with just this question. Thanks!
-
I would suggest running only one website and using the visitor's IP address to insert relevant content into the site, so the website's content changes according to where the visitor comes from.
Creating multiple domains in the same language with identical content might attract a penalty.
Or else, do whatever you're thinking, but make sure you keep the content unique, even if you're using IP delivery.
-
So, you mean to say that, being in the same industry, I can copy the whole content of any UK website and then restrict UK people from accessing my website, as my target audience is in the US?
Please advise.
-
If you keep the two sites separate, will you be penalized by Google for having duplicate content? If that is the case, how should you deal with it?
-
Personally, I would simply redirect visitors to the proper website based on their IP address. There are a few server-side tools, or plugins if you're using a blog, that can change the entire site's title and body content to reflect the differences between sites at the click of a button.
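A minimal sketch of the redirect decision (Python; the site URLs, the `country_of` helper, and its sample IPs are hypothetical stand-ins for the real domains and a real GeoIP lookup):

```python
# Hypothetical site URLs; in practice these would be the real
# .co.uk and .com domains from the question.
SITE_FOR_COUNTRY = {
    "GB": "https://www.example.co.uk",
    "US": "https://www.example.com",
}

def country_of(ip):
    """Stand-in for a real GeoIP lookup; sample IPs are illustrative."""
    sample = {"203.0.113.10": "GB", "198.51.100.7": "US"}
    return sample.get(ip, "US")

def redirect_target(ip, current_site):
    """Return the URL to redirect this visitor to, or None to stay put."""
    wanted = SITE_FOR_COUNTRY[country_of(ip)]
    return wanted if wanted != current_site else None
```

The server would issue an HTTP redirect when this returns a URL, and serve the page normally when it returns None.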
affportal.com has many such tools and insights to help you with this.
hope this helps.
-
I say keep them separate. The ccTLD (.co.uk) clearly shows geographic relevance, and the .com, which is a global TLD, can be targeted to the USA by using the Geographic Target setting in Google Webmaster Tools (the .co.uk is already set to the UK and cannot be pointed at another country, only 'unlisted').
Furthermore, I wouldn't want to lose any of the benefits of the aged and established .co.uk by merging it with the .com. You will never get 100% of the link juice back with 301 redirects: maybe 90% at best, and only after some weeks have passed. When you consider that you are already well established with your .co.uk site, you would be mad to mess with it without VERY GOOD REASON!
Can't you just restrict shipping to the US on the UK site, rather than blocking access entirely? I have two e-commerce sites set up this way (one .co.uk and one .com, which operates on a dropship basis only, as we are UK based).
With regard to the duplicate content issue, consider that Amazon.co.uk and Amazon.com have hundreds of thousands of product pages with the same or very similar content (descriptions etc.), and the last time I checked they were ranking pretty well ;o) without the need to block users from certain locations. They do, of course, pick up your IP address and SUGGEST (with a big flag and arrows) that you visit the UK site when you select the Amazon.com site from the UK, and they still restrict shipping of certain items should you persist and try to order anyway.
Your prices will also differ between £ and $ - as will the converted price - another clear indication that this isn't an attempt to get 2 bites of the cherry.
I would also move the .com site to a US-based server, as this helps with ranking anyway (server/website speed and location are factors).
Maybe bung a flag in your header graphics to further denote your geo-targeting?
Changing the spelling to UK/US variants is sensible anyway, though difficult to research initially: I spent some time battling between -ize and -ise!
Keep the .co.uk and .com separate, state that you do not ship to the US from the UK site, and restrict purchases accordingly (by shipping address). That should make it clear enough. Hope that helps!
-
Hi Devaki,
Are you still deciding what to do here, or have you gone ahead and made a decision? Let us know if we can help you out anymore, or tell us what your decision was -- we'd be interested to hear what choice was made and how it's worked out.
Thanks!
-
Duplicate content shouldn't be an issue with regard to maintaining a US and a UK site; the search engines will decide which version to show. Admittedly I'm not 100% convinced they are perfect at doing this at the moment, but I'm confident enough that I would do it myself in this case, so don't worry about rewriting all your content (though remember the UK and US are two nations divided by a common language).
As a precaution you could geo-route customers by IP to one site or the other, but remember Googlebot will probably crawl from a US address, so you might want to let theirs pass.
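The "let Googlebot pass" caveat might look like this as a sketch. Note this naive User-Agent check is an assumption for illustration; Google's own guidance is to verify crawlers by a reverse-DNS lookup on the requesting IP rather than trusting the User-Agent header alone:

```python
def is_probable_googlebot(user_agent):
    """Naive User-Agent check -- real verification should confirm the
    requesting IP via reverse DNS (googlebot.com / google.com)."""
    return "googlebot" in user_agent.lower()

def should_georoute(user_agent):
    """Geo-route ordinary visitors, but let crawlers through so both
    country versions of the site remain crawlable."""
    return not is_probable_googlebot(user_agent)
```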
To be honest, unless you have a reason to change it, I wouldn't alter the set-up you currently have. Bringing them under one domain is going to be awkward (though possible) for showing geo-specific content, and as I mentioned, I don't believe you even need to rewrite the content.
Just build UK links to the .co.uk site and you should be alright for the most part.
Is there a particular reason you feel you need to change the set-up?
-
Sorry, I must not be awake. What is the problem? You have two sites with common products and you only offer certain promotions in the UK on the UK ccTLD (country-code top-level domain). Are these promos showing up on the US site?
-
You have a few options.
You could build out one website and, whenever a visitor comes from a specific country, show that visitor separate pieces of the site, different products, etc., but this is not easily done and is a nightmare to manage.
You could keep the two separate websites and focus on rewriting the content. This would be my first option if it were me. You have two websites in separate countries selling the same products but with different offers, discounts, currencies, etc., so it makes the most sense to have a clear line of separation. It shouldn't be too difficult to hire a freelance writer to go through one of the websites and rewrite the content: make it more relevant to that country's users, add videos, helpful information, etc. You only have to rewrite content for one of the sites to make sure they are not full of duplicate content. Then, down the road, you could hire the same writer to optimize the content for the other website, approaching it with different content that is just as relevant, and you should have a win-win-win situation.
Does that make sense?