SEO issues with IP-based content delivery
-
Hi,
I have two websites, say website A and website B. Website A is set up for the UK audience and website B is set up for the US audience. Both websites sell the same products, with some products and offers not available in one country or the other. Website A can't be accessed if you are in the US, and similarly website B can't be accessed if you are in the UK. This was a decision made by the client a long time ago: they don't want to offer their promotions in the US, and therefore don't want the US audience to be able to purchase items from the UK site.
Now the problem is that both websites use the same descriptions for the common products they sell. Search engine spiders tend to enter a site from a variety of different IP addresses/locations, so while a UK visitor can't access the US version of the site and vice versa, a crawler can. I now have the following options:
1. Write different product descriptions for the US website to keep both the US and UK versions of the site in the Google index for the foreseeable future. But this would be a time-consuming and expensive option, as there are several hundred products common to both sites.
2. Use a single website to target both the US and UK audiences and make the promotions available only to the UK audience. There is one issue here: website A's address ends with '.co.uk' and website B has a different name ending in '.com', so website A can't be used for the US audience. Website A is also older and more authoritative than the newer website B, and is pretty popular among the UK audience under its .co.uk address, so website B can't be used to target the UK audience either.
3. You tell me.
-
Just a thought to add to what everyone else has said: make sure you go into Google Webmaster Tools and tell Google which country you want each site to rank for. I have had odd instances where a site with a .co.uk extension still ranked in the US for terms even though I didn't want it to. So I advise you to set the geographic targets.
Have a nice day.
-
Normally I would have said keep only one site, but given what you've said, you need to differentiate the sites substantially, not just in wording. Brand them differently, too: the whole reason the customer wants them not to be the same is that the audiences are not the same. I'm from Germany, so I understand the difference between being pitched something in Germany versus in the United States, where I am now, and I notice my own behaviour follows those patterns. In Germany I'm far more likely to buy from a .de site, as I know there will be no issues; when I'm in the United States I am far more likely to purchase from a .com TLD, as I know I will (hopefully) not have problems. Differentiate the sites as much as you can; it sounds like the US site is the one that should be rewritten.
I hope I've helped you,
Thomas Von Zickell
-
You have the option to show different content depending on the user's location. If you use PHP on your site, you can use the PHP GeoIP functions.
You can get your site personalised by country here: http://www.maxmind.com/en/geolocation_landing
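For instance, here's a minimal sketch assuming the PECL geoip extension is installed; the included template filenames (content_uk.php etc.) are placeholders for your own files:

```php
<?php
// Look up the visitor's country from their IP address.
// geoip_country_code_by_name() is part of the PECL geoip extension
// and returns a two-letter ISO country code (e.g. 'GB', 'US').
$country = geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);

if ($country === 'GB') {
    include 'content_uk.php';      // UK products, offers and £ prices
} elseif ($country === 'US') {
    include 'content_us.php';      // US products and $ prices
} else {
    include 'content_default.php'; // fallback for everyone else
}
```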
-
Great response, and I agree with Big. Large e-commerce sites with thousands of SKUs that often have only tiny differences from one product to the next are bound to have many similar product descriptions; yes, Amazon is a perfect example. Google is smart enough to know whether what you are doing with your two sites is "an attempt to get two bites of the cherry." It's pretty clear that you are trying to serve the most appropriate content to the most appropriate audience. Content management would be easier with everything on one site, but with the history of these sites, it's probably best to keep them as they are. Now, if you had two domains in either the US or the UK with identical product pages, that would be an issue.
-
Well, what I would do is very simple: have it all on one site and block US IPs from the UK content and vice versa.
So you'd have two flags on the site, US and UK, and if a user from the UK tries to view the US content, you show a message saying something like "this option is not available in your location" (I am not a copywriter, so use your own words), or just hide the prices from IPs in a different country.
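A rough sketch of that price-blocking variant, again assuming the PECL geoip extension; the wording, price, and markup are placeholders:

```php
<?php
// Hide the price and buy button when the visitor's country
// doesn't match the country this site serves.
$siteCountry = 'GB'; // this instance serves the UK
$country = geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);

if ($country !== $siteCountry) {
    echo '<p>This option is not available in your location.</p>';
} else {
    echo '<p>Price: £49.99</p>';
    echo '<button>Add to basket</button>';
}
```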
Hope that helps.
-
Oh yeah!
Thanks Keri, I really didn't check the date.
-
Hi Khem,
This question is over a year old. Your best bet is to start a new thread with just this question. Thanks!
-
I would suggest running only one website and using the visitor's IP address to insert relevant content into the site, so the website content changes according to where the visitor comes from.
Creating multiple domains in the same language with identical content might attract a penalty.
Or else, do whatever you're thinking, but make sure to keep the content unique, even if you're using IP delivery.
-
So you mean to say that, being in the same industry, I can copy the whole content of any UK website and then restrict UK people from accessing my website, since my target audience is in the US?
Please advise.
-
If you keep the two sites separate, will you be penalized by Google for having duplicate content? If so, how should you deal with it?
-
Personally, I would simply redirect visitors to the proper website based on their IP address. There are a few server-side tools (or plugins, if you're running a blog) that can change the entire site's title and body content to reflect the differences between sites at the click of a button.
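A bare-bones sketch of that redirect, assuming the PECL geoip extension; the .com domain here is a placeholder:

```php
<?php
// On the UK (.co.uk) site: send US visitors to the .com site,
// keeping the path they asked for. A 302 is used because the page
// still exists for other visitors; it's the visitor being rerouted.
$country = geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);

if ($country === 'US') {
    header('Location: https://www.example.com' . $_SERVER['REQUEST_URI'], true, 302);
    exit;
}
```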
affportal.com has many such tools and insights to help you with this.
Hope this helps.
-
I say keep them separate. The ccTLD (.co.uk) clearly shows geographic relevance, and the .com, being a global TLD, can be targeted to the USA using the geographic target setting in Google Webmaster Tools (the .co.uk is already set to the UK and cannot be pointed at another country, only 'unlisted').
Furthermore, I wouldn't want to lose any of the benefits of the aged and established .co.uk by merging it with the .com. You will never get 100% of the link juice back with 301 redirects; maybe 90% at best, and only after some weeks have passed. Considering you are already well established with your .co.uk site, you would be mad to mess with it without VERY GOOD REASON!
Can't you just restrict shipping to the US from the UK site? I have two e-commerce sites set up this way (a .co.uk and a .com, the latter operating on a dropship basis only, as we are UK based).
With regard to the duplicate content issue, consider that Amazon.co.uk and Amazon.com have hundreds of thousands of product pages with the same or very similar content (descriptions etc.), and last time I checked they were ranking pretty well ;o) without the need to block users from certain locations. They do, of course, pick up your IP address and SUGGEST (with a big flag and arrows) that you visit the UK site when you open the Amazon.com site from the UK, and they still restrict shipping of certain items should you persist and try to order anyway.
Your prices will also differ between £ and $, as will the converted price: another clear indication that this isn't an attempt to get two bites of the cherry.
I would also move the .com site to a US-based server, as this helps with ranking anyway (server/website speed and location are factors).
Maybe bung a flag in your header graphics to further denote your geo-targeting?
Changing the spelling to UK/US variants is sensible anyway, though difficult to research initially; I spent some time battling between -ize and -ise!
Keep the .co.uk and .com separate, state on the UK site that you do not ship to the US, and restrict purchases accordingly (by shipping address). That should make it clear enough. Hope that helps!
-
Hi Devaki,
Are you still deciding what to do here, or have you gone ahead and made a decision? Let us know if we can help you out any more, or tell us what your decision was; we'd be interested to hear what choice was made and how it's worked out.
Thanks!
-
Duplicate content shouldn't be an issue with regard to maintaining a US and a UK site; the search engines will decide which version to show. Admittedly, I'm not 100% convinced they are perfect at doing this at the moment, but I'm confident enough that I would do it myself in this case, so don't worry about rewriting all your content (though remember the UK and US are two nations divided by a common language).
As a precaution you could geo-route customers by IP to one site or the other, but remember Googlebot will probably crawl from a US address, so you might want to let it pass.
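One way to let Googlebot through is the reverse-DNS plus forward-DNS check that Google itself recommends, rather than trusting the user-agent string. A sketch using core PHP functions:

```php
<?php
// Verify Googlebot: reverse-DNS the IP, check the hostname ends in
// googlebot.com or google.com, then forward-resolve it back to the
// same IP. Only geo-route visitors that fail this check.
function isGooglebot(string $ip): bool
{
    $host = gethostbyaddr($ip);
    if ($host === false || !preg_match('/\.google(bot)?\.com$/', $host)) {
        return false;
    }
    return gethostbyname($host) === $ip;
}

if (!isGooglebot($_SERVER['REMOTE_ADDR'])) {
    // ...apply the GeoIP routing described above...
}
```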
To be honest, unless you have a reason to do it, I wouldn't change the setup you currently have. Bringing them under one domain would make it awkward (though possible) to show geo-specific content, and as I mentioned, I don't believe you even need to rewrite the content.
Just build UK links to the .co.uk site and you should be alright for the most part.
Is there a particular reason you feel you need to change the set-up?
-
Sorry, I must not be awake. What is the problem? You have two sites with common products and you only offer certain promotions in the UK on the UK ccTLD (country code top level domain). Are these promos showing up on the US site?
-
You have a few options.
You could build out one website and, whenever a visitor comes from a specific country, show them different pieces of the site, different products, etc., but this is not easily done and is a nightmare to manage.
You could keep the two separate websites and focus on rewriting the content. This would be my first option, if it were me. You have two websites in separate countries selling the same products but with different offers, discounts, currencies, etc., so it makes the most sense to keep a clear line of separation. It shouldn't be too difficult to hire a freelance writer to go through one of the websites and rewrite the content: make it more relevant to that country's users, add videos, helpful information, etc. You only have to rewrite the content for one of the sites to make sure they are not full of duplicate content. Then, down the road, you could hire the same writer to optimize the content for the other website, approaching it with different content that is just as relevant, and you should have a win-win-win situation.
Does that make sense?