International Website Targeting
-
Hello fellow Mozzers, had a quick question.
We have a new eCommerce client that is interested in launching a website in multiple countries. According to their vision, they want a US site, a UK site, a Japan site, and so on.
I have a few concerns about doing it this way.
First, there is the issue of the sites being the same. The only difference will be the domain, such as domain.co.jp for the Japan-based site, domain.co.uk for the UK, and so on.
Even if we target different countries in Google Webmaster Tools, won't the sites still compete with one another and potentially get flagged as duplicates?
I'm thinking there has to be a better way to target a site at the whole world without having to clone, duplicate, and relaunch. Anyone have experience with this?
-
Thank you for your response. It appears the best way to go about this is to make the main site amazing and optimize it around what they sell, correct? This is what I had in mind, but just wanted to check with the community to be sure.
-
Hi David.
You are right: creating three identical sites on three different geo-targeted domain names can be a problem.
However, implementing hreflang tags to tell Google which URL to show depending on the geography of the targeted user (see more here: https://support.google.com/webmasters/answer/189077?hl=en) will avoid the risk of duplicate content, and will prevent the stronger domain from outranking the one meant, for instance, for Japan in Japan.
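As a minimal sketch, the hreflang annotations in the `<head>` of each version of a page could look like the following. The domains are the hypothetical ones from the question, and the URL paths are placeholders; the key point is that the same full set of tags must appear on all three sites, since the annotations have to be reciprocal or Google will ignore them.

```html
<!-- Placed in the <head> of every version of the page (US, UK, and Japan sites alike).
     Each version lists itself AND all of its alternates. -->
<link rel="alternate" hreflang="en-us" href="https://www.domain.com/product/" />
<link rel="alternate" hreflang="en-gb" href="https://www.domain.co.uk/product/" />
<link rel="alternate" hreflang="ja-jp" href="https://www.domain.co.jp/product/" />
<!-- Fallback for users who match none of the targeted locales -->
<link rel="alternate" hreflang="x-default" href="https://www.domain.com/product/" />
```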
That said, this "full duplication" strategy can only be very temporary, because it doesn't make much sense to target users living in different countries with exactly the same content:
- American English is different from British English;
- In Japan, people simply don't search in English, and Google may not even be their main search engine (Yahoo! is dominant there);
- The culture, and hence how people use the internet and search the web, is different in the three countries (slightly different in the case of the USA and UK, enormously so in the case of Japan).
So, yes: the proposed strategy is not the most effective one, despite the advantages of the hreflang markup implementation.
Related Questions
-
What is the best strategy for dissolving an innocently created link network with over 100 websites?
Hello Moz Community, Over many years, 120 websites were created under a couple of different organizations around the globe. The sites are interconnected via anchor text and domain name links, and some redirect to larger sites. The teachings have a central theme, and many tools, training programs, events, locations, and services are offered on many different websites. Attached is a slice of a Majestic Link Graph showing the network. God bless Majestic for this new tool! We are looking for solutions that are efficient and effective in regards to usability, rankings, and being achievable. Thank you so much for your help! Donna
White Hat / Black Hat SEO | Awakening-Mind
-
Why is Google showing my website late in search results?
Hello, I hope you all are doing great. Recently, I published an article on my website, and within almost 10 minutes it was indexed completely; I also personally checked it in Google Search Console. The URL was indexed, but the problem is it does not appear in Google Search. Sometimes I notice Google showing a result in the SERP that was published 10-30 minutes ago, but this is not the case with my website. All articles only show in the Google SERP after 1-2 days. What can be the reason behind this, although DA/PA are good (28-31)?
White Hat / Black Hat SEO | HansiAliya
-
Scraping Website and Using Our Clients Info
One of our clients on Moz has noticed that another website has been scraping their website and pulling lots of their content without permission. We would like to notify Google about this company but are not sure if that is the right remedy to correct the problem. They appear in Google search results under the client's name, so they seem to be using page titles etc. with the client's name in them. Several of the SERP links point to their own website, but it pulls in our client's web page. I was hoping someone could provide some additional options on how to attack this problem?
White Hat / Black Hat SEO | InTouchMK
-
How to find out if a website has paid or spammy backlinks? Latest ways to investigate.
Hi all, I would like to investigate our website's backlinks to see if something is wrong, i.e. whether there are any paid or spammy backlinks. How should we proceed with this exercise? We have been using Ahrefs, and it seems like it's quite enough. Is there any way we can pull out the fishy backlinks? Do we have any helpful data from Webmaster Tools about this? Thanks
White Hat / Black Hat SEO | vtmoz
-
Best method to target similar keywords?
Hi guys, We have a client that wants to target variations of 3 similar terms (used, secondhand, and pre-owned). We have been having a discussion about the different methods to try but can't make a decision on the best route. The target page has a list of pre-owned products, so whichever route we take, these products still need to be visible without creating duplicate content issues...
1 - Go all in on one page and do our best at optimising a single page for all 3 - I don't like this route.
2 - Stick with the current pre-owned URL and create a URL each for used and secondhand, with a 301 redirect back to the pre-owned URL.
3 - Create three individual pages, each aimed at one keyword; keep pre-owned as the original and add canonical links from used and secondhand.
I look forward to hearing your thoughts. Thanks in advance
White Hat / Black Hat SEO | Kal-SEO
-
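For the canonical route described above, a minimal sketch of the markup, assuming hypothetical URLs for the used/secondhand/pre-owned pages:

```html
<!-- Placed in the <head> of the hypothetical /used-products/ and
     /secondhand-products/ pages, pointing Google at the pre-owned
     page as the canonical version -->
<link rel="canonical" href="https://www.example.com/pre-owned-products/" />
```

One design trade-off worth noting: Google treats rel=canonical as a strong hint rather than a directive, and canonicalised pages generally consolidate their signals into the target rather than ranking independently, which partly defeats the goal of targeting three terms on three separate pages.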
Website Vulnerability Leading to Doorway Page Spam. Need Help.
Keywords he is ranking for: houston dwi lawyer, houston dwi attorney, etc. The client was acquired in June, and since then we have done nothing but build high-quality links to the website. None of our clients were dropped/dinged or impacted by the Panda/Penguin updates in 2012 or updates previously published by Google, which proves we do quality SEO work. We went ahead and started duplicating links which worked for other legal clients, and 5 months later this client is either dropping or staying flat in local maps results, and we are performing very badly in organic results.
Some more history: when he first engaged our company, we switched his website from a CMS called Plone to WordPress. During our move, I ran some searches to figure out which pages we needed to 301, and we came across many profile or member pages created on the client's CMS (Plone). These pages were very spammy and linked to other Plone sites using car model/make/year type keywords (e.g. "jeep cherokee dealerships"). I went through these sites to see if they were linking back and could not find any backlinks to my client's website. Obviously nobody authorized these pages; they all looked very hackish, and it seemed as though there was a vulnerability in his Plone CMS installation which nobody caught.
Fast forward 5 months, and the newest OSE update is showing me a good 50+ backlinks with unrelated anchor text. These anchor text links are the same color as the background and can only be found if you hover your mouse over certain areas of the site. All of these sites are built on Plone, and a lot of them are linked to other businesses or community websites. These websites obviously have no clue they have been hacked or are being used for black hat purposes. There are dozens of unrelated anchor text links on external websites which are pointing back to our client's website.
Examples: autex Isuzu, Toyota service department ratings, die cast BMW, etc.
Obviously the first step is to use the disavow link tool, which will be completed this week. The second step is to get some feedback from the SEO community. It seems like these pages are automatically created using some type of bot. It will be very tedious if we have to continually remove these links. I hope there is a way to notify Google that these websites are all Plone and have a vulnerability which black hats are using to harm the innocent... If I cannot get Google to handle this, then the only other option is to start fresh with a new domain name. What would you do in this situation? Your help is greatly appreciated. Thank you
White Hat / Black Hat SEO | waqid
-
How to rank internal pages?
Hello, I have a website about consoles. On the homepage are a few thoughts about what consoles are and a short history. The main attraction are the pages about the Xbox 360, PlayStation 3, Nintendo Wii, and PS Vita. So, I want to rank my homepage and my internal pages about the consoles, ranking for "xbox360", "play station 3", each one on a separate page of course. Basically I want to rank for brands. My main questions are: 1. How much link building should I do for my homepage, considering that I'm not really interested in ranking it as much as the internal pages? In percentage, how would that look? Random (stupid) example: 60% of links to the homepage, 10% to each internal page? 2. I guess I must build links to the internal pages too; otherwise they won't rank well on links to the homepage alone. 3. Considering the Penguin update, my main keyword should be around what % of the overall anchors to each internal page? Thank you very much for your help!
White Hat / Black Hat SEO | corodan
-
Geo-targeted Organic Search Traffic to a sub-domain
For a client of ours, we are likely to create a sub-domain targeted at a specific country. Most of the content on this sub-domain will come from the main site, although with some specific differentiation to suit that geographic market. We intend to tell Google through Webmaster Central that the sub-domain is targeted at a specific country. Some questions: a) Any idea how long it could take before Google gives precedence to the content on this sub-domain for queries originating from that particular country? b) What is the likely impact of content duplication? What extent of differentiation is necessary from a search engine perspective? Thanks.
White Hat / Black Hat SEO | ontarget-media