Managing international sites: best practices
-
This question follows on from my earlier question http://www.seomoz.org/q/how-to-replace-my-co-uk-site-with-my-com-site-in-the-us-google-results
My client owns www.blindbolt.co.uk for the UK site and www.blindboltusa.com for their US site. They will shortly be having a new site for Australia.
They have just acquired www.blindbolt.com and have expressed an interest in using this as the main hub for all of their sites, e.g. http://uk.blindbolt.com, http://aus.blindbolt.com.
The existing sites (e.g. www.blindbolt.co.uk) could then be 301'd to the new locations.
Could I have your thoughts, please, on whether to go down this route of international subdomains vs. keeping the sites on separate top-level domains? What should I take into consideration? Is Google smart enough to return different subdomain results in different countries?
Many thanks!
-
It's really tough to predict. I think Eyepaq covered the basics well - you'll consolidate your link-juice, but you may harm your UK-specific ranking slightly. Whether the consolidation offsets the loss really depends a lot on your link profile, how Google treats your UK site (sometimes, English content, even across countries, is seen as partially duplicated), and your market focus.
If 80%+ of your market is US-based, you're not ranking that well in the UK, and you don't have the resources to really push two domains, my gut reaction would be to favor consolidation. If you have two separate marketing efforts in the two countries and half or more of your sales are UK-based, then you'd be taking a real risk.
I would check out the newish rel="alternate" hreflang="..." option:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
It's specifically for helping same-language content rank correctly across regions, and Google seems to be pushing it publicly.
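As a rough sketch of how that would look here (these URLs are hypothetical, based on the subdomain structure mentioned in the question), each regional page carries a link element for every alternate version, including itself:

```html
<!-- In the <head> of http://uk.blindbolt.com/ — the same set of tags
     would appear on each of the alternate pages as well -->
<link rel="alternate" hreflang="en-gb" href="http://uk.blindbolt.com/" />
<link rel="alternate" hreflang="en-us" href="http://www.blindbolt.com/" />
<link rel="alternate" hreflang="en-au" href="http://aus.blindbolt.com/" />
```

The annotations have to be reciprocal — if the UK page points at the US page, the US page must point back — or Google may ignore them.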
-
Recently a post was published on the YouMoz blog which is worth reading. Its main idea is that the best strategy for international SEO depends heavily on the company and the resources it is willing to invest.
The main distinction is made between:
Subfolders of one international domain (.com/us/, .com/au/, etc.):
This is the easiest way to maintain different websites (low cost, easy to use). However, you will also have the most difficulty ranking the right page in the right country.
Separate country-specific domains (.co.uk, .com.au, .com):
If there are resources available for maintaining separate websites, building domain authority and creating content for all of them, this would be the best option.
Subdomains (au.blindbolt.com, etc.):
This is the middle way: harder to maintain than subfolders, but not necessarily more expensive.
I'd advise you to read the post carefully and also to study up on best practices in international SEO, since a lot of people have difficulty ranking the right pages in the right countries.
-
Thanks very much eyepaq, some very interesting points.
Thanks for taking the time to reply.
-
Hi,
You can set a geographic target for each subdomain in Google Webmaster Tools (e.g. for uk.blindbolt.com you can select the UK, and so on). Google will know and will act according to your settings - I've tested this personally and it works well.
However, a .com domain with a subdomain, even if geo-targeted in Webmaster Tools, won't perform as well as a .co.uk domain for the UK. CTR in the SERPs will also be affected: especially in Australia, a .com.au domain will get more clicks just because of the domain type, which is dedicated to Australia.
Another reason for not going with subdomains is that even if you 301 www.blindbolt.co.uk to the subdomain of the .com, you will still lose some link juice and you will see some temporary dips in traffic.
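If you do go that route, the 301s would typically be done server-side, mapping each old URL to its equivalent path on the new subdomain. A minimal sketch for Apache (assuming the old .co.uk site runs on Apache with mod_rewrite - adjust for your actual server):

```apache
# .htaccess on www.blindbolt.co.uk — permanently redirect every URL
# to the equivalent path on the new UK subdomain, preserving the path
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?blindbolt\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://uk.blindbolt.com/$1 [R=301,L]
```

Page-to-page redirects like this preserve more link equity than redirecting everything to the new homepage.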
The positive side, though with a gambling twist, is that if Google doesn't treat the subdomains as separate domains (which can happen based on user behavior and additional factors), links to the .com domain will benefit the entire domain and therefore all countries.
Hope it helps.