I am launching an international site. What is the best domain strategy?
-
Hi Guys,
I am launching a site across the US, UK and UAE. Do I go with **test.com/us, test.com/uk and test.com/uae**, or with us.test.com, uk.test.com and uae.test.com? Which is best for SEO?
-
Suggesting subfolders without considering the peculiarities of the site, just because it's a "best practice", is not ideal. Even though subfolders have clear advantages, using them can be technically impractical or simply less effective for some sites.
-
As always, it depends.
Is your site an ecommerce site with hundreds of thousands of products (if not more), or a news site? Then the best choice may be country-code top-level domains (ccTLDs): apart from being better for geotargeting, the technical maintenance of three complete ecommerce or news sites inside a subfolder structure is not the most agile setup (especially if the site is custom built).
If your case is the one described above but the ccTLDs are not available, then subdomains can be an alternative.
If your site is not a very technically complex ecommerce or news site, then use subfolders; but if you see that one of the subfolders develops very good metrics (sessions, conversions), consider moving it to its own ccTLD in the medium/long term.
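Whichever structure you choose, the country versions should declare each other as alternates with hreflang (covered by the Moz article mentioned later in this thread). Below is a minimal sketch that builds those tags for the subfolder setup from the question; the domain, folders and page path are the hypothetical ones from the example, not a real configuration.

```python
# Minimal sketch: build the hreflang <link> tags for one page of a
# subfolder setup. Domain and folders are the hypothetical ones from
# the question (test.com/us, /uk, /uae).

BASE = "https://www.test.com"

# language-region code -> country subfolder
LOCALES = {
    "en-us": "/us/",
    "en-gb": "/uk/",
    "en-ae": "/uae/",
}

def hreflang_tags(page_path: str) -> str:
    """Return the hreflang block for the <head> of every country version of a page."""
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{BASE}{folder}{page_path}" />'
        for code, folder in LOCALES.items()
    ]
    # x-default: the version served to users outside the targeted countries.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{BASE}/us/{page_path}" />'
    )
    return "\n".join(tags)

if __name__ == "__main__":
    print(hreflang_tags("delivery/"))  # the same block goes on the /us/, /uk/ and /uae/ versions
```

Every country version of a page carries the same block (including the tag that points at itself), and if one folder is later promoted to its own ccTLD, only the URLs in the mapping change.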
-
There are positives and negatives to the different strategies. Moz's education section has three articles on international SEO: International SEO, Hreflang, and ccTLDs. I'd suggest going through them and also reading any further resources that they cite.
Good luck!
-
Also, if we want to target GCC countries with Arabic content, what domain strategy should we apply?
We already have www.tcf-me.com (for English) and www.tcf-me.ae (for Arabic), but now we want to target not only the UAE (.ae) but the entire GCC.
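This is not answered directly in the thread, but one common pattern is to keep a single Arabic version and declare it as the hreflang alternate for each GCC country (several country codes may point to the same URL). The hedged sketch below reuses the tcf-me.com / tcf-me.ae domains from the question; the page path and the list of countries are illustrative assumptions, not a recommendation made in this thread.

```python
# Hedged sketch: one English page on tcf-me.com and one Arabic page on
# tcf-me.ae, with the Arabic URL declared as the alternate for each GCC
# country plus a generic "ar" fallback. Paths are illustrative.

EN_URL = "https://www.tcf-me.com/services/"
AR_URL = "https://www.tcf-me.ae/services/"

GCC_CODES = ["ar-ae", "ar-sa", "ar-kw", "ar-qa", "ar-bh", "ar-om"]

def gcc_hreflang_block() -> str:
    tags = [f'<link rel="alternate" hreflang="en" href="{EN_URL}" />']
    tags += [
        f'<link rel="alternate" hreflang="{code}" href="{AR_URL}" />'
        for code in GCC_CODES
    ]
    # Generic Arabic fallback for Arabic speakers outside the listed countries.
    tags.append(f'<link rel="alternate" hreflang="ar" href="{AR_URL}" />')
    return "\n".join(tags)

if __name__ == "__main__":
    # The same block must appear on both the .com and the .ae page
    # for the annotations to be reciprocal.
    print(gcc_hreflang_block())
```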
-
Figen, we are in the same situation, asking the same questions.
How do you tackle duplicate content?
-
Use subdomains (e.g. us.domain.com, fr.domain.com, de.domain.com) if your site will be in different languages.
If your site will be English only, use subfolders: domain.com/us, domain.com/uk.
Be careful about duplicate content.
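On the duplicate-content question above: the usual way to keep near-identical country versions out of trouble is a self-referencing canonical on each version plus reciprocal hreflang annotations, so Google treats them as alternates rather than duplicates. A minimal sketch, using the hypothetical subdomains from this answer:

```python
# Minimal sketch: for near-identical country/language versions, give each
# page a canonical pointing to ITSELF plus hreflang links to all versions.
# Subdomains and the page path are the hypothetical ones from the answer.

VERSIONS = {
    "en-us": "https://us.domain.com/pricing/",
    "fr-fr": "https://fr.domain.com/pricing/",
    "de-de": "https://de.domain.com/pricing/",
}

def head_tags(own_code: str) -> str:
    """Tags for the <head> of one country/language version of the page."""
    own_url = VERSIONS[own_code]
    tags = [f'<link rel="canonical" href="{own_url}" />']  # never canonicalise to another country
    tags += [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in VERSIONS.items()  # includes a self-referencing hreflang tag
    ]
    return "\n".join(tags)

if __name__ == "__main__":
    for code in VERSIONS:
        print(f"--- {code} ---\n{head_tags(code)}\n")
```

If two versions really are word-for-word identical and only one needs to be indexed, a cross-version canonical is the alternative, but the non-canonical versions then will not rank in their own countries.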