Best practice for URL - Language/country
-
Hi,
We are planning to have our website localized into more languages. We already have an English and a German version. The German version currently sits on a subdomain:
www.example.com --> English version
de.example.com --> German version
Is this recommended? Or is it always better to have URLs with language prefixes (subdirectories such as www.example.com/de/)?
Which is the better practice in terms of SEO?
-
Hi Peter,
Both answers above are really good, but maybe I can point you a little further in the right direction. Perhaps you could answer the questions below, and I can give you my personal opinion on which method would be best:
- Will you be putting an equal amount of marketing (content, PR, etc.) into the Spanish version, for example, compared with the English one?
- Are you able to offer a fully localised service, e.g. Spanish customer service, a Spanish sales team, etc.?
- Is your company well known globally?
It's also important not to forget that another option is using ccTLDs (e.g. .co.uk, .com.au). These send the strongest signal to search engines about the country being targeted and, just as importantly, make you look more "local", which can do wonders for conversion rates in countries where your company is not well known.
-
I think that Tom gave you one of the best answers possible.
However, I hope this also helps: your site structure should be very similar to the one described at the two URLs below. If I may, I'd like to add a little information that I thought was helpful:
- https://support.google.com/webmasters/answer/189077?hl=en
- https://www.deepcrawl.com/knowledge/best-practice/hreflang-101-how-to-avoid-international-duplication/
WHERE TO ADD YOUR HREFLANG TAGS
You can add hreflang tags to your sitemaps, in the HTTP response headers, or on the page itself.
IN YOUR SITEMAPS
The best place to add hreflang annotations is in your sitemaps, as including them in the headers or on the page adds weight to every single page request.
The following example informs Google of the English alternate from the German version of the website:
<url>
  <loc>http://www.example.com/deutsch/</loc>
  <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/" />
  <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/" />
</url>
This method would need to be repeated in full for every page on the site and for all the international websites.
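Because those entries have to be repeated for every page and every language, most people generate them rather than write them by hand. Here is a minimal sketch of how that generation could look in Python; it is only an illustration (not code from either guide above), and the English/German URL mapping is just example data:

# Hypothetical example data: every language version of the same page.
ALTERNATES = {
    "en": "http://www.example.com/english/",
    "de": "http://www.example.com/deutsch/",
}

def url_entry(own_url, alternates):
    # One <url> block that lists every language alternate, including itself.
    lines = ["  <url>", "    <loc>%s</loc>" % own_url]
    for lang, href in alternates.items():
        lines.append('    <xhtml:link rel="alternate" hreflang="%s" href="%s" />' % (lang, href))
    lines.append("  </url>")
    return "\n".join(lines)

print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"')
print('        xmlns:xhtml="http://www.w3.org/1999/xhtml">')
for url in ALTERNATES.values():
    print(url_entry(url, ALTERNATES))
print('</urlset>')

In a real build you would loop this over every page on the site, which is exactly the repetition the sitemap method calls for.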
IN YOUR HEADERS AND HTML
Hreflang tags can also be added to the HTTP header:
Link: <http://www.example.com/english/>; rel="alternate"; hreflang="en"
Link: <http://www.example.com/deutsch/>; rel="alternate"; hreflang="de"
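If your pages are served by an application rather than as static files, the same headers can be attached in code. This is only a sketch under the assumption that you use Flask; the path-to-alternates mapping below is hypothetical example data, not anything from this thread:

from flask import Flask, request

app = Flask(__name__)

# Hypothetical mapping of each path to its language alternates.
HREFLANG = {
    "/english/": {"en": "http://www.example.com/english/",
                  "de": "http://www.example.com/deutsch/"},
    "/deutsch/": {"en": "http://www.example.com/english/",
                  "de": "http://www.example.com/deutsch/"},
}

@app.after_request
def add_hreflang_link_header(response):
    # Attach one Link header listing every alternate for the requested path.
    alternates = HREFLANG.get(request.path)
    if alternates:
        response.headers["Link"] = ", ".join(
            '<%s>; rel="alternate"; hreflang="%s"' % (href, lang)
            for lang, href in alternates.items()
        )
    return response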
Or with <link> elements in the <head> of the HTML:
<link rel="alternate" hreflang="en" href="http://www.example.com/english/" />
<link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/" />
And because you will be creating a new site, this guide to website migration may also help:
https://www.candidsky.com/blog/the-seo-2015-guide-to-website-migration/
It would also come down to your backlink profile. If it were me, I would use Moz Open Site Explorer, Majestic, Ahrefs, and Google Webmaster Tools to determine whether I would receive enough backlinks to support a subdomain or a separate TLD; otherwise, I would use a subfolder and an extremely fast way of hosting the site (Fastly is excellent, though there are many other great options as well).
Hope this helps,
Tom
P.S. Use http://hreflang.ninja/ to check your hreflang tags.
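If you'd rather spot-check a page from a script, a rough sketch like the following (standard-library Python, with example.com standing in for your own URL) will print the hreflang annotations it finds, so you can confirm every alternate links back to the others:

import urllib.request
from html.parser import HTMLParser

class HreflangParser(HTMLParser):
    # Collects (hreflang, href) pairs from <link rel="alternate"> tags.
    def __init__(self):
        super().__init__()
        self.alternates = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        attrs = dict(attrs)
        if attrs.get("rel") == "alternate" and "hreflang" in attrs:
            self.alternates.append((attrs["hreflang"], attrs.get("href")))

html = urllib.request.urlopen("http://www.example.com/english/").read()
parser = HreflangParser()
parser.feed(html.decode("utf-8", errors="replace"))
for lang, href in parser.alternates:
    print(lang, href)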
-
Hi Peter,
Both are viable options.
I'd highly recommend going through Aleyda Solis' international SEO posts here on the Moz blog. They can teach you how to prepare for international SEO, how to approach site structure, and how to generate the relevant code and hreflang tags.
- Here is her international SEO checklist
- Here is her hreflang blog post and generator tool
- And 40 tools to help advance your international SEO
They're great reading, and there's nothing I'd be able to add to them, so I hope this helps!