Hreflang question: how to use it correctly
-
Hello, I have a question: I want to know which option is best for implementing a multi-language site. We have a client whose website will have English and Spanish versions. Both languages have the same content, but for English we focus on the US and the UK, while Spanish is only for Spain. The question is which nomenclature is correct, or which values would be best.

**Option 1:**

**Option 2:**

Is either of the two options correct? Which would be the right one? Another question: if a German user is in Spain and does a search on Google Spain, what would be the best option to implement, /es-de/ or just /de/, and which one would rank better (assuming the first part is answered)? Regards and thanks.
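The two option snippets did not survive in this post, so for context only, here is a hedged sketch of what hreflang annotations for this setup (English for the US and the UK, Spanish for Spain) could look like. All URLs are placeholder assumptions, and this is not a reconstruction of either stripped option:

```html
<!-- Hypothetical sketch only; example.com and the folder names are placeholders. -->
<!-- Each page variant lists every language/region alternate, including itself. -->
<link rel="alternate" hreflang="en-us" href="http://example.com/en-us/" />
<link rel="alternate" hreflang="en-gb" href="http://example.com/en-gb/" />
<link rel="alternate" hreflang="es-es" href="http://example.com/es/" />
```

Note that the registered code for the United Kingdom is `gb`, not `uk`, which is a common tripwire with hreflang values.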
-
I would go with none of the above.
Your second option has "en" on the Spanish line by the way.
I would use this structure (and hreflang):
Also, once you're done, test your code with Flang. And don't forget to annotate your inner pages as well.
-
You'll want the shortest, most technically feasible URL structure that your website platform supports. Subfolders like http://example.com/es/ and http://example.com/en/ are ideal, and there is a lot of discussion of subdomains vs. subfolders on Moz. Keep in mind you also have an http://example.com landing page, so you'll either need to redirect users or have visitors select a language on that home page. There is pretty thorough documentation at https://support.google.com/webmasters/answer/189077?hl=en on how to use the hreflang attribute in each case.
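For the http://example.com landing page case, that documentation also describes an `x-default` value, which marks the page to serve users whose language doesn't match any listed variant (for example, a language-selector home page). A hedged sketch with placeholder URLs:

```html
<!-- Sketch: the root page acts as a language selector / fallback. -->
<!-- x-default tells search engines to show it when no hreflang value matches. -->
<link rel="alternate" hreflang="en" href="http://example.com/en/" />
<link rel="alternate" hreflang="es" href="http://example.com/es/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```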
I leave my default language at the base URL and put additional languages in subfolders. Statistically, I tend to rank higher for keywords in my default language than for translated keywords in the additional languages. You might want to target the market with the most traffic or conversions (whatever metric you prioritize) with the base URL and then add other languages as subfolders, preferably using short folder names like /es/ rather than hyphenated locale codes like /en-us/ or /en-gb/. This is more for your visitors than a particular ranking factor, but shorter URLs are preferable.
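If editing every page template is impractical, Google's documentation also allows the same hreflang annotations to live in the XML sitemap instead of the page `<head>`. A sketch assuming the layout described above (default language at the root, Spanish under /es/; all URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch: each <url> entry repeats the full set of alternates, including itself. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://example.com/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://example.com/"/>
    <xhtml:link rel="alternate" hreflang="es" href="http://example.com/es/"/>
  </url>
  <url>
    <loc>http://example.com/es/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://example.com/"/>
    <xhtml:link rel="alternate" hreflang="es" href="http://example.com/es/"/>
  </url>
</urlset>
```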
Related Questions
-
Using the same image across the site?
Hi, just wondering: I'm using the same image across 20 pages which are optimized for SEO purposes. Are there issues with this from an SEO standpoint? Will Google devalue the pages because the same image is being used? Cheers.
Intermediate & Advanced SEO | seowork2140
-
Strategies for best use of a competitor's expired domain
I recently bought an old competitor's expired domain that was ranking around page 2 or 3 on Google for most of the keywords I target. Curious as to the best strategy for utilizing this domain: 1. Set up some content with backlinks to my own domain. 2. Set up redirects pointing all of the competitor's old domain URLs to corresponding sections on my website. 3. Something else?
Intermediate & Advanced SEO | IsaCleanse
-
Original site content was used for submission to article directories
I had a communication problem with my writer, and she took original, unspun content and posted it to Unique Article Wizard. All UAW does is take each paragraph and mix them up. So I searched for a sentence from my site where the content came from and got back a bunch of results for that sentence, and my site wasn't the first result returned. I'm wondering how bad that is going to be for me. The links from UAW go back to an anchor layer that then links back to this site. Can anyone tell me if I need to rewrite the content on the original site? That is the only way I can think of to make this a non-issue. Thanks
Intermediate & Advanced SEO | mtking.us_gmail.com
-
Question about a multi-country/language site
Hi! We are building a site that is going to be available in several countries with the same language (Spanish), and we have some doubts about which is the best way to do it. Option 1) Subdomains, e.g. españa.mydomain.com, mexico.mydomain.com (the problem here is that link building with subdomains can be problematic). Option 2) Language folders, e.g. mydomain.com/es/es, mydomain.com/es/mx (the problem here is that the category in the URL gets pushed to third position, e.g. mydomain.com/es/es/category, which is not recommended for SEO). Option 3) Country domains, e.g. mydomain.es, mydomain.mx (the link building effort is much greater, because we would have to multiply the links needed to rank well across the different domains for each country). I am not sure which one is the best option; what do you think? The only thing I am sure of is to use the tag rel="alternate" hreflang="x" to avoid duplicate content, because the index and categories are going to be the same; the only thing that changes is the products in each country. Looking forward to your suggestions! Thanks, regards, Exequiel
Intermediate & Advanced SEO | SeoExpertos
-
Duplicate content: is it possible to write a page, delete it, and use it for a different site?
Hi, I have a simple question. Some time ago I built a site and added pages to it. I found out that the site was penalized by Google, and I have since neglected it. The problem is that I had written well-optimized pages on that site which I would like to use on another website. Thus my question: if I delete a page I had written on site 1, can I use it on site 2 without being penalized by Google for duplicate content? Please note: site 1 would still be online; I will simply delete some pages and use them on site 2. Thank you.
Intermediate & Advanced SEO | salvyy
-
Is This 301 Use Best Practice?
I know it's effective practice because we're getting our arse kicked. I'm curious whether it's best practice (white, gray, or black hat). I'm checking a competitor's link profile on a landing page that is hitting the top of page 1 for several keywords. This competitor (a national chain) has strong domain authority (69). The particular landing page I'm checking in OSE has two 301 redirects from its own site, among some other directory links to the page. The page shows 15 external links, and half of them are very strong, including its own 301s. Aren't they essentially sending their own juice to the landing page to bolster page/domain authority and rank higher in the SERPs for those keywords? Is this a common practice, using 301s to a landing page? Is it white, gray, or black hat? They suddenly appeared on the first page for several category keywords, so we're doing some snooping. Thanks.
Intermediate & Advanced SEO | AWCthreads
-
Will using a service such as Akamai impact rankings?
Howdy 🙂 My client has a .com site they are looking at hosting via Akamai. They have offices in various locations, e.g. the UK, US, AU, RU, and some Asian countries. If they used Akamai, would the best approach be to set up separate sites per country: .co.uk, .com, .com.au, .ru, .sg, etc.? My understanding is that Googlebot crawls from the US, so if it crawled any of those sites it would always get a US IP address. So is the answer perhaps to go with Akamai for the .com only, which would target the US market, and use different, separate C-class hosts for the others? Thanks! Woj
Intermediate & Advanced SEO | wojkwasi
-
Use rel=canonical to save otherwise squandered link juice?
Oftentimes my site has content which I'm not really interested in having included in search engine results. Examples might be a "view cart" or "checkout" page, or old products in the catalog that are no longer available in our system. In the past, I'd blocked those pages from being indexed by using robots.txt or nofollowed links. However, it seems like there is potential link juice being lost by removing these from search engine indexes. What if, instead of keeping these pages out of the index completely, I use rel=canonical to reference the home page (http://www.mydomain.com) of the business? That way, even if the pages I don't care about accumulate a few links around the Internet, I'll be capturing the link juice behind the scenes without impacting the customer experience as they browse our site. Is there any downside to doing this, or am I missing any potential reasons why this wouldn't work as expected?
Intermediate & Advanced SEO | cadenzajon