Subdirectory geo-targeting: Does targeting a language site to a single country affect rankings in other countries?
-
Hi all,
We have enabled a plugin that translates our site into subdirectories. We are going to geo-target certain language sites to their countries. For example, the Portuguese site website.com/pt/ will be targeted to rank in Portugal. I wonder what to do with language sites where the same language is spoken in multiple countries. For example, if we target the English site website.com to the US, will it affect the rankings in other English-speaking countries like the UK, Australia, Canada, etc.?
Thanks
-
A few warnings to start with:
1. Automated translation is a horrible user experience. I highly recommend you wait on translation until you are in a financial place to get human translators.
2. Equating country with language is a false association. Languages like English, Spanish, French, and Portuguese are each the primary language of multiple countries.
I recommend using hreflang markup to tell search engines that there are different language versions of your content. That is all that is necessary here. https://moz.com/learn/seo/hreflang-tag
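As a rough sketch of what that markup could look like for the scenario above (the URLs and subdirectory names here are illustrative assumptions, not your real paths), every language version of a page carries the same set of annotations:

```html
<!-- Hypothetical URLs for illustration; substitute your real paths.
     Place the same set of tags in the <head> of every version,
     and make sure each version lists itself as well. -->
<link rel="alternate" hreflang="en" href="https://website.com/" />
<link rel="alternate" hreflang="pt" href="https://website.com/pt/" />
<!-- Fallback for users whose language matches none of the above: -->
<link rel="alternate" hreflang="x-default" href="https://website.com/" />
```

Note that a plain language code like `en` leaves the page eligible to rank in every English-speaking country; a region-qualified code such as `en-us` or `en-gb` is only needed if you maintain genuinely country-specific versions.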
Related Questions
-
How long does Google take to crawl a single site?
Lately I have been thinking: when a crawler revisits an already indexed site, how long does its scan take?
Algorithm Updates | Sam09schulz
-
Should plural keyword variations get their own targeted pages?
I am in the middle of changing a website from targeting a single keyword on all pages to having each page target its own keyword/phrase. However, I'm a little conflicted on whether plural forms and other suffix (-ing) variations are different enough to get their own pages. SERPs show different results for each keyword searched. Also, relevancy reports score some of the keywords differently and some the same. Would it instead be best to use these as secondary and tertiary keywords on the same page as the main keyword? See the example below:

OPTION A (use each for a different page):
Page 1 - Construction Fence
Page 2 - Construction Fences
Page 3 - Construction Fencing
Page 4 - Construction Site Fence
Page 5 - Construction Site Fences
Page 6 - Construction Site Fencing
...

OPTION B (use as variations on the same page):
Page 1 - Construction Fence, Construction Fences, Construction Fencing
Page 2 - Construction Site Fence, Construction Site Fences, Construction Site Fencing
...

Any help is greatly appreciated. Thanks!
Algorithm Updates | pac-cooper
-
If Google doesn’t know we’re hosted in the UK, does that affect our SERPs?
Hi, In November 2011 our eCommerce website dropped from between 3rd and 4th position in the UK SERPs down to 7th and 8th. A year after this happened we still haven't moved back up to our original ranking, despite all our best efforts, and we're looking for a bit of insight into what could have happened. One of our theories is this; do you think it might be the problem?

In October 2011 we moved from a single-site custom-built CMS hosted in the UK to a multi-site custom-built CMS hosted on a much better server, also based in the UK. As part of this move we started using CloudFlare to help with security and performance (CloudFlare is a security CDN). Because CloudFlare's servers are in the US, to the outside world it almost looks like we went from a slow hosting company in the UK to a much quicker hosting company in the US. Could this have affected our rankings?

We know that Google takes the server IP address into account as a ranking factor, but as far as we understand it's because they (rightly) believe that a server closer to the user will perform better. So a UK server will serve up pages to a visitor in the UK more quickly than a US server, because the data has a shorter distance to travel. However, we're definitely not experiencing an issue with being recognised as a UK website. We have a .co.uk domain (which is obviously a big indicator), and if you click on "Pages from the UK" in the SERPs we jump up to 3rd place. So Google seems to know we're a UK site. Could the fact that we're using CloudFlare, and hence hiding our real server IP address, be penalising us in the SERPs?

Currently, of the 6 websites above us, 4 are in the US and 2 are in the UK. All of these are massive sites with lots of links, so smaller ranking factors might be more important for us. Obviously the big downside of not using CloudFlare is that our site becomes much less secure and much slower.

Images and some static content are distributed via a local CloudFlare server, which means it should tick Google's box in terms of providing a quick site for users. CloudFlare say in a blog post that they had Google crawl-rate and geo-tagging issues in the past when they were just starting out, but in 2010 they started working with "the big search engines" to make sure CloudFlare was treated like a CDN (so special rules that apply to Akamai also apply to CloudFlare). Since they've been working with Google, CloudFlare say that their customers will only see a positive SEO impact.

So at the moment we're at a loss about what happened to our ranking. Google say they take IPs into account for ranking, but by using CloudFlare it looks like we're in the US. We definitely know we're not having geo-tagging issues, and CloudFlare say they're working with Google to ensure their customers aren't seeing a negative impact, but a niggling part of us still wonders whether it could affect our SEO.

Many thanks, James
Algorithm Updates | OptiBacUK
-
Does the use of an underscore in filenames adversely affect SEO?
We had a page which until recently ranked first or second on Google UK, and also worldwide, for the term "Snowbee". It is now no longer in the top 50. I ran a page optimization report on the URL and got a very good score. The only criticism was that I had used an atypical character in the URL; the only unusual character was an underscore "_". We use the underscore in most file names without apparent problems with search engines. In fact they are automatically created in HTML files by our ecommerce software, and other pages do not seem to have been so adversely affected. Should we discontinue this practice? It will be difficult, but I'm sure we can overcome it if this is the reason why Google has marked us down. I attach images of the SEO report pages: 8fDPi.jpg AdLIn.jpg
Algorithm Updates | FFTCOUK
-
How much does posting product links to social media affect your ranking? Is there any use?
Google has Google Plus. Facebook has a partnership with Bing. How much does social media affect your ranking?
Algorithm Updates | rahijain
-
Does the browser type affect rankings?
This may be a rookie question, so apologies in advance if it is! A client of mine has asked why his site's rank is different when he searches for it from his iPhone versus his computer (where he uses IE), and also on Bing. Obviously I know there will be differences between Bing and Google, so I can explain that to him. But he seems to be implying that the different browsers are affecting the results on his iPhone and computer. I've tried this myself using Firefox and IE: on Firefox the site ranks on page 1, but on IE it ranks on page 3 (both using Google). Is this likely to be the browser having information about my past search habits, or is it actually the browser affecting the SERP? Again, sorry if this is a stupid question! Thanks in advance.
Algorithm Updates | WillCreate
-
How do you properly target locally with anchor text?
I'm trying to figure out the best method for external backlink anchor text to my site for local results. What would be the best way to phrase it for some local SERP love:
Cheeseburgers Chicago, IL
Cheeseburgers Chicago
Cheeseburgers Chicago Illinois
Algorithm Updates | Goetzman
-
Google said that low-quality pages on your site may affect rankings on other parts of the site
One of my sites got hit pretty hard during the latest Google update. It lost about 30-40% of its US traffic, and the future does not look bright considering that Google plans a worldwide roll-out. The problem is, my site is a six-year-old, heavily linked, popular WordPress blog. I do not know why the algorithm believes that it is low quality. The only reason I came up with is the statement that low-quality pages on a site may affect other pages (I think it was in the Wired article). If that is so, would you recommend blocking and de-indexing WordPress tag, archive and category pages from the Google index? Or would you suggest waiting a bit before doing something that drastic? Or do you have another idea of what I could do? I invite you to take a look at the site www.ghacks.net
Algorithm Updates | badabing