Ranking issues for UK vs US spelling - advice please
-
Hi guys,
I'm reaching out here for what may seem to be a very simple and obvious issue, but not something I can find a good answer for.
We have a .com site hosted in Germany that serves our worldwide audience. The site is in English, but our business language is British (UK) English.
This means that we rank very well for, e.g., "optimisation software", but for "optimization software" we are nowhere to be found.
The cause seems obvious to me: a robot reading those two phrases sees two distinct words. Nonetheless, having seen similar discussions around the use of plurals in keywords, it would seem to me that Google should have this sort of thing covered.
Am I right or wrong here?
If I'm wrong, then what are my options? I really don't want to have to make a copy of the entire site; apart from the additional effort involved in content upkeep, I see this path as fraught with duplicate content issues.
Any help is very much appreciated, thanks.
-
Hi Steven,
I'd have to agree with EGOL here - it is something that Google should have figured out - however, to some extent I think they have. For example, if I search for optimisation software here in the UK I get results for pages which are targeted to 'optimization' as well as 'optimisation'.
Whilst I'd guess that the UK spelling might not be helping you in this instance, I wonder whether your site's overall authority or strength might also be holding you back.
You've not said which site you're working on, but how does it stack up in terms of domain authority and page authority versus your SERP competition? It might be worth looking into that in the first instance.
The other alternative would be as EGOL highlighted - target 'optimization' instead.
It occurs to me that this might be worth testing - take your page as it is right now but switch out UK for US English and see if your rankings improve.
As you say, you could create a US-targeted subfolder on your site (I'd recommend implementing hreflang to avoid duplicate content issues) and try to tackle the issue that way - however, if your site isn't authoritative enough it likely still won't rank.
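As a rough sketch (the URLs and folder structure here are made up, not taken from your site), the hreflang annotations for a UK page and its US-targeted copy might look something like this, placed in the <head> of both pages:

```html
<!-- On https://www.example.com/optimisation-software/ (UK/default version) -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/optimisation-software/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/optimization-software/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/optimisation-software/" />

<!-- The US page at /us/optimization-software/ carries the same three tags,
     so the two versions reference each other in both directions - the
     annotations may be ignored if they aren't reciprocal. -->
```

Each version should also keep a self-referencing canonical tag; canonicalising the US page back to the UK one would generally cause Google to drop the US page rather than rank it.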
Hope this helps,
Hannah
-
Am I right or wrong here?
I agree. I think that Google has lots of problems here.
sulfur and sulphur have lots of problems
gray and grey have enormous problems
From what I see, proper nouns - the names of people, places and things - cause a lot of the problems, but there are lots of webpages that seem to be overlooked or unreasonably ranked.
Any help.....
I don't have any help. I can only say that I am gunning for American English usage as that is where I think I will get the most traffic - and that is where my website is hosted.
Related Questions
-
Advice on the right way to block country-specific users without blocking Googlebot - and without being seen to be cloaking. Help please!
Hi, I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
Example: the games are not allowed in the USA, but they are allowed in Canada.
Present situation: when a user from the USA visits the site, they are directed to a restricted-location page with the following message:
RESTRICTED LOCATION
Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!
Because USA visitors are blocked, Google - which primarily (but not always) crawls from the USA - is also blocked, so the company's webpages are not being crawled and indexed.
Objective / what we want to achieve: the website will have multiple region and language versions. Some of these will exist as standalone websites and others as folders on the domain. Examples below:
domain.com/en-ca [English, Canada]
domain.com/fr-ca [French, Canada]
domain.com/es-mx [Spanish, Mexico]
domain.com/pt-br [Portuguese, Brazil]
domain.co.in/hi [Hindi, India]
If a user from the USA or another restricted location tries to access our site, they should not have access but should instead see a restricted-access message. However, we still want Google to be able to access, crawl and index our pages. Can I ask how we do this without being penalised for cloaking? Would this approach be OK? (Please see below.)
We continue as we do now, showing visitors from the USA a restricted message. However, rather than redirecting these visitors to a restricted-location page, we simply black out the page and show them a floating message as if it were a modal window, while Googlebot would still be allowed to visit and crawl the website. I have also read that it would be good to put paywall schema on each webpage, to let Google know that we are not cloaking and that it is a restricted paid page. All public pages would be accessible, but only if the visitor is from a location that is not restricted.
Any feedback and direction would be greatly appreciated, as I am new to this angle of SEO. Sincere thanks.
International SEO | MarkCanning
-
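For the paywall markup mentioned in the question above, Google's structured data for paywalled ("flexible sampling") content is the usual reference point. A rough, hypothetical sketch for a page whose restricted content sits in a container with a made-up class of restricted-content:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".restricted-content"
  }
}
</script>
```

Whether paywall markup is the right fit for a legal geo-restriction rather than a genuine paywall is a separate question for Google's guidelines; the sketch above only shows what the markup itself looks like.
-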
ccTLD vs subfolder for international SEO
In what situations is a subfolder better than a ccTLD, and vice versa?
International SEO | MedicalSEOMarketing
-
International SEO setup issues: canonical URLs
My site is www.grocare.com for one region and in.grocare.com for another region. Both of them have the same content except for the currency shown in each region. Someone told me that Google will treat the content as duplicate and not rank either. I have set up hreflang and targeted different regions for both in Search Console. I have read many articles which say canonical URLs need to be set up for international SEO sites, but I'm not sure how to set up canonical URLs or whether they are the right way to go. I just don't want my content to be deranked. I have now set up hreflang properly after asking the Moz community itself, so I'm hoping to get some help with this query too. TIA
International SEO | grocare
-
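For what it's worth, a common pattern for the setup described above is a self-referencing canonical on each regional version plus reciprocal hreflang annotations. A rough sketch - the region codes are assumptions, since the question doesn't say which regions the two sites target:

```html
<!-- In the <head> of https://www.grocare.com/ -->
<link rel="canonical" href="https://www.grocare.com/" />
<link rel="alternate" hreflang="en" href="https://www.grocare.com/" />
<link rel="alternate" hreflang="en-in" href="https://in.grocare.com/" />

<!-- In the <head> of https://in.grocare.com/ -->
<link rel="canonical" href="https://in.grocare.com/" />
<link rel="alternate" hreflang="en" href="https://www.grocare.com/" />
<link rel="alternate" hreflang="en-in" href="https://in.grocare.com/" />
```

The key point is that each version's canonical points to itself; cross-canonicalising one domain to the other suggests the other version shouldn't be indexed, which defeats the hreflang setup.
-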
How do hreflang attributes affect ranking?
We have a site in English. We are considering translating the site into Dutch. If we use a hreflang attribute does that mean we have to create a duplicate page in Dutch for each English page, or does Google auto-translate? How would duplicate pages, even if they are in a different language, affect ranking?
International SEO | Substance-create
-
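On the auto-translate point in the question above: Google won't generate the Dutch pages - hreflang only maps URLs that already exist, so each English page needs a real Dutch counterpart. A minimal, hypothetical sketch for one such pair (example.com and the paths are placeholders):

```html
<!-- Placed in the <head> of both https://www.example.com/pricing/ (English)
     and https://www.example.com/nl/prijzen/ (Dutch); the annotations must
     be reciprocal or they may be ignored. -->
<link rel="alternate" hreflang="en" href="https://www.example.com/pricing/" />
<link rel="alternate" hreflang="nl" href="https://www.example.com/nl/prijzen/" />
```

Properly translated pages are not treated as duplicate content; if maintaining head tags is awkward, the same mappings can be declared in an XML sitemap instead.
-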
International SEO - targeting US and UK markets
Hi folks, I have a client who is based in Italy and they have set up a site that sells travel experiences in the south of Italy (the site currently sits on a server in Italy). The site has been set up on a gTLD: www.example.com. They only want to target the US and UK markets to promote their travel experiences, and the site only has an English version (it does not currently offer an Italian version). If they decide to stay with the gTLD rather than change to a ccTLD (which would be ideal from my point of view), what are the steps to set this up correctly in GSC? They currently have only one property registered in GSC (www.example.com), so I guess the next steps are:
Add a new property - www.example.com/uk - and set up geotargeting for the UK.
For the existing property - www.example.com/ - set up geotargeting for the US.
If the client does not have the budget to optimise the content for American and British English, would it still make sense to have two separate properties in GSC (example.com for the US market and example.com/uk for the UK market)?
One further consideration: should we add canonical tags to avoid duplicate content across the two versions of the site (in the event there is no budget to adapt the content for the US and UK markets)?
Thank you all in advance for looking into this. David
International SEO | Davide1984
-
Issues with Baidu indexing
I have a few issues with one of my sites being indexed in Baidu and I'm not too sure how to resolve them:
1. Two subdomains were redirected to the root domain, but both (www. and another) subdomains are still indexed after ~4 months.
2. A development subdomain is indexed, despite no longer working (it was taken down a few months back).
3. There's conflicting information on the best approach to get HTTPS pages indexed in Baidu, and we can't find a good solution.
4. There are hundreds of variations of the home page (and a few other pages) on the main site, where Baidu has indexed lots of parameters. There doesn't appear to be anywhere in their webmaster tools to stop that happening, unlike with Google.
I'm not the one who deals directly with this site, but I believe that Baidu's equivalent of Webmaster Tools has been used where possible to correctly index the site. Has anyone else had similar issues and, if so, were you able to resolve them? Thanks
International SEO | jobhuntinghq
-
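On point 4 above, a commonly suggested starting point is a self-referencing canonical tag served on the home page (and therefore on each parameterised variant of it), so the variants all point back at the clean URL - although Baidu has historically treated rel="canonical" more as a hint than a directive, so treat this as a partial fix at best. A sketch with a placeholder URL:

```html
<!-- Served on https://www.example.com/ and on parameterised variants
     such as https://www.example.com/?utm_source=foo -->
<link rel="canonical" href="https://www.example.com/" />
```
-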
Google US vs Google UK
I could have posted this somewhere else, but I cannot find the right place. So, I have keywords that rank well in Google US and many that do well in Google UK too. I thought all of my keywords ranking well in the US would also rank well in the UK, but I have figured out today that this is not the case. Why would I rank in the top 3 in the US and not even show up in the top 50 in the UK? It is very strange. Thanks for your help! I am not super new to SEO or web business; I have had a very good company that has been ranking well since 2004.
International SEO | journeybeyondtravel
-
IP Redirection vs. cloaking: no clear directives from Google
Hi there, here is our situation: we need to force an IP redirection for our US users to www.domain.com, and at the same time we have different country-specific subfolders in their own language, such as www.domain.com/fr. Our fear is that by forcing an IP redirection for US IPs, we will prevent Googlebot (which has a US IP) from crawling our country-specific subfolders. I didn't find any clear directives from Google representatives on that matter. In this video Matt Cutts says it's always better to show Googlebot the same content as your users: http://www.youtube.com/watch?v=GFf1gwr6HJw&noredirect=1, but on the other hand in another video he says "Google basically crawls from one IP address range worldwide because (they) have one index worldwide. (They) don't build different indices, one for each country". This seems like a contradiction to me... Thank you for your help!! Matteo
International SEO | H-FARM