Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
US domain pages showing up in Google UK SERP
-
Hi,
Our website, which was predominantly for the UK market, was set up on a .com extension, and only two years ago were other domains added: US (.us), IE (.ie), EU (.eu) & AU (.com.au).
Last July we noticed that a few .us domain URLs were showing up in UK SERPs. We realised the sitemap for the .us site was incorrectly referring to the UK (.com) site, so we corrected that and the .us URLs stopped appearing in the SERPs. I'm not sure whether this actually fixed the issue or whether it was coincidental.
However, in the last couple of weeks more than three .us domain URLs have been showing for each brand search made on Google UK, and sometimes they replace the .com results altogether. I have double-checked the PA for the US pages; they are far below the UK ones.
Has anyone noticed similar behaviour, and/or could anyone please help me troubleshoot this issue?
Thanks in advance,
R
-
As your own agency said, I also believe that once hreflang is implemented, this kind of issue should stop.
Regarding the sitemap error, it was certainly something that could have been confusing Google about which site to target.
However, I see that you also have a .eu domain name...
I imagine that domain is meant to target the European market, and I suspect it is in English.
If so, remember:
- In countries like Spain, France, Germany, Italy... people don't search the internet in English, but in Spanish, French, German, Italian... Therefore, that .eu domain is not going to deliver the results you may be looking for;
- The .eu extension is a generic one and cannot be geotargeted via Google Search Console. This means that, by default, it targets the whole world, so you will probably see visits from English-speaking users in countries like South Africa, the UK, IE, Australia, New Zealand or India, where English is the main language or one of the official ones;
- When it comes to domains like .eu, it is always hard to decide how to implement hreflang. In your specific case, as you are targeting the UK, US, AU and IE with specific domain names, the ideal would be to implement this hreflang annotation for the .eu (the example is only for the home page):
<link rel="alternate" href="http://www.domain.eu" hreflang="x-default" />
<link rel="alternate" href="http://www.domain.eu" hreflang="en" />
<link rel="alternate" href="http://www.domain.com" hreflang="en-GB" />
<link rel="alternate" href="http://www.domain.us" hreflang="en-US" />
<link rel="alternate" href="http://www.domain.com.au" hreflang="en-AU" />
With those annotations, you are telling Google to show the .com to users in Great Britain, the .us to users in the United States, the .com.au to Australian users, and the .eu to users searching in English in any other country.
That means your .eu site will also target users in other European countries, both those searching in English (hreflang="en") and those searching in other languages (hreflang="x-default").
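By the way, the same annotations can also be declared in your XML sitemap instead of the <head>; a minimal sketch using the same hypothetical domain names (only the .eu home page entry is shown):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.domain.eu</loc>
    <!-- Every alternate version is listed against this URL -->
    <xhtml:link rel="alternate" hreflang="x-default" href="http://www.domain.eu" />
    <xhtml:link rel="alternate" hreflang="en" href="http://www.domain.eu" />
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.domain.com" />
    <xhtml:link rel="alternate" hreflang="en-US" href="http://www.domain.us" />
    <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.domain.com.au" />
  </url>
</urlset>
```

Note that in a complete sitemap each domain's URL would need its own <url> entry repeating the full set of alternates, since hreflang annotations must be reciprocal.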
2 notes about the hreflang="x-default":
-
People living in the UK and searching in Spanish will see the .eu domain name, because it is the default domain name for searches in every language other than English in GB, IE, AU and the US;
-
Again, even if you intend the .eu domain to target only European countries, that is impossible, because the .eu extension doesn't have any geotargeting power (and regions like Europe or Asia cannot be geotargeted via GSC). So it is normal to also see visits from countries on other continents.
-
You're very welcome. Either way I'd be interested to see how this one progresses.
-
Hi Chris,
Thanks for your quick response and detailing out this well.
I have looked back and noticed that this occurs almost every six months: the US domain URLs pop up in the UK SERPs for about two weeks and then disappear. We are yet to implement the hreflang tags on the site, and our SEO agency confirms that this should fix the issue.
Will keep this thread updated on the outcome.
Cheers,
RG
-
Whether or not this is an issue depends somewhat on what your product or service is. If you provide a local-only service, like a restaurant, then your US site ranking in the UK would be unusual.
On the other hand, if you sell a physical product this may not be so unusual. For example, here in Australia we're quite limited when it comes to finding men's online clothing stores; most of it comes from the US or the UK, so it's not uncommon to see something like the US Jackthreads show up in the SERPs here.
Since you do have separate domains for each location, this might be an indication that search engines aren't really understanding the different jurisdictions for each site; maybe they're not geo-targeted enough for the algorithm to recognise that each site serves a unique area.
Some of the elements that can help define this, in no particular order:
- Server location
- HTML language attribute (e.g. lang="en-US")
- Regional language differences (e.g. US spelling vs UK)
- Location markup - on your location pages at the very least
- Location mentions throughout your content
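To make a couple of those concrete, here's a rough sketch of what the language declaration and location markup might look like in the <head> of the US site - the business name and address are placeholders, not a definitive implementation:

```html
<!DOCTYPE html>
<html lang="en-US"> <!-- regional HTML language declaration -->
<head>
  <title>Example Store | US</title>
  <!-- Location markup: a minimal schema.org LocalBusiness example -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Store",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "New York",
      "addressCountry": "US"
    }
  }
  </script>
</head>
```

Each regional site would carry its own lang value and its own address, which gives the algorithm an explicit, machine-readable signal about which area that site serves.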
While not specifically on-topic, Rand's Whiteboard Friday about scaling geo-targeting offers plenty of great advice that can be applied here as well.