Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
US domain pages showing up in Google UK SERP
-
Hi,
Our website, which was predominantly for the UK market, was set up on a .com extension; only two years ago were other domains added: US (.us), IE (.ie), EU (.eu) and AU (.com.au).
Last July, we noticed that a few .us URLs were showing up in UK SERPs. We realised the sitemap for the .us site was incorrectly referencing the UK (.com) site, so we corrected that and the .us URLs stopped appearing in the SERPs. I'm not sure whether this actually fixed the issue or it was coincidental.
However, in the last couple of weeks, more than three .us URLs have been showing for each brand search made on Google UK, and sometimes they replace the .com results altogether. I have double-checked the Page Authority of the US pages; it is far below that of the UK ones.
Has anyone noticed similar behaviour, and/or could anyone please help me troubleshoot this issue?
Thanks in advance,
R
-
As your own agency said, I also believe that once hreflang is implemented, this kind of issue should stop.
Regarding the sitemap error: it was certainly something that could have confused Google about which site to target.
However, I see that you also have a .eu domain name.
I imagine that domain is meant to target the European market, and I suspect it is in English.
If so, remember:
- In countries like Spain, France, Germany and Italy, people don't search the web in English but in Spanish, French, German and Italian. That .eu domain is therefore not going to deliver the results you may be looking for;
- The .eu extension is treated as generic and cannot be geotargeted via Google Search Console. This means that, by default, it targets the whole world, so you will probably see visits from English-speaking users in countries like South Africa, the UK, Ireland, Australia, New Zealand and India, where English is the main language or one of the official ones;
- When it comes to domains like .eu, it is always hard to decide how to implement hreflang. In your specific case, since you are targeting the UK, US, AU and IE with specific domain names, the ideal would be to implement the following hreflang annotations for the .eu site (the example is only for the home page):
<link rel="alternate" href="http://www.domain.eu" hreflang="x-default" />
<link rel="alternate" href="http://www.domain.eu" hreflang="en" />
<link rel="alternate" href="http://www.domain.com" hreflang="en-GB" />
<link rel="alternate" href="http://www.domain.us" hreflang="en-US" />
<link rel="alternate" href="http://www.domain.com.au" hreflang="en-AU" />
With those annotations, you are telling Google to show the .com to users in Great Britain, the .us to users in the United States, the .com.au to Australian users, and the .eu to everyone else searching in English in any other country.
That means your .eu site will also reach users in other European countries, both those searching in English (hreflang="en") and those searching in other languages (hreflang="x-default").
Two notes about hreflang="x-default":
- People living in the UK and searching in Spanish will see the .eu domain name, because it is the default domain for searches in every language other than English in GB, IE, AU and US;
- Again, even if you intend the .eu domain to target only European countries, that is impossible, because the .eu extension has no geotargeting power (and regions like Europe or Asia cannot be geotargeted via GSC). So it will be normal to see visits from countries on other continents as well.
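For reference, the same hreflang annotations can also be declared in the XML sitemap instead of the page head, using the sitemaps xhtml:link extension. A minimal sketch for the home page, reusing the placeholder domain names from the example above, might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.domain.eu/</loc>
    <!-- Every URL entry must list all alternates, including itself -->
    <xhtml:link rel="alternate" hreflang="x-default" href="http://www.domain.eu/" />
    <xhtml:link rel="alternate" hreflang="en" href="http://www.domain.eu/" />
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.domain.com/" />
    <xhtml:link rel="alternate" hreflang="en-US" href="http://www.domain.us/" />
    <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.domain.com.au/" />
  </url>
</urlset>
```

The sitemap approach is often easier to maintain at scale, since the annotations live in one file rather than in every page template; either method works, but each page should use only one of them.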
-
You're very welcome. Either way, I'd be interested to see how this one progresses.
-
Hi Chris,
Thanks for your quick response and for detailing this out so well.
I have looked back and noticed that this occurs roughly every six months: the US URLs pop up in the UK SERPs for about two weeks and then disappear. We are yet to implement the hreflang tags on the site, and our SEO agency confirms that this should fix the issue.
Will keep this thread updated on the outcome.
Cheers,
RG
-
Whether or not this is an issue kind of depends on what your product or service is. If you provide a local-only service like a restaurant then your US site ranking in the UK would be unusual.
On the other hand, if you sell a physical product this may not be so unusual. For example, here in Australia we're quite limited when it comes to finding men's online clothing stores; most of it comes from the US or the UK, so it's not uncommon to see something like the US-based Jackthreads show up in the SERPs here.
Since you do have separate domains for each location, this might be an indication that search engines aren't really understanding the different jurisdictions for each site; maybe they're not geo-targeted strongly enough for the algorithm to comprehend that each site serves a unique area.
Some of the elements that can help define this, in no particular order:
- Server location
- HTML language attribute (e.g. lang="en-US")
- Regional language differences (e.g. US vs UK spelling)
- Location markup - on your location pages at the very least
- Location mentions throughout your content
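As a rough illustration of how a few of those signals fit together (the business name, address and copy below are placeholders, not the poster's actual site), a US-targeted page might look like this:

```html
<!DOCTYPE html>
<!-- Declare the page's language and region up front -->
<html lang="en-US">
<head>
  <meta charset="utf-8">
  <title>Example Store | New York, NY</title>
  <!-- Location markup via schema.org JSON-LD on a location page -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Store",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "New York",
      "addressRegion": "NY",
      "addressCountry": "US"
    }
  }
  </script>
</head>
<body>
  <!-- Regional spelling ("favorite", "color") and explicit location
       mentions reinforce which market the page serves -->
  <p>Shop our favorite colors, shipped anywhere in the United States.</p>
</body>
</html>
```

None of these signals is decisive on its own; the point is consistency, so that every signal on the .us pages points at the US and every signal on the .com pages points at the UK.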
While not specifically on-topic, Rand's Whiteboard Friday about scaling geo-targeting offers plenty of great advice that can be applied here as well.