US domain pages showing up in Google UK SERP
-
Hi,
Our website, which was predominantly for the UK market, was set up on a .com domain; only two years ago were other domains added: US (.us), IE (.ie), EU (.eu) and AU (.com.au).
Last July we noticed that a few .us URLs were showing up in UK SERPs. We realised the sitemap for the .us site was incorrectly referencing the UK (.com) site, so we corrected that and the .us URLs stopped appearing in the SERPs. I'm not sure whether this actually fixed the issue or whether it was a coincidence.
However, in the last couple of weeks more than three .us URLs have been showing for every brand search made on Google UK, and sometimes they replace the .com results altogether. I have double-checked the Page Authority (PA) of the US pages; it is far below that of the UK ones.
Has anyone noticed similar behaviour, and/or could anyone please help me troubleshoot this issue?
Thanks in advance,
R
-
As your agency said, I also believe that once hreflang is implemented, this kind of issue should stop.
Regarding the sitemap error: that was certainly something that could have confused Google about which site to target.
However, I see that you also have an .eu domain name...
I imagine that domain is meant to target the European market, and I suspect it is in English.
If it is so, remember:
- In countries like Spain, France, Germany, Italy... people don't search the web in English, but in Spanish, French, German, Italian... Therefore, that .eu domain is not going to deliver the results you may be looking for;
- The .eu extension is a generic TLD and cannot be geotargeted via Google Search Console. This means that, by default, it targets the whole world; hence you will probably see visits from English-speaking users in countries like South Africa, the UK, Ireland, Australia, New Zealand or India, where English is the main language or one of the official ones;
- When it comes to domains like .eu, it is always hard to decide how to implement hreflang. In your specific case, since you are targeting the UK, US, AU and IE with country-specific domains, the ideal would be to implement the following hreflang annotations for the .eu site (the example is for the home page only):
<link rel="alternate" href="http://www.domain.eu" hreflang="x-default" />
<link rel="alternate" href="http://www.domain.eu" hreflang="en" />
<link rel="alternate" href="http://www.domain.com" hreflang="en-GB" />
<link rel="alternate" href="http://www.domain.us" hreflang="en-US" />
<link rel="alternate" href="http://www.domain.com.au" hreflang="en-AU" />
With those annotations, you are telling Google to show the .com to users in Great Britain, the .us to users in the United States, the .com.au to Australian users, and the .eu to all other users searching in English in any other country.
That means your .eu site will also target users in other European countries, both those searching in English (hreflang="en") and those searching in other languages (hreflang="x-default").
Two notes about hreflang="x-default":
- People living in the UK and searching in Spanish will be shown the .eu domain, because it is the default for searches in any language other than English in GB, IE, AU and the US;
- Again, even if you intend the .eu domain to target only European countries, that is impossible: the .eu extension has no geotargeting power (and regions like Europe or Asia cannot be geotargeted via GSC), so it is normal to also see visits from countries on other continents.
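Since a sitemap misconfiguration was part of the original problem, it is worth noting that the same annotations can also be declared in each domain's XML sitemap instead of the page head. A minimal sketch for the .eu home page, reusing the placeholder domain names from the example above (every domain's sitemap would need the same full set of alternates, for every page):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.domain.eu/</loc>
    <!-- The alternate set must be identical and reciprocal across all domains -->
    <xhtml:link rel="alternate" hreflang="x-default" href="http://www.domain.eu/" />
    <xhtml:link rel="alternate" hreflang="en" href="http://www.domain.eu/" />
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.domain.com/" />
    <xhtml:link rel="alternate" hreflang="en-US" href="http://www.domain.us/" />
    <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.domain.com.au/" />
  </url>
</urlset>
```

Either method works; just pick one (head tags or sitemap) and use it consistently, because this keeps the hreflang data out of the page templates and makes it easier to audit against the sitemap errors you already found.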
-
You're very welcome. Either way I'd be interested to see how this one progresses.
-
Hi Chris,
Thanks for your quick response and for detailing this out so well.
I have looked back through our data and noticed that this occurs almost every six months: the US URLs pop up in the UK SERPs for about two weeks and then disappear. We are yet to implement the hreflang tags on site, and our SEO agency confirms that this should fix the issue.
Will keep this thread updated on the outcome.
Cheers,
RG
-
Whether or not this is an issue kind of depends on what your product or service is. If you provide a local-only service like a restaurant then your US site ranking in the UK would be unusual.
On the other hand, if you sell a physical product this may not be so unusual. For example, here in Australia we're quite limited when it comes to finding men's online clothing stores, most of it comes from the US or the UK so it's not uncommon to see something like the US Jackthreads show up in the SERPs here.
Since you do have separate domains for each location, this might be an indication that search engines aren't really understanding the different jurisdictions of each site; maybe they're not geo-targeted enough for the algorithm to comprehend that each of the 3 sites serves a unique area.
Some of the elements that can help define this, in no particular order:
- Server location
- HTML language attribute (e.g. lang="en-US")
- Regional language differences (e.g. US spelling vs UK)
- Location markup - on your location pages at the very least
- Location mentions throughout your content
While not specifically on-topic, Rand's Whiteboard Friday about scaling geo-targeting offers plenty of great advice that can be applied here as well.
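A couple of the signals from the list above can be sketched in markup. This is only an illustration using placeholder names (not the poster's actual pages), showing the HTML language attribute plus schema.org location markup as it might appear on a US location page:

```html
<!DOCTYPE html>
<!-- Language attribute signals the page's language/region variant -->
<html lang="en-US">
<head>
  <meta charset="utf-8">
  <title>Example Brand - US Store</title>
</head>
<body>
  <!-- Location markup: schema.org microdata on a contact/location page -->
  <div itemscope itemtype="https://schema.org/Organization">
    <span itemprop="name">Example Brand US</span>
    <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
      <span itemprop="streetAddress">123 Example Ave</span>,
      <span itemprop="addressLocality">New York</span>,
      <span itemprop="addressRegion">NY</span>
      <span itemprop="addressCountry">US</span>
    </div>
  </div>
</body>
</html>
```

On its own none of these signals is decisive, but together with consistent regional spelling and location mentions in the copy they make the intended market of each domain much easier for the algorithm to infer.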