Google UK picking up USA Site
-
I have a site with two subfolders: one is .../uk and one is .../us.
Part of the content on the two sites is the same and part is unique. The US site's language is set to en and the UK site's language is set to en_gb. I have set up geo-targeting in Webmaster Tools.
The problem is that the home page is a geo-IP redirect, and Google seems to be picking up information from the US site even on google.co.uk.
I'm not too concerned about getting the UK site crawled, as we submit a sitemap for that anyway.
But my concern is: if I set up the geo-IP redirect as a 301, will my UK site lose all of its ranking?
Also, am I likely to be penalised for duplicate content?
-
Super answers, very helpful. Thank you!
-
Hi Matthew,
The hreflang tag which SEOBrent mentioned below may help here, if your content is duplicate.
In terms of implementation you need to place the tag on all versions of the page -
So on http://yourdomain.com/us/page-whatever you'd need:
<link rel="alternate" hreflang="en-GB" href="http://yourdomain.com/uk/page-whatever" />
<link rel="canonical" href="http://yourdomain.com/us/page-whatever" />
And on http://yourdomain.com/uk/page-whatever you'd need:
<link rel="canonical" href="http://yourdomain.com/us/page-whatever" />
Please also note, you need to pick one canonical version. I've picked the US as I've assumed you get more US traffic; obviously, if you get more UK traffic then that should probably be the canonical version. See here: http://www.google.com/support/forum/p/Webmasters/thread?tid=5299f52953459957&hl=en&fid=5299f529534599570004b9a18f1d46fe
Re your geo-IP redirect: this isn't an approach I favour, as Google normally crawls from a US IP and it can cause indexation problems. I prefer something like what cheapflights.com implements: if you visit cheapflights.com from a UK IP you are pushed to this international choice page - http://www.cheapflights.com/workers/profile-select.aspx?sref=CFUK&redirect=GeoIP&geoip=GB&cfref=CFUS&spt=Home&rp=/
This allows users to select the appropriate site, and allows both versions to be crawled.
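That "offer a choice, never force-redirect" pattern can be sketched in a few lines. Everything below is hypothetical (the helper name, bot list, and country lookup are assumptions; a real site would resolve the visitor's country from a GeoIP database), but it shows the two crawl-friendly rules: bots are never redirected at all, and human visitors only get a suggestion.

```python
# Sketch of the "suggest, don't redirect" pattern (all names here are
# hypothetical; a real site would resolve country_code via a GeoIP lookup).

BOT_TOKENS = ("googlebot", "bingbot", "slurp", "duckduckbot")

REGIONAL_HOME = {
    "GB": "/uk/",
    "US": "/us/",
}

def region_banner(country_code, user_agent, current_path):
    """Return a URL to *suggest* in a banner, or None.

    Crawlers get no suggestion, and human visitors are never
    force-redirected, so both /uk/ and /us/ stay fully crawlable.
    """
    ua = (user_agent or "").lower()
    if any(token in ua for token in BOT_TOKENS):
        return None  # let bots index whichever version they requested
    target = REGIONAL_HOME.get(country_code)
    if target is None or current_path.startswith(target):
        return None  # unknown country, or already on the matching site
    return target
```

Because the response to a bot and to a user on the "wrong" site is the same 200 page (plus or minus a banner), there is no 301 to worry about losing rankings over.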
I hope this helps,
Hannah
-
One of the things mentioned at SMX this week was the new "alternate" "hreflang" tag.
Put these tags in the head of your US site:
<link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-US" href="http://www.example.com/" />
This will tell the engines that, yes, you have two "duplicate" pages, but this is the correct language version for each.
More on alt hreflang here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
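Because every version of a page needs the same reciprocal set of annotations (including a self-referencing tag), it can be easier to generate them than to hand-write each one. A minimal sketch, reusing the placeholder URLs above:

```python
# Minimal sketch: generate the reciprocal hreflang tag set for one page
# (the example.com / example.co.uk URLs are placeholders).

def hreflang_tags(versions):
    """versions maps an hreflang code to the absolute URL of that variant.

    The same full set of tags -- including the self-referencing one --
    belongs on *every* version of the page, so the annotations stay
    reciprocal.
    """
    return [
        '<link rel="alternate" hreflang="%s" href="%s" />' % (code, url)
        for code, url in sorted(versions.items())
    ]

tags = hreflang_tags({
    "en-US": "http://www.example.com/",
    "en-GB": "http://www.example.co.uk/",
})
for tag in tags:
    print(tag)
```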
Related Questions
-
Multilang site: Auto redirect 301 or 302?
We need to establish whether a 301 or a 302 response code should be used for our auto redirects based on the Accept-Language header:
https://domain.com > 30x > https://domain.com/en
https://domain.com > 30x > https://domain.com/ru
https://domain.com > 30x > https://domain.com/de
The site architecture is set up with proper inline hreflang.
We have read different opinions about this; Ahrefs says 302 is the correct one:
https://ahrefs.com/blog/301-vs-302-redirects/
302 redirect: "You want to redirect users to the right version of the site for them (based on location/language)."
You could argue that the root redirect is never permanent, as it varies based on user language settings (302).
On the other hand, the language-specific redirects are permanent per language:
IF Accept-Language header = en: https://domain.com > 301 > https://domain.com/en
IF Accept-Language header = ru: https://domain.com > 301 > https://domain.com/ru
So each of these is 'permanent'. Which one is correct?
International SEO | fJ66doneOIdDpj
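Whichever status code wins, the target-selection step described in the question is the same: parse Accept-Language and map the best-supported primary subtag to a locale path. A rough sketch of that step (the function name and the /en fallback are assumptions, and the q-value parsing is simplified):

```python
# Rough sketch of choosing the root-URL redirect target from the
# Accept-Language header (function name and /en fallback are assumptions).

SUPPORTED = {"en": "/en", "ru": "/ru", "de": "/de"}

def redirect_target(accept_language, default="/en"):
    """Map an Accept-Language header to the locale path the root should 30x to."""
    candidates = []
    for position, part in enumerate(accept_language.split(",")):
        part = part.strip()
        if not part:
            continue
        lang, _, q = part.partition(";q=")
        try:
            weight = float(q) if q else 1.0  # no q-value means q=1.0
        except ValueError:
            weight = 0.0
        # keep only the primary subtag: "en-GB" -> "en"
        candidates.append((-weight, position, lang.split("-")[0].strip().lower()))
    # highest q-value first, header order as the tiebreak
    for _, _, lang in sorted(candidates):
        if lang in SUPPORTED:
            return SUPPORTED[lang]
    return default
```

The 301-vs-302 decision then only changes the status code attached to the chosen target; the selection logic is identical either way.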
Why Doesn't Google Use My Title Tag and Meta Description?
Hi fellow Moz SEOs, I need your urgent help! We set optimised titles & meta descriptions for our client websites, and the titles are approved by our clients. A few days ago they checked on Google and noticed that the title & meta description shown were not the same, and they notified me about this issue. The title & meta description look fine when I check the source code. So why does Google use a different title & meta description? For example:
Title approved by client: Top Specialist Divorce & Family Lawyer - Yeo & Associates LLC
Title set by Google: Yeo & Associates LLC: Top Specialist Divorce & Family Lawyer
Title approved by client: Filing For Divorce Online in Singapore | DivorceBureau®
Title set by Google: DivorceBureau®: Filing For Divorce Online in Singapore
Title approved by client: Halal Buffet & Bento/Packet Meals Event Caterer Singapore | Foodtalks
Title set by Google: Foodtalks - Halal Buffet & Bento/Packet Meals Event Caterer Singapore
Title approved by client: Child Care Centre in Singapore | Top Preschool | Carpe Diem
Title set by Google: Carpe Diem: Child care Centre in Singapore | Top Preschool
Every day they request me to update Google's title to their approved title, and they ask:
Why did this happen?
Why didn't Google use their recommended title?
Is there any way to set our approved titles?
Please help me find the solution ASAP. Thanks in advance!
International SEO | Verz
International Targeting: What Does Google Consider an Equivalent Page?
Hi All, We are working with an international brand that owns several domains across the EU and in North America. Our team is in the process of setting up international targeting using sitemaps to indicate alternate language pages. This is being done to prevent North American pages from being served in the UK, Spain pages from being served in Portugal, or any other combination of possibilities. Currently we are mapping duplicate or "equivalent" pages and defining them as rel="alternate" in their respective sitemaps. The problem is, it's not always explicitly clear what Google considers "equivalent":
1. URL structures vary by domain,
2. in most cases the content is similar (but unique),
3. the landing page templates vary in design and functionality,
4. and lastly, services often contain nuances that make them slightly different from one another (Professional Liability Insurance vs Professional Indemnity Insurance).
All things considered, these pages offer the same service but differ in the ways above.
Q: Is it appropriate to use these attributes to serve the correct language / regional URL to searchers?
Q: Is there a rule of thumb on what should be considered an "equivalent" page?
Thanks All, Paul
International SEO | MetaPaul
What's the difference between 'en-gb' and 'en-uk' when choosing search engines in campaign set up?
Hi, What's the difference search-engine-wise, and which one should I choose? I presume GB, since it covers the entire British landmass, whereas UK excludes Ireland according to the political definition. Is it the same according to Google (and other engines)?
All Best, Dan
International SEO | Dan-Lawrence
Redirect the main site to keyword-rich subfolder / specific page for SEO
Hi,
I have two questions.
Question 1: Is it worthwhile to redirect the main site to a keyword-rich subfolder / specific page for SEO? For example, my company's webpage is www.example.com. Would it make sense to redirect the main site to the address www.example.com/service-one-in-certain-city? I am asking this as I have learned that it is important for SEO to have keywords in the URL, and I was thinking that we could do this and include the most important keywords in the subfolder / specific URL. What are the pros and cons, and how important is it to include keywords in folder and page URLs? Should I create folders or pages just for the sake of keywords?
Question 2: Most companies have their main URL shown as www.example.com when you access their domain. However, some multi-language sites show e.g. www.example.com/en or www.example.com/en/main when you type the domain into your web browser. I understand that it is common practice to use subdomains or folders to separate the language versions. My question is regarding the subfolder: is it better to have only the subfolder shown (www.example.com/en), or should you also include a specific page's URL with keywords after the subfolder (www.example.com/en/main or www.example.com/en/service-one-in-certain-city)? I don't really understand why some companies show only the subfolder of a specific language page and some show the page's URL after the subfolder.
Thanks in advance, Sam
International SEO | Awaraman
Geo Targeting SEO Techniques for Google UK
I'm starting a new SEO project whereby I'll be targeting UK search engines only, such as Google.co.uk (I'm from the States), and I'm gathering all the information I can get on this topic. Obviously I got a .co.uk TLD, and hosting/IP is UK based, but can anyone shed light on other techniques that have worked for you? Besides the above, here is some advice I have picked up so far:
Regional directory listings,
Inbound and outbound links from/to UK-based websites,
Geographic targeting in Google Webmaster Tools,
British slang...
What else is there? Much appreciated
International SEO | Plorex
Google search cache points to and uses content from a different URL
We have two sites, one in New Zealand (ecostore.co.nz) and one in Australia (ecostoreaustralia.com.au).
Both sites have been assigned the correct country in Webmaster Tools.
Both sites use the same URL structure and content for product and category pages.
Both sites run off the same server in the US but have unique IP addresses.
When I go to google.com.au and search for site:ecostoreaustralia.com.au, I get results which Google says are from the Australian domain, yet on closer inspection it is actually drawing content from the NZ website. When I view a cached page, the URL bar displays the AU domain name, but on the page (in the top grey box) it says: "This is Google's cache of http://www.ecostore.co.nz/pages/our-highlights." Here is the link to this page: http://webcache.googleusercontent.com/search?q=cache:Zg_CYkqyjP4J:www.ecostoreaustralia.com.au/pages/our-highlights+&cd=7&hl=en&ct=clnk&gl=au
In the last four weeks the ranking of the AU website has dropped significantly, and the NZ site now ranks first in Google AU, where before the AU site was listed first. Any idea what is going wrong here?
International SEO | ArchMedia
Site Spider/ Crawler/ Scraper Software
Short of coding up your own web crawler, does anyone know / have any experience with a good bit of software to run through all the pages on a single domain? (And potentially on linked domains one hop away.) This could be either server or desktop based. Useful capabilities would include:
Scraping (XPath parameters)
Clicks from the homepage (site architecture)
HTTP headers
Multi-threading
Use of proxies
Robots.txt compliance option
CSV output
Anything else you can think of... Perhaps an opportunity for an additional SEOmoz tool here, since they do it already! Cheers!
Note: I've had a look at:
Nutch - http://nutch.apache.org/
Heritrix - https://webarchive.jira.com/wiki/display/Heritrix/Heritrix
Scrapy - http://doc.scrapy.org/en/latest/intro/overview.html
Mozenda (does scraping, but doesn't appear extensible)
Any experience / preferences with these or others?
International SEO | AlexThomas
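For comparison, the core of what these tools do can be sketched with the Python stdlib alone. This toy extractor (the class name is made up) handles only the link-extraction step, with fetching, multi-threading, proxies, and robots.txt compliance (urllib.robotparser) left to the surrounding crawler:

```python
# Sketch of the core of such a crawler using only the stdlib: pull
# same-domain links out of a fetched page.

from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.domain = urlparse(base_url).netloc
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base_url, href)  # resolve relative links
        if urlparse(absolute).netloc == self.domain:  # stay on the one domain
            self.links.append(absolute)

extractor = LinkExtractor("http://example.com/")
extractor.feed('<a href="/about">About</a> <a href="http://other.com/">x</a>')
print(extractor.links)  # ['http://example.com/about']
```

Feeding pages into this class, writing `self.links` out as CSV, and queueing the discovered URLs would give a basic single-domain spider; Scrapy and the others add the threading, proxy, and politeness layers on top of the same idea.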