Best practices for country homepage
-
Hi,
What are the SEO best practices for redirecting to the correct language site based on geographic location?
Right now, we're using a 302 redirect to point users to the right country landing page.
User reaches site: domain.com > Server detects location > 302 redirect to domain.com/french
We'd like to optimize the site for all languages, but which country gets the SEO rank for domain.com?
Thanks for your help!
Roya
-
OZNAPPIES' idea is a good one, and the technical advice is totally sound.
Here's another idea: show the English language homepage by default, but determine the user's preferred language. If your site supports the identified language, include & display a hyperlinked text-box/graphic-banner suggesting the visitor "view this site in {the user's identified language}" which links to the homepage for that language.
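The "determine the user's preferred language" step can read the browser's Accept-Language header rather than relying on geolocation. Here is a minimal sketch, assuming a Python backend; the function name and defaults are hypothetical:

```python
def pick_banner_language(accept_language, supported, default="en"):
    """Parse an Accept-Language header (e.g. "fr-FR,fr;q=0.9,en;q=0.8")
    and return the best-supported language to suggest in the banner."""
    candidates = []
    for part in accept_language.split(","):
        piece = part.strip()
        if not piece:
            continue
        lang, _, qpart = piece.partition(";")
        try:
            # Quality value defaults to 1.0 when no ;q= is given
            q = float(qpart.split("=")[1]) if qpart else 1.0
        except (IndexError, ValueError):
            q = 1.0
        # Match on the primary subtag: "fr-FR" -> "fr"
        primary = lang.strip().split("-")[0].lower()
        if primary in supported:
            candidates.append((q, primary))
    return max(candidates)[1] if candidates else default
```

Geo-IP detection can supplement this, but the header reflects the user's stated preference rather than merely their location, which fits the "suggest, don't redirect" approach above.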
Regarding which country gets the SEO rank for the "domain.com" - it depends on where your links come from, your overall domain authority, the page authority for your targeted page, and the geolocation factors referenced here.
The same site could rank well in multiple countries, by the way. For example, this page on "diagnostic troubleshooting" ranks well in the U.S., the U.K., France, etc. However, if you have the resources to make targeted sites for each language/territory, I recommend you do that.
-
If you want domain.com to get the rank for domain.com/french, you can set a canonical link in the head section of ./french, ./italian etc. to pass the juice back to the main site. I would think that <a> anchors would be better than 302s, as these are not temp redirects but links to language variations. I would also ensure that you have a language tag on the html element: <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="fr-fr" lang="fr-fr">
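One reading of this head-section advice is a canonical link back to the main site plus lang attributes on the html element. A hypothetical sketch of templating those tags per language variant (function name and signature are assumptions):

```python
def head_tags_for_variant(base_url, lang_code):
    """Build head additions for a language variant page: a canonical
    link pointing back to the main site, plus the opening <html> tag
    carrying the language attributes."""
    canonical = f'<link rel="canonical" href="{base_url}/" />'
    html_open = (f'<html xmlns="http://www.w3.org/1999/xhtml" '
                 f'xml:lang="{lang_code}" lang="{lang_code}">')
    return canonical, html_open
```

Note that canonicalizing every language variant to the main homepage consolidates ranking signals there, at the cost of the variants' own visibility.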
Related Questions
-
Wrong sitelinks (from the old homepage on the domain)
Hi there, I am quite a novice when it comes to SEO, so you'll have to forgive me if the question has already been asked by someone before me. The question is about sitelinks in Google. Google does generate some sitelinks, but some of them are seemingly generated on the basis of the old homepage. All the sitelinks correspond to menu items, but two of those items exist only on the old homepage on the domain. The fact that the SERP entry is not exactly as we wanted it to be is not strange at all. We have yet to write a proper meta description and optimize for the right keywords, because that, I gather from my research, plays a great role in Google's decisions about which links to use as sitelinks. But I do not understand why Google decides to use link names from a page that is no longer there as sitelinks on a new page without those link names. I would like to understand this before experimenting with keywords and meta descriptions. I will appreciate an explanation as well as any advice on the relation between keywords and meta descriptions. Happy day, Plovsky First, some context: 1. We just changed the homepage on our domain, pointing the domain name to the IP of the new webhost and the new website.
Technical SEO | | plovsky0 -
Best way to create robots.txt for my website
How can I create a robots.txt file for my website guitarcontrol.com? It has a login area and guitar lessons.
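For a site with a members-only login area, a minimal robots.txt along these lines would keep crawlers out of the private section while leaving the lesson pages crawlable. The /login/ and /members/ paths and the sitemap URL are assumptions, not the site's actual structure:

```
User-agent: *
Disallow: /login/
Disallow: /members/

Sitemap: https://www.guitarcontrol.com/sitemap.xml
```

The file goes at the web root (guitarcontrol.com/robots.txt); anything not disallowed remains crawlable by default.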
Technical SEO | | zoe.wilson170 -
Sites for English speaking countries: Duplicate Content - What to do?
HI, We are planning to launch sites specific to each target market (geographic location), but the products and services are similar in all those markets, as we sell software. So here's the scenario: Our target markets are all English-speaking countries, i.e. Britain, USA and India. We don't have the option of using ccTLDs like .co.uk, .co.in etc. How should we handle the content? Because the product, its features, the industries it caters to and our services are common irrespective of market. Whether we go with a sub-directory or sub-domain, the content will be in English. So how should we craft the content? Is writing unique content for the same product thrice the only option? Regards
Technical SEO | | IM_Learner0 -
Homepage disappeared from Google Serp
I redirected my domain using this code in .htaccess:

RewriteCond %{HTTP_HOST} ^xxxx.com
RewriteRule (.*) http://www.xxxx.com/$1 [R=301,L]

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+/)*index\.(html?|php)(\?[^\ ]*)?\ HTTP/
RewriteRule ^(([^/]+/)*)index\.(html?|php)$ http://www.xxxx.com/$1 [R=301,L]
</IfModule>

A day after I did it, I got an error in GWMT ("Google can't find your site's robots.txt") and my homepage disappeared from the result pages. When I try to open the Google cache of the homepage I get a 404 error. I generated a new robots.txt and uploaded it; now the error doesn't show, but my homepage is still not in the SERPs. It's been 3 days. What should I do? Thanks in advance.
Technical SEO | | digitalkiddie0 -
Auto-loading content via AJAX - best practices
We have an ecommerce website and I'm looking at replacing the pagination on our category pages with functionality that auto-loads the products as the user scrolls. There are a number of big websites that do this - MyFonts and Kickstarter are two that spring to mind. Obviously if we are loading the content in via AJAX then search engine spiders aren't going to be able to crawl our categories in the same way they can now. I'm wondering what the best way to get around this is. Some ideas that spring to mind are: detect the user agent and if the visitor is a spider, show them the old-style pagination instead of the AJAX version make sure we submit an updated Google sitemap every day (I'm not sure if this a reasonable substitute for Google being able to properly crawl our site) Are there any best practices surrounding this approach to pagination? Surely the bigger sites that do this must have had to deal with these issues? Any advice would be much appreciated!
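The updated-sitemap idea in the question can be sketched as a small job that emits every paginated category URL, so crawlers can still reach products hidden behind the AJAX scroll. A minimal sketch, assuming a Python backend and a hypothetical /category/{slug}?page=N URL scheme:

```python
def build_sitemap(base_url, categories, page_counts):
    """Emit a sitemap <url> entry for every paginated category view so
    crawlers can still reach products that the AJAX scroll would
    otherwise hide. `page_counts` maps category slug -> page total."""
    urls = []
    for slug in categories:
        for page in range(1, page_counts[slug] + 1):
            # Page 1 is the canonical category URL; deeper pages keep ?page=
            suffix = "" if page == 1 else f"?page={page}"
            urls.append(f"{base_url}/category/{slug}{suffix}")
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")
```

A sitemap alone is not a full substitute for crawlable links, so keeping plain paginated links in the HTML (enhanced into infinite scroll by script) is the safer complement. Serving different markup only to spiders based on user agent risks being treated as cloaking.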
Technical SEO | | paul.younghusband0 -
What is the best image format to put on your site
Hi, at the moment I am working with images to try and speed up my site, and I am wondering what is the best format to save images in before putting them on my site. I have been playing around with Photoshop, where they have the following formats: PNG-24, GIF (but not sure which one I should choose) or JPEG. I would be grateful for your advice, and also to know what size I should try and keep the images down to. Many thanks
Technical SEO | | ClaireH-1848860 -
Optimizing one site for multiple countries
I am working on a project where we have one website, with a country-specific domain, which is currently ranking well in local search. The client now wants to expand his business into two new countries (all English-speaking) and would like to rank for the same keywords in these two new countries. The client does not want to create new websites for the new countries. Because it's a local domain and the website is set up for local search in GWT with a locally hosted server, I expect challenges in optimizing for new countries without impacting the current local ranking. Question 1: What would be the recommended approach for maintaining their existing ranking on local search, while optimizing for the new countries?
Technical SEO | | petersen0