Geo-targeting a sub-folder whose URLs have been rewritten from a sub-domain
-
I have a client who's setting up a section of his site in a different language, and we're planning to geo-target those pages to that country. I have suggested a sub-folder, as it's the most cost-effective option and will allow domain authority to flow into those pages.
His developer says that, for technical reasons, they can only set this up as a sub-domain, but suggests they can rewrite the URLs to appear as sub-folder pages.
I'm wondering how this will work in terms of geo-targeting in Google Webmaster Tools. Do I geo-target the sub-domain or the sub-folder? That is, does Google only see URLs, or does it physically see those pages on the sub-domain?
It seems like it might be a messy solution. Would it be a better idea just to forget about the rewrites and live with the site being a sub-domain?
Thanks,
-
Ok. Thanks for the advice, Ryan.
-
My first suggestion is to push further on the "developer" issue. As an SEO, you need the ability to implement recommended changes as required. If the changes are not implemented for whatever reason, results suffer.
We all work very hard to achieve the best results for our clients. Two common reasons a client might offer for not implementing a change are "my software won't support the change" and "my developer won't support the change". This topic will likely arise again on other matters. Additionally, I recommend a direct line of communication between the SEO and the developer when possible. Each party gains a better understanding of and appreciation for the other, miscommunications are minimized, and it simply creates a better working environment.
With the above noted, your recommendation to move the sub-domain content into the main site is the commonly accepted best practice: you are consolidating your domain authority. While Google has made some recent changes with respect to sub-domains, it is still best practice to make the change you have recommended to your client.
If the URLs are properly rewritten at the server level, no one will even know the actual path of the files. Anyone who visits the URL will simply see the page with a 200 (OK) response code returned. You can, and should, test this after it is implemented.
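For illustration only (the hostnames and the /language-site/ path are hypothetical, and the exact setup depends on the server stack), a server-level rewrite along these lines would let the main domain serve the sub-domain's content under a sub-folder path:

```apache
# Hypothetical sketch for Apache (requires mod_rewrite and mod_proxy).
# Hostnames and the /language-site/ path are placeholders.
RewriteEngine On

# Internally proxy /language-site/... to the sub-domain. The visitor's
# address bar keeps website.com/language-site/... and the page comes
# back with a 200 response code, not a redirect.
RewriteRule ^/language-site/(.*)$ http://language.website.com/$1 [P,L]
```

Once live, a quick `curl -I http://website.com/language-site/` should show a `200 OK` rather than a 301/302 redirect.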
Robots.txt can be used to block crawler access to the sub-domain if you wish.
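One detail worth noting: robots.txt rules are per-hostname, so the blocking file has to be served at the sub-domain's own root; the main domain's robots.txt cannot block the sub-domain. A minimal sketch, with a hypothetical hostname:

```
# Served at http://language.website.com/robots.txt (hypothetical hostname).
# Blocks all compliant crawlers from every URL on the sub-domain.
User-agent: *
Disallow: /
```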
-
Thanks Ryan.
I've no direct contact with the developer, so I can't answer those questions. I'm afraid I just have to work with what my client is telling me.
From what you're saying, if done correctly, the pages would look to Google as if they were in a folder on that domain, e.g. website.com/language-site, and we would geo-target that folder, not the sub-domain?
Then we'd need to find a way to stop the search engines crawling the sub-domain. Would this be done in the robots.txt file?
Or do you think we'd be better off just using the sub-domain and forgetting about the rewrites? The main reason I'm advising him to go for a folder structure is the uncertainty of domain authority flowing to a sub-domain.
-
I firmly believe software and developers should give site owners the freedom to make changes as they see fit. When a developer or a piece of software can't readily implement SEO best practices, it's time to look for alternatives.
Is the software being used a particular CMS or e-commerce solution which is in an earlier stage of development? How experienced is the developer?
If the URLs are rewritten (server-side) so the target pages return a normal response code, the process should work. My biggest concern is ensuring the sub-domain URLs are not crawled; otherwise, there would be a duplicate-content issue.
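As a quick sanity check, a short script using Python's standard-library robots.txt parser can confirm that a block-everything file really does disallow the sub-domain's URLs (the hostname here is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt as it would be served at the sub-domain's root.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Any URL on the sub-domain should now be disallowed for all crawlers.
print(parser.can_fetch("Googlebot", "http://language.website.com/page"))  # False
```

The same check, run against the main domain's pages, should return `True` so the rewritten sub-folder URLs stay crawlable.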