Geo-targeting a sub-folder whose URLs have been rewritten from a sub-domain
-
I have a client who's setting up a section of his site in a different language, and we're planning to geo-target those pages to that country. I have suggested a sub-folder solution, as it's the most cost-effective option and will allow domain authority to flow into those pages.
His developer says they can only set this up as a sub-domain for technical reasons, but they're suggesting they can rewrite the URLs to appear as sub-folder pages.
I'm wondering how this will work in terms of geo-targeting in Google Webmaster Tools. Do I geo-target the sub-domain or the sub-folder? In other words, does Google only see the URLs, or does it actually see those pages on the sub-domain?
It seems like it might be a messy solution. Would it be better to forget about the rewrites and live with the site being on a sub-domain?
Thanks,
-
Ok. Thanks for the advice, Ryan.
-
My first suggestion is to push further on the "developer" issue. As an SEO, you need the ability to implement recommended changes as required. If the changes are not implemented, for whatever reason, results suffer.
We all work very hard to achieve the best results for our clients. Two common reasons a client might offer for not implementing a change are "my software won't support the change" and "my developer won't support the change". This topic will likely arise again on other matters. Additionally, I recommend a direct line of communication between an SEO and developer when possible. Each party can gain a better understanding and appreciation of the other, miscommunications can be minimized, and it simply creates a better working environment.
With the above noted, your decision to move the subdomain into the main site is the commonly accepted best practice. You are consolidating your DA. While Google has made some recent changes with respect to subdomains, it is still the best practice to make the change you have recommended to your client.
If the URLs are properly rewritten at the server level, no one will even know the actual path of the files. Anyone who visits the URL will simply see the page with a 200 response (all ok) header code returned. You can and should test this change after it is implemented.
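As a hedged sketch of what the developer might set up (assuming an Apache front end; the host names and paths below are placeholders, not your client's actual setup), a reverse-proxy rewrite that serves the sub-domain's content under a sub-folder could look like:

```apache
# Hypothetical Apache config on www.example.com — "example.com" and
# "/language-site/" are placeholders, not the client's real names.
# Requires mod_rewrite plus mod_proxy/mod_proxy_http to be enabled.
RewriteEngine On

# Serve /language-site/... by internally fetching it from the sub-domain.
# The [P] flag proxies the request, so the visitor sees only the
# sub-folder URL and gets a normal 200 response.
RewriteRule ^/language-site/(.*)$ http://lang.example.com/$1 [P]
ProxyPassReverse /language-site/ http://lang.example.com/
```

After it goes live, you could confirm the 200 response with something like `curl -I http://www.example.com/language-site/` (again, a placeholder URL).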
Robots.txt can be used to block access to the sub-domain if you wish.
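For example (`lang.example.com` below is a placeholder for the actual sub-domain), a robots.txt served at the sub-domain's root could block crawling of everything on that host:

```
# robots.txt served at http://lang.example.com/robots.txt
# (placeholder host name). Blocks all compliant crawlers from the
# sub-domain; the rewritten sub-folder URLs on the main domain are
# governed separately by the main site's own robots.txt.
User-agent: *
Disallow: /
```

One caveat: robots.txt prevents crawling, but a blocked URL that Google already knows about can still appear in the index, so it's worth checking after launch.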
-
Thanks Ryan.
I've no direct contact with the developer, so I can't answer those questions. I'm afraid I just have to work with what my client is telling me.
From what you're saying, if done correctly, the pages would look to Google as if they were in a folder on that domain, e.g. website.com/language-site, and we would geo-target that folder rather than the sub-domain?
Then we'd need to find a way to stop the search engines crawling the sub-domain. Would this be done in the robots.txt file?
Do you think we'd just be better off using the sub-domain and forgetting about the rewrites? The main reason I'm advising him to go for a folder structure is the uncertainty of domain authority flowing to a sub-domain.
-
I firmly believe software and developers should give site owners the freedom to make changes as they see fit. When a developer or the software cannot readily implement SEO best practices, it's time to look for alternatives.
Is the software a particular CMS or e-commerce solution that is at an early stage of development? How experienced is the developer?
If the URLs are rewritten (server-side) so the target pages return a normal header response code, the process should work. My biggest concern is ensuring the sub-domain URLs are not crawled; otherwise there would be a duplicate content issue.
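One way to cover that crawling concern, as a sketch only (it assumes the sub-domain and main domain are served by the same Apache instance, which may not match the client's setup, and `lang.example.com` is a placeholder), is to send a noindex header on any request that arrives under the sub-domain host name:

```apache
# Hypothetical vhost / .htaccess fragment — lang.example.com is a
# placeholder. Requires mod_headers and mod_setenvif.
# Flag requests whose Host header is the sub-domain...
SetEnvIfNoCase Host ^lang\.example\.com$ SUBDOMAIN_REQUEST
# ...and tell search engines not to index those responses.
Header set X-Robots-Tag "noindex, nofollow" env=SUBDOMAIN_REQUEST
```

Unlike a robots.txt block, the X-Robots-Tag header also removes already-indexed sub-domain URLs over time, since crawlers must be able to fetch the pages to see the header.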