What is the best SEO strategy for a company operating in several countries?
-
Hello, I have to create an SEO marketing strategy for a company that provides services in Spain, Colombia and Mexico.
I'm looking at two options:
Buy a different country-code domain (ccTLD) for each country: this option seems feasible but very expensive, since to manage and position each domain I would need different content on each one (and visitors might not even notice the difference, because the domain name would be essentially the same).
-
Put each service into country subfolders, e.g.:
www.dominio.com/mexico/training-financiero.html
www.dominio.com/espana/training-financiero.html
I understand that option 1 is no longer necessary, since you can use HTML tags in the code to tell Google that the content is targeted at customers in a different country.
In principle we would use the same content, changing only a few words and, of course, adapting prices to the local currency of each country. However, I believe customers might trust a domain from their own country more. I'm also afraid Google would index the pages as duplicate content. And one more question: which country version should be the main one, so as not to confuse visitors?
-
-
First of all, I really suggest you read this post by Aleyda Solis on the SEERInteractive blog, which answers practically all of your questions in depth.
Second, the subfolder option seems the best one for your case, at least from what I understood about your business's needs. In fact, your medium/long-term business objectives should determine which international SEO option you choose, not SEO considerations alone.
So, if you choose the subfolder option, you must:
- Create the subfolders (obviously).
- Geotarget them in Google Webmaster Tools.
- Implement rel="alternate" hreflang markup. This is especially important because you are using the same content in different subfolders; if you want to be really sure that Google shows the correct URL for each country, it is better to use hreflang.
- Even if you do 2) and 3), I strongly suggest you localize the content of your site. The Spanish spoken in Mexico, Argentina and Spain differs noticeably, as does the English spoken in the UK and the USA. The better the language fits the culture of the countries you are targeting, the better your results, not only in SEO terms but also in conversion potential.
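A minimal sketch of the hreflang markup mentioned above, using the example URLs from the question (the country codes are illustrative; each listed page must carry the full set of tags, including a self-referencing one, and the tags must be reciprocal across all versions):

```html
<!-- In the <head> of www.dominio.com/mexico/training-financiero.html -->
<!-- One line per country version; the same block goes on every page listed here -->
<link rel="alternate" hreflang="es-mx" href="https://www.dominio.com/mexico/training-financiero.html" />
<link rel="alternate" hreflang="es-es" href="https://www.dominio.com/espana/training-financiero.html" />
<link rel="alternate" hreflang="es-co" href="https://www.dominio.com/colombia/training-financiero.html" />
```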
Finally, regarding also buying the country-code domain (ccTLD) versions of your main domain: buying them and 301-redirecting them to the respective subfolders doesn't really have any SEO effect, but it can be a useful way to "reserve" those domains for potential future use.
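If you do reserve the ccTLDs and redirect them, the 301 can be configured at the server level. A sketch, assuming Apache with mod_rewrite and a hypothetical Mexican ccTLD:

```apache
# .htaccess on the parked ccTLD (hypothetical domain dominio.com.mx)
RewriteEngine On
# Permanently (301) redirect every request to the matching country subfolder on the .com domain
RewriteRule ^(.*)$ https://www.dominio.com/mexico/$1 [R=301,L]
```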
-
-
Thanks, things are a little clearer now. I'm planning to do the following:
- Separate the website into folders for each country and service, e.g. www.dominio.com/argentina/seo.html
- Place the respective geotargeting (hreflang) tags in the source code
- Sign into Google Webmaster Tools and set the geographic target for each subfolder
- Buy the ccTLD domains for each country
I'm not sure what to do with the ccTLD domains, though. What do you recommend?
I was thinking of putting up a landing page with a contact form that summarizes what we do (original content, of course), with each service linking to the corresponding country folder on the .com domain.
Or simply, when visitors enter the ccTLD domain, 301-redirecting them to the respective country subfolder on the .com domain.
-
Hi,
Two really good links discussing this topic:
How to do SEO for different countries
International SEO - Whiteboard Session
Cheers,
-
First of all, thanks for your quick response. Actually, I wouldn't know how to present the service differently for two countries with the same language. Suppose you offer an SEO service in Argentina and one in Colombia (Spanish is spoken in both countries): how could you describe the Colombian SEO service differently from the Argentine one?
Related Questions
-
Best way to fix duplicate content issues
Another question for the Moz community. One of my clients has 4.5k duplicate content issues, for example: http://www.example.co.uk/blog and http://www.example.co.uk/index.php?route=blog/blog/listblog&year=2017. Most of the issues come from product pages. My initial thought is to set up 301 redirects in the first instance and, if the issue persists, add canonical tags. Is this the best way of tackling it?
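If the canonical route is taken, the tag would look like this (a sketch using the two URLs from the question):

```html
<!-- In the <head> of the parameterized URL (index.php?route=blog/blog/listblog&year=2017) -->
<!-- Points search engines at the preferred version without affecting visitors -->
<link rel="canonical" href="http://www.example.co.uk/blog" />
```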
Duplicate content pages on different domains, best practice?
Hi, we are running directory sites on different domains for different countries (the country name is in each site's domain name) and we have the same static page on each one. We actually have several such pages, but I'll use one as an example for the sake of simplicity. So we have http://firstcountry.com/faq.html, http://secondcountry.com/faq.html and so on for 6-7 sites; faq.html from one country and the others show 94% similarity when checked for duplicate content. We would like an alternative to the canonical approach, because the content doesn't belong to only one of these sites; it belongs to all of them. The second option would be to de-index all but one country. It's syndicated content, but we cannot link back to the source because there is none. Thanks for taking the time to read this.
Disallow: /404/ - Best Practice?
Hello Moz Community, my developer has added this to my robots.txt file: Disallow: /404/ — is this considered good practice in the world of SEO? Would you do it with your clients? I feel he has great development knowledge but isn't too well versed in SEO. Thank you in advance, Nico.
Best way to implement noindex tags on archived blogs
Hi, I have approximately 100 old blog posts that I believe are still of interest to readers, but that I'd potentially like to noindex because Google may view them poorly; I'd still like to keep them on our website. A lot of the content in these posts is similar to one another (as we blog about the same topics quite often), which is why I believe it may be in our interest to noindex older posts whose topics are covered in more recent ones. Firstly, does that sound like a good idea? Secondly, can I use Google Tag Manager to implement noindex tags on specific blog pages? It's a hassle to get the webmaster to add the code, and I've found no mention on the usual SEO blogs of whether you can implement such tags via Tag Manager. Or is there a better way to implement noindex tags en masse? Thanks!
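For reference, the hard-coded version of the tag is a one-liner. Note that tags injected through Tag Manager only take effect once Google renders the page's JavaScript, so the static tag is generally considered the more reliable option:

```html
<!-- In the <head> of each archived post: keeps the page live for visitors,
     but asks search engines not to index it (links are still followed) -->
<meta name="robots" content="noindex, follow" />
```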
Auto-loading content via AJAX - best practices
We have an ecommerce website and I'm looking at replacing the pagination on our category pages with functionality that auto-loads the products as the user scrolls. There are a number of big websites that do this - MyFonts and Kickstarter are two that spring to mind. Obviously, if we are loading the content in via AJAX, then search engine spiders aren't going to be able to crawl our categories in the same way they can now. I'm wondering what the best way around this is. Some ideas that spring to mind are: detect the user agent and, if the visitor is a spider, show them the old-style pagination instead of the AJAX version; make sure we submit an updated Google sitemap every day (I'm not sure if this is a reasonable substitute for Google being able to properly crawl our site). Are there any best practices surrounding this approach to pagination? Surely the bigger sites that do this must have had to deal with these issues? Any advice would be much appreciated!
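One common alternative to user-agent sniffing, sketched below with hypothetical URLs and class names, is progressive enhancement: render ordinary paginated links in the HTML so crawlers can follow them, then let JavaScript intercept the click and load the next page of products inline:

```html
<!-- Plain, crawlable pagination link: works with JavaScript disabled -->
<nav class="pagination">
  <a href="/category/shoes?page=2" class="next-page">Next page</a>
</nav>
<script>
  // Enhancement: fetch the next page and append its products instead of navigating
  document.querySelector('.next-page').addEventListener('click', function (e) {
    e.preventDefault();
    fetch(this.href)
      .then(function (res) { return res.text(); })
      .then(function (html) {
        // Parse the response and append the new product nodes to the list here
      });
  });
</script>
```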
Best use of robots.txt for "garbage" links from Joomla!
I recently started out on SEOmoz and am trying to do some cleanup according to the campaign report I received. One of my biggest gripes is the "Duplicate Page Content" item: right now I have over 200 pages flagged with duplicate content. This is triggered because SEOmoz has picked up auto-generated links from my site. My site has a "send to friend" feature, and every time someone wants to send an article or a product to a friend via email, a pop-up appears. It seems these pop-up pages have been crawled by the SEOmoz spider; however, they are something I would never want indexed in Google, so I just want to get rid of them. Now to my question: I guess the best solution is a general rule in robots.txt so that these pages are not indexed or considered by Google at all. But how do I do this? What should my syntax be? A lot of the links look like this, but with different id numbers according to the product being sent: http://mywebshop.dk/index.php?option=com_redshop&view=send_friend&pid=39&tmpl=component&Itemid=167 I guess I need a rule that makes Google ignore links containing: view=send_friend
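For the send-to-friend URLs described above, a wildcard rule should cover the whole pattern regardless of the id numbers; a sketch (Google honours the * wildcard in robots.txt, though not every crawler does):

```text
# robots.txt — block every auto-generated "send to friend" pop-up URL
User-agent: *
Disallow: /*view=send_friend
```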
Best practice for XML sitemap depth
We run an eCommerce site for education products with 20 or so subject-based catalogues (Maths, Literacy etc.), each catalogue having numerous ranges (Counting, Maths Games etc.), then products within those. We carry approximately 15,000 products. My question is about the sitemap we submit nightly and its depth. It is currently set to cover the home page, catalogues and ranges, plus all static content (about us etc.). Should we be submitting sitemaps that include product pages as well? Does it matter, or would it not make much difference in terms of search? Thanks in advance.
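If product pages are included, one option is a sitemap index that splits the ~15,000 products by catalogue, keeping each file well under the protocol's 50,000-URL limit; a sketch with hypothetical filenames:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-index.xml: one child sitemap per catalogue, product URLs inside each child -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-maths.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-literacy.xml</loc></sitemap>
</sitemapindex>
```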
What is the best way to upload an image for SEO
I have a site that is largely based on images. It runs on WordPress. Each page has about 10 images. What is the best way to upload images? As a WP gallery. As another gallery (with a gallery plugin). As separate images uploaded via WP (it shows a thumbnail which links to the larger image; that way each image can have a title). As separate images uploaded via FTP, in which case I would then make a thumbnail which links to the larger image (this option would only be good if WP does not optimize the thumbnail images that it creates). As a text title which links to the image. Since most of my content is based on images and mostly comes from Google Images, I was wondering what the best method to use is.
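Whichever upload route is chosen, what search engines ultimately see is the markup; a sketch of a descriptive thumbnail-to-full-size link (filenames and text are hypothetical):

```html
<!-- Descriptive filename, alt text, and title; the thumbnail links to the full-size image -->
<a href="/wp-content/uploads/blue-ceramic-vase-large.jpg" title="Blue ceramic vase, full size">
  <img src="/wp-content/uploads/blue-ceramic-vase-300x200.jpg"
       alt="Handmade blue ceramic vase on a wooden table"
       width="300" height="200" />
</a>
```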