What are the best practices for geo-targeting by sub-folders?
-
My domain is currently targeting the US, but I'm building out sub-folders that will need to geo-target France, England, and Spain. Each country will have its own professionally translated sub-folder (e.g., domain.com/france).
Other than hreflang tags, what other best practices can I implement? Can Google Webmaster Tools geo-target by sub-folder? Any suggestions would be appreciated.
Thanks
Justin
-
I believe you can geo-target sub-folders inside GSC, but you will need to set up each sub-folder as a separate site inside GSC. Google has some tips here that might be of help. I would also consider using meta language tags.
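One point worth spelling out for the hreflang setup: the annotations use language-region codes (fr-fr, en-gb, es-es), not country names, and every page in the set should list all of its alternates plus an x-default. A minimal Python sketch of the tag set — the sub-folder names and locale codes here are my own illustrative assumptions, not something from the thread:

```python
# Illustrative mapping of geo-targeted sub-folders to hreflang locale codes.
SUBFOLDERS = {
    "us": ("en-us", "https://domain.com/"),
    "france": ("fr-fr", "https://domain.com/france/"),
    "england": ("en-gb", "https://domain.com/england/"),
    "spain": ("es-es", "https://domain.com/spain/"),
}

def hreflang_tags(subfolders):
    """Return the alternate-link tags every page in the set should carry."""
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in subfolders.values()
    ]
    # x-default catches searchers from locales not listed above.
    tags.append(
        '<link rel="alternate" hreflang="x-default" href="https://domain.com/" />'
    )
    return tags

for tag in hreflang_tags(SUBFOLDERS):
    print(tag)
```

Note that the annotations must be reciprocal: the French page lists the Spanish one, and vice versa, or Google ignores the pair.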
Related Questions
-
Best SEO practice for multiple languages in website
Hi, we would like to include multiple languages on our global website. What's the best practice, for both UI and SEO? Can we have the site choose the language automatically based on browsing location? Or dedicated pages for important languages, like www.website.com/de for German? If we go for the latter, what about users browsing outside those language pages, since they will usually be in English?
Intermediate & Advanced SEO | vtmoz
-
Sub-directories or Nah
Hey Moz Squad, so I have a locksmith company with 7 locations all up and down the west coast. We set up each location in a sub-directory (mainly because they used to have other websites for these locations, and we wanted to try to keep the juice, so we 301 redirected back to these sub-directories). But it's making our efforts so tough, because every time I want to change something I have to do it seven times! Do the sub-directories really matter that much for rank? One of my partners says Google treats each like its own website, and that trying to rank just a page on a website for a certain city is harder. What do you guys think? Keep the subs or ditch 'em?
Intermediate & Advanced SEO | Meier
-
What Are Your Thoughts On Location Targeted Pages?
I have a client that wants to rank for a bunch of locations around his primary location, say 30 minutes away. So we created a bunch of pages for cities around his location. So far it seems to be working pretty well. That said, I heard from someone else that Google really doesn't like these types of pages anymore and that we are better off with just one location page that lists the areas we serve. What are your thoughts and experiences?
Intermediate & Advanced SEO | netviper
-
Slug best practices?
Hello, my team is trying to understand how to best construct slugs. We understand they need to be concise and easily understandable, but there seem to be vast differences between the three examples below. Are there reasons why one might be better than the others? http://www.washingtonpost.com/news/morning-mix/wp/2014/06/20/bad-boys-yum-yum-violent-criminal-or-not-this-mans-mugshot-is-heating-up-the-web/ http://hollywoodlife.com/2014/06/20/jeremy-meeks-sexy-mug-shot-felon-viral/ http://www.tmz.com/2014/06/19/mugshot-eyes-felon-sexy/
Intermediate & Advanced SEO | TheaterMania
-
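Comparing the three examples above, the shorter slugs that keep only the meaningful words tend to be easier to read, share, and scan in the SERPs than the full-headline Washington Post style. A small sketch of that recipe, assuming an arbitrary word cap of my own choosing:

```python
import re
import unicodedata

def slugify(title, max_words=8):
    """Make a short, readable slug: lowercase, ASCII, hyphen-separated."""
    # Fold accented characters to plain ASCII so the slug stays portable.
    ascii_title = (
        unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    )
    words = re.findall(r"[a-z0-9]+", ascii_title.lower())
    # Keep only the first few words; long slugs add nothing for users.
    return "-".join(words[:max_words])

print(slugify("Jeremy Meeks' Sexy Mug Shot Goes Viral"))
# → jeremy-meeks-sexy-mug-shot-goes-viral
```

In practice you would also strip pure stop-words ("the", "is") before capping, but the core idea is the same: concise, human-readable, keyword-bearing.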
Best way to handle page filters and sorts
Hello Mozzers, I have a question about the best way to handle filters and sorts with Googlebot. I have a page that returns a list of widgets: a "root" page about widgets, plus filter and sort functionality that shows basically the same content but adds parameters to the URL. For example, if you filter the page of 10 widgets by color, the page returns 3 red widgets on the top and 7 non-red widgets on the bottom. If you sort by size, the page shows the same 10 widgets sorted by size. We use traditional PHP URL parameters to pass filters and sorts, so obviously Google views each as a separate URL. Right now we really don't do anything special, but I have noticed in the SERPs that if I search for "Widgets", my "Widgets" and "Widgets - Blue" pages both rank close to each other, which tells me Google basically (rightly) thinks these are all just pages about widgets. Ideally, though, I'd just want to rank for my "Widgets" root page. What is the best way to structure this setup for Googlebot? I think it's maybe one or many of the following, but I'd love any advice:
- put a rel canonical tag on all of the pages with parameters and point it to the "root"
- use the Google parameter tool and have it not crawl any URLs with my parameters
- put a meta robots noindex on the parameter pages
Thanks!
Intermediate & Advanced SEO | jcgoodrich
-
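For the rel-canonical option above, the logic is just "strip the filter/sort parameters and point at what remains". A minimal sketch, assuming hypothetical parameter names (color, size, sort) in place of whatever the PHP templates actually use:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Illustrative filter/sort parameter names; swap in your real ones.
FILTER_PARAMS = {"color", "size", "sort"}

def canonical_url(url):
    """Drop known filter/sort parameters so every variant canonicalizes
    to the root page, while unrelated parameters survive."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://example.com/widgets?color=blue&sort=size"))
# → https://example.com/widgets
```

Each filtered page would then emit `<link rel="canonical" href="...">` with that stripped URL, which tells Google to consolidate the variants onto the root page without blocking the crawl the way noindex or the parameter tool would.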
Best Practice For Company/Client Logo Endorsement
Article: http://searchengineland.com/homepage-sliders-are-bad-for-seo-usability-163496
I came across the above article and somewhat agree with the author's summary: I find sliders a distraction to B2B users, and overall they offer no SEO benefit.
Scenario: as a service provider, over time I have worked with many high-profile blue-chip companies. As part of my site redesign, I'm looking to show users my client achievements. My initial thought is the following: on the home page, incorporate some high-profile company logos (similar to http://www.semrush.com) with a "more customers" hyperlink to the right of the logo caption. The link will take the user to a dedicated page (www.mydomain.co.uk/customer) showing a comprehensive list of company logos.
Questions:
#1 Is the above practice good or bad?
#2 Is there a better way to achieve it?
Any other practical advice on user experience, social engagement, website speed, etc. would be much appreciated. Thanks, Mark
Intermediate & Advanced SEO | Mark_Ch
-
Best Way to Consolidate Domains?
Hello, my company has four websites in the same vertical, and we're planning to integrate them all into our main company site. So instead of www.siteone.com, www.sitetwo.com, www.sitethree.com, etc., it would be www.branddomain.com/site-one, www.branddomain.com/site-two, etc. I have a few questions: Should we redirect the old domains to the new directories, or leave the old domains up and stop updating them with new content, then have the old content, links, etc. 301 to the same content on the new site? Should we literally move all of the content to the new directories? Any tips are appreciated. It's probably pretty obvious that I don't have a ton of technical skills; my development team will be doing the heavy lifting. I just want to be sure we do this correctly from an SEO perspective! Thanks for the help, please let me know if I can clarify anything. E
Intermediate & Advanced SEO | essdee
-
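On the redirect question above: mapping each old domain onto its new directory lets every old URL 301 to its exact counterpart, which preserves link equity far better than pointing everything at the new homepage. The redirect itself would live in the server config, but the path logic is simple; a sketch using the thread's placeholder domain names:

```python
# Old domain -> new directory on the consolidated site (placeholder names
# from the question; the real mapping would use the actual domains).
DOMAIN_MAP = {
    "www.siteone.com": "/site-one",
    "www.sitetwo.com": "/site-two",
    "www.sitethree.com": "/site-three",
}

def redirect_target(host, path):
    """Return the 301 Location for a request to one of the old domains,
    or None if the host isn't an old domain."""
    prefix = DOMAIN_MAP.get(host)
    if prefix is None:
        return None  # not an old domain; serve normally
    # Each old URL maps onto the matching page in its new directory.
    return f"https://www.branddomain.com{prefix}{path}"

print(redirect_target("www.siteone.com", "/pricing"))
# → https://www.branddomain.com/site-one/pricing
```

Page-for-page 301s like this are the "literally move the content" option, and they sidestep the duplicate-content risk of leaving the old sites live alongside the new directories.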
Geo-targeted homepage for users vs crawlers
Hello there! This is my first post here on SEOmoz, so I'll get right into it.
My website is housingblock.com, and the homepage runs entirely off geo-targeting the user's IP address to display the most relevant results immediately, potentially saving them a search or three. That works great. However, when crawlers visit the site, they are obviously being geo-targeted by IP address too. Google has come to the site via several different IP addresses, resulting in several different locations being displayed on the homepage (Mountain View, CA and Clearwater, MI are a couple).
This poses an issue, because I'm worried that crawlers will not be able to properly index the homepage, since the location, and ultimately all the content, keeps changing. And/or we will be indexed for a specific location when we are in fact a national website (I do not want my homepage indexed/ranked under Mountain View, CA, or even worse, Clearwater, MI [no offence to any Clearwaterians out there]).
My initial instinct was to create a separate landing page for the crawlers, but for obvious reasons I am not going to do that (I did at one point, but quickly reverted back because I figured that was definitely not the route to go long-term). Any ideas on the best way to approach this while maintaining the geo-targeted approach for my users? I mean, isn't that what we're supposed to do: give our users the most relevant content in the least amount of time? It seems that in doing so, I am improperly ranking my website in the eyes of the search engines.
Thanks everybody! Marc
Intermediate & Advanced SEO | THB
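One common way out of the dilemma above, without serving crawlers a separate page, is to make the homepage body identical for every request and confine the geo-targeting to progressive enhancement, such as pre-filling the search box. Crawlers then always index one stable national page. A minimal sketch, with geo_lookup standing in for a hypothetical IP-to-city service:

```python
def render_homepage(ip, geo_lookup):
    """Build the homepage: the indexed body never varies by IP, and the
    geo-lookup only personalizes the search widget."""
    city = geo_lookup(ip)  # may return None for data-center/crawler IPs
    return {
        "body": "national-listings",   # identical for all visitors
        "search_prefill": city or "",  # personalization only
    }

# A crawler IP that the lookup can't localize still gets the full page.
page = render_homepage("203.0.113.7", lambda ip: None)
```

Localized result blocks can then be fetched client-side after load, so users still see relevant listings quickly while the crawlable HTML stays consistent.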