Hreflang for Canadian web visitors (when their browsers are set to en-us)
-
We're in the process of implementing hreflang markup for Canadian & US versions of a website.
We've found that about half of our Canadian traffic comes from browsers set to en-us (instead of en-ca, as we'd expect). Should we be concerned that Canadians with en-us browser settings will be shown the US versions of the website (since the hreflang markup would label the US version of each page 'en-us')?
Our immediate thought is that since they're likely to be searching from Google.ca and would also have Canadian IP addresses, this won't be an issue. Does anyone have any other thoughts here?
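For context, the markup being discussed would look something like this on both versions of a page (example.com and the /us/ and /ca/ paths are placeholders, since the question doesn't name the real domain):

```html
<!-- In the <head> of both the US and Canadian versions of the page -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/widgets/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/widgets/" />
<!-- Optional fallback for visitors who match neither annotation -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/widgets/" />
```

Note that hreflang is only one signal; Google also weighs the searcher's location and the Google domain used, which is why en-us browsers in Canada can still be served the Canadian page.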
-
Don't have hard evidence, but from my personal perspective: my browser is set to be-nl (Belgium). When I'm in the Netherlands (nl-nl) I am automatically redirected to google.nl and all the results I get are from the Netherlands (even for international sites where Dutch Belgian versions exist). Browser language will have an impact, but in my opinion proximity will be more important.
Dirk
Related Questions
-
URL Parameter Setting Recommendation - Webmaster Tools, Breadcrumbs & 404s
Intermediate & Advanced SEO | | Doug_G
Hi All, We use a parameter called "breadCrumb" to drive the breadcrumbs on our ecommerce product pages that are categorized in multiple places. For example, our "Blue Widget" product may have the following URLs:
http://www.oursite.com/item3332/blue-widget
http://www.oursite.com/item3332/blue-widget?breadCrumb=BrandTree
http://www.oursite.com/item3332/blue-widget?breadCrumb=CategoryTree1
http://www.oursite.com/item3332/blue-widget?breadCrumb=CategoryTree2
We use a canonical tag pointing back to the base product URL, and the parameter only changes the breadcrumbs. Which of the following settings, if any, would you recommend for such a parameter in GWT?
Does this parameter change page content seen by the user? Options: Yes/No
How does this parameter affect page content? Options: Narrows/Specifies/Other
Currently, Google has automatically assigned the parameter as "Yes/Other/Let Googlebot Decide" without notifying us, and we noticed a drop in rankings around the suspected time of the assignment. Lastly, we have a consistent flow of discontinued products that we 404. As a result of the breadcrumb parameter, our 404s increase significantly (one for each path). Would 800 404 crawl errors out of 18k products cause a penalty on a young site? We got an "Increase in '404' pages" email from GWT shortly after our rankings seemed to drop. Thank you for any advice or suggestions! Doug
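For reference, the canonical setup described above amounts to a single tag on every breadCrumb variant, pointing at the base product URL (URL taken from the example in the question):

```html
<!-- In the <head> of blue-widget?breadCrumb=BrandTree,
     ?breadCrumb=CategoryTree1, and every other parameter variant -->
<link rel="canonical" href="http://www.oursite.com/item3332/blue-widget" />
```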
How to Set Up Canonical Tags to Eliminate Duplicate Content Error
Intermediate & Advanced SEO | | Kingalan1
Google Webmaster Tools, under HTML Improvements, is showing duplicate meta descriptions for 2 similar pages. The 2 pages are for a building address; the URL has several pages because there are multiple property listings for this building. The URLs in question are:
www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan/page/3
www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan
How do I correct this error using canonical tags? Do I enter the URL of the 1st page under "Canonical URL" under "Advanced" to show Google that these pages are one and the same? If so, do I enter the entire URL into this field (www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan) or an abbreviated version (/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan)? Please see attached images. Thanks!! Alan
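On the full-versus-abbreviated point: canonical URLs are generally given as absolute URLs, not relative paths. A sketch for the paginated variant (protocol assumed, since the question omits it):

```html
<!-- In the <head> of the .../page/3 variant -->
<link rel="canonical"
      href="http://www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan" />
```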
Our client's web property recently switched over to secure pages (https); however, their non-secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Intermediate & Advanced SEO | | RosemaryB
Our client recently switched over to https via a new SSL certificate. They have also implemented rel canonicals for most of their internal webpages (pointing to the https versions). However, many of their non-secure webpages are still being indexed by Google. We have access to their GWMT profiles for both the secure and non-secure pages.
Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure whether that will happen. We thought about requesting in GWMT that Google remove the non-secure pages, but that felt pretty drastic. Any recommendations would be much appreciated.
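If the site runs on Apache, the site-wide 301 mentioned above is often handled with a rewrite rule along these lines (a sketch; assumes mod_rewrite is enabled and no conflicting rules exist):

```apache
# Redirect every http request to its https equivalent with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```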
If we remove all of the content for a branch office in one city from a web site, will it harm rankings for the other branches?
Intermediate & Advanced SEO | | monkeeboy
We have a client with a large, multi-city home services business. The service offerings vary from city to city, so each branch has its own section on a fairly large (~6,000 pages) web site. Each branch drives a significant amount of revenue from organic searches specific to its geographic location (e.g., Houston plumbers or Fort Worth landscaping).
Recently, one of the larger branches has decided that it wants its own web site on a new domain because an SEO firm has convinced them they can get better results with a standalone site. That branch wants us to remove all of its content (700-800 pages) from the current site and has said we can 301 all inbound links to the removed content to other pages on the existing site to mitigate any loss of domain authority. The other branch managers want to know if removing this city-specific content could negatively impact search rankings for their cities. On the surface it seems like as long as we have proper redirects in place, the other branches should be okay. Am I missing something?
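If the branch content does come down, the 301s could be expressed as one rule per retired page plus a catch-all (Apache sketch; all paths below are hypothetical, since the question doesn't give real URLs):

```apache
# Map each removed branch page to the closest surviving page
Redirect 301 /houston/plumbing /services/plumbing
Redirect 301 /houston/landscaping /services/landscaping
# Catch anything else under the removed branch section
RedirectMatch 301 ^/houston/ /locations/
```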
Would spiders successfully crawl a page with two distinct sets of content?
Intermediate & Advanced SEO | | ClayPotCreative
Hello all, and thank you in advance for the help. I have a coffee company that sells both retail and wholesale products. These are typically the same products, just at different prices. We are planning a pop-up to help users self-identify on their first visit, asking whether they are retail or wholesale clients. If someone clicks retail, a cookie will show them retail pricing throughout the site, and vice versa for those who identify themselves as wholesale. I can talk to our programmer to find out how he actually plans on doing this from a technical standpoint if it would be of assistance. My question is: how will a spider crawl this site? I am assuming (probably incorrectly) that whatever the "default" selection is (for example, right now people see retail pricing and then opt into wholesale) will be the information/pricing that gets indexed. So, long story short, how would a spider crawl a page that has two distinct sets of pricing information displayed based on user self-identification? Thanks again!
Web Developer, Web Designer, SEO Person... what do I need?
Intermediate & Advanced SEO | | CapitolShine
I was short on funds for my company when I set up my website, so I used a company that was just getting into website design and charged very little. Now that I'm starting to pursue different forms of SEO, it has come to my attention (through SEOmoz reporting) that my website wasn't coded correctly on the backend. For instance, the website sometimes uses a trailing slash in the URL and sometimes it doesn't, which creates redirects and a duplicate content issue. I need to find someone to audit the backend of my website and reprogram it where there are problems, so my site is set up in the best way possible to be crawled by search engines. What type of firm do I talk to: a website designer or a web developer? Or an SEO person who can also code the backend of websites? My website is www.capitolshine.com
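As an illustration of the trailing-slash fix described above, a developer would typically pick one URL form and 301 the other. An Apache sketch that strips trailing slashes (assumes mod_rewrite is available; the directory check avoids breaking real folders):

```apache
RewriteEngine On
# Only rewrite if the request isn't a real directory
RewriteCond %{REQUEST_FILENAME} !-d
# 301 any URL ending in a slash to the slashless version
RewriteRule ^(.+)/$ /$1 [R=301,L]
```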
Strange affiliate links in web master tools
Intermediate & Advanced SEO | | SKE
Hi All, I have been looking in the Google Webmaster Tools account for a client, and from out of nowhere they suddenly have 42,000 links; last time I checked it was just over 4,000. It looks to me like some of our affiliates' links are showing. They are formatted so each goes via a php script that looks like accomtracking.php?estid=1234&ref=123. This link appears 3 times in the source code of the page that is linking to my client, but Google Webmaster Tools thinks it is linking 12,000 times. I assume this is a problem with the database the link points to, issuing a unique link for each visit or something similar, but I am not sure how to explain it to my client or how to correct it / stop it happening again. There are around 37,000 links like this to the site from just 6 referring sites, and the client has had an unnatural link warning, so I am anxious to get this fixed. Any help gratefully received. Sean
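Given the unnatural link warning, one common mitigation is to ask the affiliates to add rel="nofollow" to these tracking links so they pass no link equity (a sketch using the script name from the question; the domain and anchor text are made up):

```html
<a href="http://www.example-client.com/accomtracking.php?estid=1234&ref=123"
   rel="nofollow">Book accommodation</a>
```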
How do you prevent the mobile site from becoming a duplicate of the full browser site?
Intermediate & Advanced SEO | | pulseseo
We have a large site with 100k+ pages and need to create a mobile site that gets indexed by the mobile engines, but I am afraid Googlebot will consider these pages duplicates of the normal site pages. I know I can block it in robots.txt, but I still need it indexed for mobile search engines, and I think Google has a mobile crawler as well. Feel free to give me any other tips I should follow while optimizing the mobile version. Any help would be appreciated 🙂