Hreflang implementation issue
-
We currently handle search for a global brand, www.example.com, which has a presence in many countries worldwide. To help Google understand that alternate versions of the website are available in other languages, we have used hreflang tags. There is also a mother website (www.example.com/global), which is given the "x-default" value in the hreflang annotations. For Malaysia as a geolocation, the mother website is ranking instead of the local website (www.example.com/my) for the majority of the products.
The code used to implement the hreflang tags on a product page is as follows:
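(The original snippet did not come through above. For comparison, a typical head-section implementation for the setup described would look roughly like this; the URLs and language codes are illustrative assumptions, not the site's actual markup:)

```html
<!-- Illustrative sketch only: URLs and language codes are assumed
     from the setup described in the question, not taken from the site. -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/global/product_name" />
<link rel="alternate" hreflang="en-my" href="http://www.example.com/my/product_name" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/product_name" />
```

Every language version of the page should carry the full set of annotations, including a self-referencing entry; missing return links are one of the most common reasons hreflang gets ignored.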
These hreflang tags are also present in the website's XML sitemap; a fragment of the entry is shown below:
<loc>http://www.example.com/my/product_name</loc>
<lastmod>2017-06-20</lastmod>
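(For comparison, a complete sitemap entry using the sitemap hreflang extension would look roughly like the sketch below. The URLs and language codes are illustrative assumptions; note that the xhtml namespace must be declared on the enclosing `<urlset>` element.)

```xml
<!-- Sketch only: a full <url> entry with hreflang annotations.
     Requires xmlns:xhtml="http://www.w3.org/1999/xhtml" on <urlset>. -->
<url>
  <loc>http://www.example.com/my/product_name</loc>
  <lastmod>2017-06-20</lastmod>
  <xhtml:link rel="alternate" hreflang="x-default"
              href="http://www.example.com/global/product_name"/>
  <xhtml:link rel="alternate" hreflang="en-my"
              href="http://www.example.com/my/product_name"/>
</url>
```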
Is this implementation of hreflang tags correct? The same implementation is used across all geolocations, yet the mother website outranks the local site only in the Malaysia market.
If the implementation is correct, what else could explain this ranking issue? All other SEO elements have been thoroughly checked and appear to be fine.
Related Questions
International Blog Structure & Hreflang Tags
Hi all, I'm running an international website across 5 regions using a correct hreflang setup. A problem I think I have is that my blog structure is not standardized and also uses hreflang tags for each blog article. This has naturally caused Google to index each of the pages across each region, meaning a massive number of pages are being crawled. I know hreflang solves issues with duplication penalties, but I have another question. If I have legacy blog articles that are considered low quality by Google, is that counting against my site once, or multiple times for each time the blog is replicated across each region? I'm not sure if hreflang is something that would tell Google this. For example, if I have these low-quality blog posts:

blog/en-us/low-quality-article-1
blog/en-gb/low-quality-article-1
blog/en-ca/low-quality-article-1

Do you think Google counts this as 3 low-quality articles, or just 1 if hreflang is correctly implemented? Any insights would be great, because I'm considering culling the international setup of the blog articles and using just /blog across each region.

Intermediate & Advanced SEO | MattBassos
Panda, rankings and other nonsense issues
Hello everyone, I have a problem here. My website has been hit by Panda several times in the past: the first time back in 2011 (the first Panda ever), then another couple of times since, and most recently in June 2016 (either Panda or Phantom, it's not clear yet). In other words, it looks like my website is very prone to "quality" updates by big G: http://www.virtualsheetmusic.com/

I am still trying to understand how to get rid of Panda-related issues once and for all after so many years of tweaking and cleaning my website of possible duplicate or thin content (301 redirects, noindexed pages, canonicals, etc.), and I have tried everything, believe me. You name it. We recovered several times, but once in a while we are still hit by that damn animal. It really looks like we are in the so-called "grey area" of Panda, where we are "randomly" hit by it once in a while.

Interestingly enough, some of our competitors live joyful lives at the top of the rankings without caring at all about Panda and such, and I can't make sense of it. Take for example this competitor of ours: http://8notes.com. They have a much smaller catalog than ours, worse quality of offered music, thousands of duplicate pages, ads everywhere, and yet... they are able to rank 1st on the 1st page of Google for most of our keywords. And by most, I mean 99.99% of them. Take for example "violin sheet music", "piano sheet music", "classical sheet music", "free sheet music", etc... they are always first.

As I said, they have a much smaller website than ours, with a much smaller offering, and their content quality is questionable (not curated by professional musicians, with sloppily done content as well as design), and yet they have over 480,000 pages indexed on Google, mostly duplicate pages. They don't care about canonicals to avoid duplicate content, 301s, noindex, robots tags, etc., nor about adding text or user reviews to avoid "thin content" penalties... they really don't care about any of that, and yet they rank 1st.

So, to all the experts out there, my question is: why is that? What's the sense or logic behind it? And please, don't tell me they have stronger domain authority, more linking root domains, etc., because given the duplicate and thin-content issues I see on that site, nothing can justify their positions in my opinion. Mostly, I can't find a reason why we are so heavily penalized by Panda and similar "quality" updates when they are released, whereas websites like 8notes.com rank 1st, making fun of the mighty Panda all year round. Thoughts?!

Intermediate & Advanced SEO | fablau
Website Redirection Issue
Hi all, I'd like to know whether there is a better way to do 301 redirection. My client's website, Online Plants, is built with OpenCart. Over time he has added nearly 10,000 products, and now he is cleaning them up (by grouping similar attributes under one product), which is the right way to do it. For example, Product A in different sizes (X, XL, XXL) previously had 3 product entries (A - X, A - XL, A - XXL); now he is moving all of them under one, and while doing so he is deleting the other two entries. Now, what's the best way to inform Google? Putting a manual 301 redirect in place for each and every product is impractical, as there are so many products. What's the best way to go ahead on this?

Intermediate & Advanced SEO | Verve-Innovation
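(If the deleted variant URLs follow a predictable pattern, one commonly used approach is a single pattern-based rule rather than thousands of one-off redirects. This is only a sketch: it assumes an Apache .htaccess setup, which is typical for OpenCart, and the URL slugs are invented for illustration.)

```apache
# Sketch only: the real OpenCart URL structure and product slugs are
# assumptions. One RedirectMatch covers every size variant that shares
# a predictable suffix, instead of one rule per product entry.
RedirectMatch 301 ^/product-a-(x|xl|xxl)$ /product-a
```

Where no such pattern exists, generating the list of redirect rules programmatically from a product export is the usual fallback.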
How to implement three languages with a subfolder /blog on a .com domain?
Hi, I'm setting up a blog for a client that has a .com domain. The client is targeting three languages with the subfolders /de, /nl and /en. We've also established that the blog should live in a subfolder rather than on a subdomain. My question: how should we implement the three languages? Should it be domain.com/blog/en or domain.com/en/blog, or would you maybe not use subfolders for language at all and let hreflang do the job?

Intermediate & Advanced SEO | dexport
Any solutions for implementing 301s instead of 302 redirects in SharePoint 2010?
We have an issue with Google indexing multiple versions of each page in our sitemap (www.upmc.com). We've tried using rel="canonical", but it appears that Googlebot is not honoring our canonicals. Specifically, any of the pages Google indexes that end without a file extension such as .aspx are 302 redirected to an .aspx page. For example, the following pages all respond with 302 redirects to http://www.upmc.com/services/pages/default.aspx:

http://www.upmc.com/services/
http://www.upmc.com/services
http://www.upmc.com/Services/
http://www.upmc.com/Services

Has anyone been able to correct this inherent issue with SharePoint so that the redirects are at least 301s?

Intermediate & Advanced SEO | Jessdyl
Silo This! Siloing issue with KW targets and multiple categories
I am having a difficult time determining how to silo the content for this website (Downpour). The issue is that, as I see it, there are several different top-level keyword targets to put at the top of the silos, but due to the nature of the products they fit in almost every one of the top-level categories. For instance, our main keyword term is "Audio Books" (and derivatives thereof), but we also want to target "Audiobook Downloads" and "Books on CD". Due to the nature of the products, almost every product would fit in all 3 categories.

It gets even worse when you consider normal book taxonomy. The normal breakdown would be audiobooks > Fiction (or Nonfiction). Each product also belongs to one of these categories, as well as to "Download", "CD", and "Audiobook". And still worse, our navigation menus link every page on the site back to all of these categories (except "Audiobooks", as we don't really have a landing page for that besides the home page, which is lacking in optimized content but is linked from every page on the site).

So, I am finding that siloing, or developing a cross-linking plan that makes sense, is very difficult. It's much easier at the lower levels, but at the top things become muddy. Throw in the idea that we may eventually get e-books as well, and it gets even muddier. I have some ideas of how to deal with some of this, such as putting the site navigation in an iframe, instituting basic breadcrumbs, and building landing pages, but I'm open to any advice or ideas that might help, especially with the top-level taxonomy structure. TIA!

Intermediate & Advanced SEO | DownPour
Any Issues with Changing a Page based on IP address?
We're building a site and are wondering: if we have one page that changes depending on where or how it is accessed, is that a good or bad idea? Thanks in advance!

Intermediate & Advanced SEO | nicole.healthline
How to resolve Duplicate Page Content issue for root domain & index.html?
SEOMoz returns a Duplicate Page Content error for a website's index page, with both domain.com and domain.com/index.html listed separately. We had a rewrite in the .htaccess file, but for some reason this had no impact, and we have since removed it. What's the best way (on an HTML website) to ensure all index.html URLs are automatically redirected to the root domain, so these aren't seen as two separate pages?

Intermediate & Advanced SEO | ContentWriterMicky
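(A commonly used .htaccess pattern for the index.html case described above, as a sketch: it assumes Apache with mod_rewrite enabled, which matches the .htaccess setup the question mentions.)

```apache
# Sketch: 301 any /index.html request (at any depth) to its directory root.
# Requires mod_rewrite; test on a staging copy before deploying.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^?\ ]*/)?index\.html[?\ ]
RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]
```

The condition on %{THE_REQUEST} matters: it restricts the redirect to direct client requests, so the server's internal DirectoryIndex mapping of / back to index.html doesn't create a redirect loop.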