Has hreflang been set up correctly?
-
Hi guys,
We recently set up US hreflang tags across a site. According to Deep Crawl, the tags seem to be working.
http://s3.postimg.org/5b8rzq9r6/screenshot_1494.jpg
However, when I test it out with http://flang.dejanseo.com.au/
I seem to be getting no language or region, and under notes it says "Reciprocal not found" (does anyone know what this means?)
http://s16.postimg.org/quiiaob1h/screenshot_1495.jpg
Any ideas?
Cheers
-
Hi Chris,
You have 2 issues here:
1. It doesn't make sense to put the US version on http://www.camilla.com.au/us/ - .com.au is an Australian domain extension and is therefore geo-targeted to Australia by default. If you want to have a US version, you'll have to put it on a generic domain extension (.com / .net / .org / etc., geo-targeted to the US in Search Console) or use the .us extension.
2. Hreflang works at the page level, not the domain level - on all pages of your site you put the hreflang pointing to the "US homepage". What you should do instead is put at least two hreflang tags on each page of your site (both the US & AU versions):
Example http://www.camilla.com.au/collection/my-wandering-heart-resort-15 & http://www.camilla.com.au/us/collection/my-wandering-heart-resort-15
On both pages put:
<link rel="alternate" href="http://www.camilla.com.au/collection/my-wandering-heart-resort-15" hreflang="en-au" />
<link rel="alternate" href="http://www.camilla.com.au/us/collection/my-wandering-heart-resort-15" hreflang="en-us" />
If you want to have one version as the default, then also add an x-default hreflang tag (if it's the US version you want as default, check http://googlewebmastercentral.blogspot.be/2013/04/x-default-hreflang-for-international-pages.html).
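As a rough sketch (assuming it's the US version you want served when no other language/region matches), the extra tag on both pages would look something like:
<link rel="alternate" href="http://www.camilla.com.au/us/collection/my-wandering-heart-resort-15" hreflang="x-default" />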
The message "Reciprocal not found" indicates that you only put an hreflang tag pointing to the US version, and that on the US version there is no hreflang link back to the AU version.
You can scan all the content on Moz about hreflang, but useful links are:
https://moz.com/blog/hreflang-behaviour-insights
http://www.aleydasolis.com/en/international-seo-tools/hreflang-tags-generator/
https://support.google.com/webmasters/answer/189077?hl=en
Dirk
-
Yes, there is an error. I used the tool described here:
https://moz.com/blog/open-source-library-tool-check-hreflang
And here is the result:
http://hreflang.ninja/check/?url=http%3A%2F%2Fwww.camilla.com.au%2Fus%2Fshop%2Fjust-in.html
But the alternate page there:
http://www.camilla.com.au/us/
doesn't link back to /us/shop/just-in.html, and this is the first problem. The second problem is that the /us/ page doesn't use en-us - I'm not sure why. And the third is the missing x-default hreflang.
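Just as a sketch (assuming the AU counterpart lives at /shop/just-in.html and that the US version is the one you want as the fallback default), both http://www.camilla.com.au/shop/just-in.html and http://www.camilla.com.au/us/shop/just-in.html would need something like:
<link rel="alternate" href="http://www.camilla.com.au/shop/just-in.html" hreflang="en-au" />
<link rel="alternate" href="http://www.camilla.com.au/us/shop/just-in.html" hreflang="en-us" />
<link rel="alternate" href="http://www.camilla.com.au/us/shop/just-in.html" hreflang="x-default" />
That would cover all three problems: the missing return link, the missing en-us annotation, and the missing x-default.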
In case of doubt please check both examples:
https://play.google.com/store/apps/details?id=com.google.earth&hl=en
https://play.google.com/store/apps/details?id=com.google.earth
and read everything on Moz about hreflang.
Related Questions
-
Robots.txt wildcards - the devs had a disagreement - which is correct?
Hi – the lead website developer was assuming that this wildcard: Disallow: /shirts/?* would block URLs including a ? within this directory, and all the subdirectories of this directory that included a “?” The second developer suggested that this wildcard would only block URLs featuring a ? that come immediately after /shirts/ - for example: /shirts?minprice=10&maxprice=20 BUT argued that this robots.txt directive would not block URLS featuring a ? in sub directories - e.g. /shirts/blue?mprice=100&maxp=20 So which of the developers is correct? Beyond that, I assumed that the ? should feature a * on each side of it – for example - /? - to work as intended above? Am I correct in assuming that?
Intermediate & Advanced SEO | McTaggart
-
Can you expedite the correction of erroneous information on the Knowledge Graph?
I have a client who is a major player in the continuing education vertical. They have recently noticed that the Google Knowledge Graph is displaying erroneous information whenever somebody searches for their brand name (no matter the location/IP address). The brand Knowledge Graph is pulling information from a permanently closed location. Our client has multiple locations. Why Google decided to pull data from this particular (closed) location is beyond us. Our client has reclaimed the permanently closed location's Google+ page and they are going to permanently delete it. However, we are wondering if there is any way to expedite the process of updating the Knowledge Graph. Is there any way to submit feedback to Google about the KG? Is there any way to request a Knowledge Graph correction? The erroneous "permanently closed" data is very embarrassing for our client.
Intermediate & Advanced SEO | RosemaryB
-
How should I setup schema.org for ecommerce site?
I understand how to do products; what I am more curious about is the organization schema. Is it worth it to set it up as an ecommerce business? I would have to set it up on the About Us page for the site - does it matter to Google that it is not located on the homepage?
Intermediate & Advanced SEO | EcommerceSite
-
Correct Schema Markup
Hello, I've been having some markup issues and need some help. I researched what would be the best Business type and used that markup on the website. But when I check it in the testing tool I get this: http://screencast.com/t/vtlfn2MNPKf It doesn't recognize the object type. Could this be an error with my website or the markup itself? Please advise. Thanks
Intermediate & Advanced SEO | Rank-and-Grow
-
Is a 301 Redirect and a Canonical Tag on Uppercase to Lowercase Pages Correct?
We have a medium size site that lost more than 50% of its traffic in July 2013, just before the Panda rollout. After working with an SEO agency, we were advised to clean up various items, one of them being that the 10k+ URLs were all mixed case (i.e. www.example.com/Blue-Widget). A 301 redirect was set up thereafter forcing all these URLs to go to a lowercase version (i.e. www.example.com/blue-widget). In addition, a canonical tag was placed on all of these pages in case any parameters or other characters were incorporated into a URL. I thought this was a good setup, but when running an SEO audit through a third-party tool, it shows me the massive number of 301 redirects. And now I wonder if there should only be a canonical without the redirect, or if it's okay to have tens of thousands of 301 redirects on the site. We have not recovered from the traffic loss yet and we are wondering if it's really more of a technical problem than a Google penalty. Guidance and advice from those experienced in the industry is appreciated.
Intermediate & Advanced SEO | ABK717
-
Is this a Correct Time to Use 302 Redirects?
Hi Mozzers! We are going through a rebranding process, and as of this morning we have 3 domains, all with identical content. For example (not real domain names): www.fantastic.com
www.fantasticfireworks.com
www.fireworks.com We are using 3 domains to ease the rebranding transition. We currently only want people to visit 'www.fantastic.com,' and if they visit the other 2 domains we want them redirected. Since we will be using these other domains eventually, should we use 302 redirects instead of 301s? The other domains are new and do not have any domain authority or sites linking in, so we do not need to worry about link juice. Does it really matter what type of redirect we use? Thanks!
Intermediate & Advanced SEO | Travis-W
-
Need to duplicate the index for Google in a way that's correct
Usually duplicated content is quick to fix, but I find myself in a little predicament: I have a network of career-oriented websites in several countries. The problem is that for each country we use a "master" site that aggregates all ads, working as a portal. The smaller niched sites have some of the same info as the "master" sites since it is relevant for that site. The "master" sites have naturally gained the index for the majority of these ads. So the main issue is how to maintain the ads on the master sites and still get the niched sites' content indexed in a way that doesn't break Google guidelines. I can of course fix this in various ways, ranging from iframes (no index though) to bullet listing and small adjustments to the headers and titles on the content on the niched sites, but it feels like I'm cheating if I go down that path. So the question is: has someone else stumbled upon a similar problem? If so, how did you fix it?
Intermediate & Advanced SEO | Gustav-Northclick
-
Any ideas for capturing keywords that your client rejects because they aren't politically correct?
Here's the scenario: you need to capture a search phrase that is very widely used in common search, but the term is considered antiquated, overly vernacular, insensitive or outright offensive within the client's industry. In this case, searchers overwhelmingly look for "nursing homes," but the term has too many negative connotations to the client's customers, so they won't use it on-page. Some obvious thoughts are to build IBLs or write an op-ed/blog series about why the term is offensive. Any other ideas?
Intermediate & Advanced SEO | Jeremy_FP