International Targeting | Language > 'fa-ir' - no return tags
-
I see this error in Search Console: International Targeting | Language > 'fa-ir' - no return tags ("URLs for your site and alternate URLs in 'fa-ir' that do not have return tags"), and the number of affected URLs is really increasing. I do not know what the problem is or what I have done wrong.
Originating URL: /abadan/%D8%A2%D8%A8%D8%A7%D8%AF%D8%A7%D9%86/browse/vehicles/?place=8,541&v01=0,1&saveLoc=1
Crawl date: 11/16/16
Alternate URL: http://divar.ir/ -
Usually these kinds of mistakes are caused by not using the canonical URL as the href in the hreflang annotation.
For instance, suppose the URL www.domain.com/product-a?color=red is canonicalized to www.domain.com/product-a, and its hreflang annotation points English-speaking users to www.domain.com/en/product-a?color=red, which is in turn canonicalized to its own parameter-free version (www.domain.com/en/product-a). Google will then report that it cannot find a return tag. Why? Because the canonical English page lists www.domain.com/en/product-a in its own hreflang annotations, not www.domain.com/en/product-a?color=red, so the two sets of annotations never point back at each other.
The same is true for the self-referential hreflang annotation: the URL in the href attribute must be the canonical URL.
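As a rough illustration of the fix (the domain.com URLs are hypothetical, following the example above), every href in the hreflang annotations should be a canonical URL, and the identical set should appear on each canonical page so the return tags match:

```html
<!-- On the canonical page www.domain.com/product-a -->
<link rel="canonical" href="https://www.domain.com/product-a" />
<!-- Every href below is a canonical URL, never a parametered variant -->
<link rel="alternate" hreflang="fa-ir" href="https://www.domain.com/product-a" />
<link rel="alternate" hreflang="en" href="https://www.domain.com/en/product-a" />

<!-- On the canonical page www.domain.com/en/product-a: the same set,
     so each alternate links back (the "return tag" Google looks for) -->
<link rel="canonical" href="https://www.domain.com/en/product-a" />
<link rel="alternate" hreflang="fa-ir" href="https://www.domain.com/product-a" />
<link rel="alternate" hreflang="en" href="https://www.domain.com/en/product-a" />
```

If any parametered URL (e.g. the ?place=… vehicle listings in the report above) declares an alternate, the alternate page will point back only to the parameter-free canonical, and the 'no return tags' count will keep growing.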
Related Questions
-
X Default on hreflang tags
Hi guys, I would like to clarify something about hreflang markups and most importantly, x-default. Sample URLs:
Intermediate & Advanced SEO | | geekyseotools
http://www.example.com/au/collection/dresses (Australia)
http://www.example.com/us/collection/dresses (United States)
http://www.example.com/uk/collection/dresses (United Kingdom)
Sample Markups:
Questions:
1. Can I use my AU page as x-default? I noticed that some x-default tags point to US pages. Note that my biggest market is AU, though.
2. If I do use the AU page as x-default and a user searches from China, does that mean Google will return my AU page?
3. Can you spot any issues with the markups I made? Anything I need to correct? Keen to hear from you! Cheers,
Chris
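A sketch for the question above, using the sample example.com URLs and assuming (as a business choice, not a requirement) that the AU page serves as the global default. x-default simply names the page shown when no listed locale matches, so pointing it at the AU page is valid, and a searcher from an unlisted country such as China would be eligible to receive it:

```html
<!-- The same block placed on all three pages (example.com URLs from the question) -->
<link rel="alternate" hreflang="en-au" href="http://www.example.com/au/collection/dresses" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/collection/dresses" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/collection/dresses" />
<!-- The AU page doubles as the fallback for every unmatched locale -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/au/collection/dresses" />
```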
Using hreflang for international pages - is this how you do it?
My client is trying to achieve a global presence in select countries and then track traffic from their international pages in Google Analytics. The content for the international pages is pretty much the same as for the USA pages, but the form and a few other details are different due to how product licensing has to be set up. I don't want to risk losing rankings for existing USA pages due to issues like duplicate content. What is the best way to approach this? This is my first foray into this, and I've been scanning the Moz topics, but a number of the conversations are going over my head, so suggestions will need to be pretty simple 🙂 Is it a case of adding hreflang code to each page and creating different URLs for tracking? For example:
Intermediate & Advanced SEO | | Caro-O
URL for USA: https://company.com/en-US/products/product-name/
URL for Canada: https://company.com/en-ca/products/product-name/
URL for German Language Content: https://company.com/de/products/product-name/
URL for rest of the world: https://company.com/en/products/product-name/
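A sketch of how the four URLs above could be annotated (the company.com paths come straight from the question; the language-region codes are assumptions inferred from them). The key detail, echoing the 'no return tags' discussion at the top of this page, is that the identical block must appear on every one of the four pages so each alternate links back:

```html
<!-- The same block on all four pages (URLs from the question above) -->
<link rel="alternate" hreflang="en-us" href="https://company.com/en-US/products/product-name/" />
<link rel="alternate" hreflang="en-ca" href="https://company.com/en-ca/products/product-name/" />
<link rel="alternate" hreflang="de" href="https://company.com/de/products/product-name/" />
<!-- Generic English for the rest of the world -->
<link rel="alternate" hreflang="en" href="https://company.com/en/products/product-name/" />
```

Because the pages are near-duplicates by design, the hreflang annotations (with each URL also self-canonicalized) tell Google which variant to show in which market, which addresses the duplicate-content worry without needing distinct tracking URLs.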
Does a >70-character title tag affect a page's ranking in search?
We are a publication that puts out hundreds of articles a month. We have 5,000+ medium-priority errors showing that our title element tags are too long. The title tag is structured like this: [Headline] | [Publication Name that is 23 characters]. However, since we are a publication, it's not practical for us to limit the length of our title tags to 70 characters or less, because doing so would make the titles of our content seem very unnatural. We also don't want to remove the branding, because we want it to go with the article when it's shared (and to appear when some titles are short enough to allow room in SERPs). I understand the reasons for limiting titles to 70 characters or less with regard to SERP friendliness: we try to keep key phrases at the front, people are more likely to click on a page if they know what it's about, etc. My question is, do the longer titles affect the page's ability to rank in search? To put it a different way, if we altered all 5,000+ of the title tags to fit within 70 characters, would the page authorities and our site's domain authority increase? I'd like to avoid needing to clean up 5,000 pages if the medium-priority errors aren't really hurting us. Any input is appreciated. Thanks!
Intermediate & Advanced SEO | | CatBrain1 -
International Domains for SEO
My company is international and we have websites for each country with Country Code Top Level Domains (ccTLD). I am in the US and I am seeing that other countries such as Costa Rica and Germany are ranking above us in search results. I thought Google automatically geo-targeted users by default and therefore I should only get .com or US results. Any idea why other countries would rank above our site?
Intermediate & Advanced SEO | | fastlaneus0 -
Problems with Squarespace Title Tags
Hi All, I'm having problems editing the title tags on individual pages in Squarespace. It seems the only way to do it is via the page title name. Here is an example: http://www.autismsees.com/research/. The page is called Research, so Squarespace makes that the meta title. The problem is I want to keep "Research" on the page but have the meta title be "Autism Spectrum Research". I've tried searching over the web, but no luck so far. Thanks for your help.
Intermediate & Advanced SEO | | PeterRota0 -
If I had an issue with a friendly URL module and lost all my rankings, will they return now that the issue is resolved, next time I'm crawled by Google?
I have 'Magic SEO URLs' installed on my Zen Cart site, except for some reason no one can explain why or how the files were disabled. So my static links went back to dynamic (index.php?**********) etc. The issue with the module was resolved, but in that time Google must have crawled my site and I lost all my rankings; I'm nowhere to be found in the top 50. Did this really cause such an extravagant SEO issue, as my web developers told me? Can I expect my rankings to return next time my site is crawled by Google?
Intermediate & Advanced SEO | | Pete790 -
How Long Before a URL is 'Too Long'
Hello Mozzers, Two of the sites I manage are currently in the process of merging into one site, and as a result, many of the URLs are changing. Nevertheless (and I've shared this with my team), I was under the impression that after a certain point, Google starts to discount the validity of URLs that are too long. With that, if I were to have a URL structured as follows, would it be considered 'too long' if I'm trying to get the content indexed highly within Google? Here's an example: yourdomain.com/content/content-directory/article, and in some cases it can go as deep as yourdomain.com/content/content-directory/organization/article. Albeit there is no current way for me to shorten these URLs, is there anything I can do to make sure the content residing on a similar path is still eligible to rank highly on Google? How would I go about achieving this?
Intermediate & Advanced SEO | | NiallSmith0 -
Large scale geo-targeting?
Hi there. We are an internet marketing agency and recently did a fair amount of work trying to optimise for a number of different locations. Although we are based in Preston (UK), we would like to attract clients from Manchester, Liverpool, etc. We created landing pages for each of the locations that we wanted to target and each of the services - so we had an SEO Manchester page and a Web Design Manchester page, for example. These were all written individually by a copywriter in order to avoid duplicate content. An example of one of the first of these pages is here: http://www.piranha-internet.co.uk/places/seo-blackpool.php We created a 'where we cover' page and used a clickable map rather than a huge long list of text links, which we felt would be spammy, to link through to these pages. You can see this page here: http://www.piranha-internet.co.uk/where-we-cover.php Initially we gained a great deal of success from this method - with the above Blackpool page ranking #7 for "SEO Blackpool" within a week. However, these results quickly disappeared and now we don't rank at all, though the pages remain in the index. I'm aware that we don't have many external links pointing to these pages, but this cannot explain why these pages don't rank at all, as some of the terms are relatively non-competitive. A number of our competitors rank for almost all of these terms, despite their pages being exact duplicates with simply the city/town name changed. Any ideas where we've gone wrong?
Intermediate & Advanced SEO | | Piranha_Solutions0