Question about region codes and Hreflang?
-
A client (see example above) has accidentally placed region codes into the hreflang tags when the content is intended for all audiences that speak the language. So "fr-fr" should really just be "fr", since users matching "fr-be", "fr-ca", and "fr-ch" should all be getting to the French version of the website too. And there isn't a specific subdirectory for French speakers in Belgium or France or Switzerland, etc.
However, when looking at Google Analytics, these region codes don't seem to be stopping users from other regions from getting to the correct landing page. So a user from Belgium is still getting to https://www.example.com/fr/ despite the "fr-fr" in the hreflang.
So question: is it worth adjusting the hreflang to be non-region specific (from
-
Hreflang tags are essential for indicating to search engines the language and geographical targeting of your web pages. They help search engines serve the most relevant version of your content to users based on their language and location preferences.
Region codes, in this context, are two-letter country codes that you can use in conjunction with hreflang tags to specify the target audience for a particular page. These codes follow the ISO 3166-1 alpha-2 standard and are used to indicate the country or region to which your content is specifically tailored.
For example, if you have a web page with content in English but want to target users in the United States and the United Kingdom, you would use hreflang tags like this:
<link rel="alternate" hreflang="en-US" href="https://www.example.com/us/page" />
<link rel="alternate" hreflang="en-GB" href="https://www.example.com/uk/page" />
In this example:
en-US specifies that the page is intended for English-speaking users in the United States.
en-GB specifies that the page is intended for English-speaking users in the United Kingdom.
Hreflang tags help search engines understand the intended audience for your content and improve the user experience by delivering the most relevant version of your page in search results. Remember to implement hreflang tags correctly and consistently across your international web pages to ensure that search engines accurately understand your targeting preferences and display the appropriate pages to users in different regions.
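For the situation in the original question, where one French page serves every French speaker and there are no country-specific subdirectories, a language-only annotation is the usual fix. A minimal sketch, using the hypothetical example.com URLs from the question:

```html
<!-- Language-only hreflang: one French page serves fr-FR, fr-BE, fr-CA, fr-CH, etc. -->
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
<!-- x-default catches users whose language/region matches none of the annotations -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/" />
```

The bare "fr" code matches any French-speaking user regardless of region, which is exactly the intent described above.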
-
Hello
I am facing a problem: my website's DA is very low. Can someone help me with how I can increase my website's DA? -
Thanks for the great guide.
-
I would adjust it, personally, if for no other reason than that anyone who analyses the site later on should be able to get a good idea of your strategy, even if that's you coming back to the site later and re-crawling it. Leaving it as it is will inevitably cause strategic confusion down the line. Just set it to your original vision, and keep all the signals under your control pointing in a single direction.
Related Questions
-
Text to Code Ratio & SEO
Hi, has anyone had experience of updating their text-to-code ratio if it's too high, and whether this has much impact on SEO performance? I am trying to prioritise tasks and wondered if this is something which should be higher on my list. Thank you 🙂
Intermediate & Advanced SEO | BeckyKey
Canonicals, Social Signals and a Multi-Regional Website
Hi all, I have a website that is set up to target different countries using subfolders, for example /aus/, /us/ and /nz/. The homepage itself is just a landing page that redirects users to whichever country folder they belong to; for example, somebody who accesses https://domain/ will be redirected to one of the country-specific subfolders. The default subfolder is /us/, so all users will be redirected to it if their country has not been set up on the website. The content is mostly the same on each country site, apart from localisation and, in some cases, content specific to that country.
I have set up each country subfolder as a separate site in Search Console, targeted /aus/ to AU users and /nz/ to NZ users, and left the /us/ version untargeted to any specific geographical region. In addition, I've set up hreflang tags for each page on the site which link to the same content in the other country subfolders: I've targeted /aus/ and /nz/ to en-au and en-nz respectively, and targeted /us/ to en-us and x-default, as per various articles around the web.
We generally advertise our links without a country-code prefix, and the system will automatically redirect the user to the correct country when they hit that URL. For example, somebody accesses https://domain/blog/my-post/ and a 302 will be issued for https://domain/aus/blog/my-post/ or https://domain/us/blog/my-post/, etc. The country-less links are advertised on Facebook and in all our marketing campaigns.
Overall, I feel our website is ranking quite poorly, and I'm wondering if poor social signals are a part of it. We have a decent social following on Facebook (65k) and post regular blog posts to our Facebook page that tend to pique quite a bit of interest. I would have expected that this would contribute to our ranking at least somewhat. I am wondering whether the country-less link we advertise on Facebook would be causing Googlebot to ignore it as a social signal for the country-specific pages on our website.
For example, Googlebot indexes https://domain/us/blog/my-post/ and looks for social signals for https://domain/us/blog/my-post/ specifically; however, it doesn't pick up anything because the campaign URL we use is https://domain/blog/my-post/. If that is the case, I am wondering how I would fix that so that /us/blog/my-post/, /aus/blog/my-post/ and /nz/blog/my-post/ receive the appropriate social signals. I am wondering if changing the canonical URL of each page to the country-less URL would improve my social signals and performance in the search engines overall. I would be interested to hear your feedback. Thanks
Intermediate & Advanced SEO | destinyrescue
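The canonical idea floated in that question could be sketched as follows, using the hypothetical paths from the question. Note this is only a sketch of the idea being discussed, not a recommendation: canonicalising to a URL that itself 302-redirects is generally discouraged, and hreflang annotations are normally expected to point at canonical URLs.

```html
<!-- On /us/blog/my-post/, /aus/blog/my-post/ and /nz/blog/my-post/ alike -->
<link rel="canonical" href="https://domain/blog/my-post/" />
<!-- hreflang annotations still point at the country-specific versions -->
<link rel="alternate" hreflang="en-us" href="https://domain/us/blog/my-post/" />
<link rel="alternate" hreflang="en-au" href="https://domain/aus/blog/my-post/" />
<link rel="alternate" hreflang="en-nz" href="https://domain/nz/blog/my-post/" />
```

An alternative that avoids the conflict would be making the country-less URL resolve (rather than redirect) and canonicalising the country versions to it without hreflang, but which suits the setup depends on details not given in the question.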
Content Audit Questions
Hi Mozzers, Having worked on my company's site for a couple of months now correcting many issues, I'm now ready to begin a content review. Many areas of the site contain duplicate content, the main causes being:
1. Category page duplication, e.g.:
Widget page contains "Blue Widget Extract"
Widget page contains "Red Widget Extract"
Blue Widget page contains the same "Blue Widget Extract"
Red Widget page contains the same "Red Widget Extract"
2. Product descriptions:
Item 1 (identical to Item 2 with the exception of a few words and technical specs)
Item 2
This is causing almost all the content on the site to get devalued. While I've cleared all Moz errors and warnings, I'm certain this is causing devaluation of most of the website. I was hoping you could answer these questions so I know what to expect once I have made the changes:
Will the pages that had duplicate content recover once they possess unique content, or should I expect a hard and slow climb back?
The website has never received any warnings from Google; does this mean recovery from issues like duplicate content will be quicker?
Several pages rank on page 1 for fairly competitive keywords despite having duplicate content and keyword-spammy content. What are the chances of shooting myself in the foot by editing this content?
I know I will have to wait for Google to crawl the pages before I see any reflection of the changes, but how long after Google has crawled a page should I get a realistic idea of how positive the changes were?
As always, thanks for your time!
Intermediate & Advanced SEO | ATP
Code Monitor Recommendations
Hi all, I was wondering if you have any recommendations for a code monitor? We'd like to keep track of any code and content changes on a couple of websites. We've taken a look at Page Monitor: https://chrome.google.com/webstore/detail/page-monitor/pemhgklkefakciniebenbfclihhmmfcd?hl=en but I'm not sure if it tracks code changes? Any suggestions for free or paid tools would be appreciated. Edit: We'd also like to avoid a tool that requires any tracking code changes or anything that involves a database/FTP connection.
Intermediate & Advanced SEO | ecommercebc
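The core of a DIY page monitor like the one asked about above is small: fingerprint the page source and compare between runs. A minimal sketch, with hypothetical names; a real monitor would fetch the URL on a schedule (e.g. with urllib) and persist the fingerprint somewhere:

```python
# Sketch of a code-change monitor: hash the page source and compare
# the hash between runs. Any difference in the HTML changes the hash.
import hashlib

def fingerprint(html: str) -> str:
    """Return a stable fingerprint of a page's source code."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def has_changed(previous_fingerprint: str, current_html: str) -> bool:
    """True if the page source differs from the stored fingerprint."""
    return fingerprint(current_html) != previous_fingerprint

old = fingerprint("<html><body>v1</body></html>")
print(has_changed(old, "<html><body>v1</body></html>"))  # False
print(has_changed(old, "<html><body>v2</body></html>"))  # True
```

This flags any byte-level change, including markup; to track only visible content changes you would strip tags before hashing, and to meet the no-tracking-code requirement above, everything runs externally against the live URL.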
To merge or not to merge? That is the question.
I am planning to do something I have never done, and I am wondering if it's really a good idea or not. I have four websites, all for the same company, each with a different domain and different content:
one has been the main official site for 16 years: 200 uniques per month, indexed for 134 keywords, Domain Authority 17, 13 linking root domains
one was used as the main site from 2003 to 2006 and is focused on a specific business they have since discontinued; still online, no updates since 2006: 500 uniques per month, indexed for 92 keywords, Domain Authority 13, 8 linking root domains
another was built in 2010 and maintained for less than a year, focused on a business they never really started; still online, no updates since 2010: 3000 uniques per month, indexed for 557 keywords, Domain Authority 25, 84 linking root domains
a fourth was also built in 2010 and focused on a business never really started; still online, no updates since 2010: 100 uniques per month, indexed for 4 keywords, Domain Authority 6, 3 linking root domains
Each website has traffic and links, all the links being natural; they never tried to gain links in any way, never did on-page optimisation, never even thought about SEO. The sites are not even interlinked.
So, my idea is to merge all of them, putting websites 2, 3 and 4 into subfolders of the main site and replicating the old content there, because those sites have traffic: incredibly, one of the abandoned sites gets 3000 uniques per month, while the main site gets just 200!
My doubts are: does it make sense to merge everything from an SEO perspective? Apart from doing the 301s correctly, what else should I be careful to do or not to do? Website number 4 is really outdated, its content and structure are not easy to merge with the rest, and its traffic is really small; is it worth spending the time to merge it?
Finally, I also have a problem: the client didn't want to merge them. They have agreed to, but they don't want visitors of the main site to be able to navigate to the old ones, so once moved and redirected I would have to put the old pages in the main site's sitemap but avoid linking to them on the actual "main" site. As far as I know, Google's crawler doesn't like to find pages in sitemaps which are not reachable through a linking path on the website; is that correct? Is that going to make all the merging work useless? Should I convince the client to at least put small links in the footer, or on a page linked from the footer?
Intermediate & Advanced SEO | max.favilli
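The mechanical part of a merge like the one described above is building a one-to-one 301 redirect map from each old-site URL to its new subfolder location, so each old page's signals flow to its exact replica rather than to the new homepage. A sketch of that mapping logic, with hypothetical domains and folder names:

```python
# Sketch: map old-domain URLs to their new subfolder homes on the main
# site when merging sites 2, 3 and 4 in as subfolders. Each old page
# should 301 to its replicated copy, not to the main site's homepage.
from urllib.parse import urlsplit

def merge_redirect(old_url: str, main_site: str, folder_for_domain: dict) -> str:
    """Return the new URL an old-site URL should 301 to."""
    parts = urlsplit(old_url)
    folder = folder_for_domain[parts.netloc]   # subfolder chosen for that old domain
    return f"{main_site}/{folder}{parts.path}"

mapping = {
    "old-business.example": "old-business",
    "discontinued.example": "discontinued",
}

print(merge_redirect("https://old-business.example/services/",
                     "https://main-site.example", mapping))
# https://main-site.example/old-business/services/
```

In practice this map would be exported as server redirect rules (e.g. .htaccess or nginx), but the page-to-page principle is the part that protects the merged sites' existing rankings.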
Images Returning 404 Error Codes. 301 Redirects?
We're working with a site that has gone through a lot of changes over the years - ownership, complete site redesigns, different platforms, etc. - and we are finding that there are both a lot of pages and individual images that are returning 404 error codes in the Moz crawls. We're doing 301 redirects for the pages, but what would the best course of action be for the images? The images obviously don't exist on the site anymore and are therefore returning the 404 error codes. Should we do a 301 redirect to another similar image that is on the site now or redirect the images to an actual page? Or is there another solution that I'm not considering (besides doing nothing)? We'll go through the site to make sure that there aren't any pages within the site that are still linking to those images, which is probably where the 404 errors are coming from. Based on feedback below it sounds like once we do that, leaving them alone is a good option.
Intermediate & Advanced SEO | garrettkite
Usage of HTTP Status Code 303
Hello, is there anybody who has experience with the 303 HTTP status code? Our software developers would like to use 303 "See Other" instead of 301 for redirecting old product links to the site root, instead of showing 404 errors. What is best practice for redirecting old product links which are gone, in an online-shop context? Best regards, Steffen
Intermediate & Advanced SEO | steffen_
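For context on the options weighed above: 303 "See Other" is defined for redirecting a client after a POST (e.g. a form submission) and carries no signal of permanence, while 301 signals a permanent move; redirecting every retired product to the site root also tends to be treated as a soft 404 rather than a genuine move. A sketch of the decision that is commonly recommended instead, with a hypothetical substitute path:

```python
# Sketch: picking a status for a retired product URL in a shop.
# 303 "See Other" is meant for POST responses, so it isn't used here;
# the substitute path is a hypothetical example.
def redirect_for_retired_product(has_close_substitute: bool) -> tuple:
    """301 to a close substitute page if one exists, otherwise 410 Gone.

    A blanket 301 of every retired product to the site root is often
    treated by search engines as a soft 404, so 410 (or 404) is the
    honest signal when no substitute exists.
    """
    if has_close_substitute:
        return (301, "/replacement-category/")
    return (410, "")

print(redirect_for_retired_product(True))   # (301, '/replacement-category/')
print(redirect_for_retired_product(False))  # (410, '')
```

The key point for the shop context above: the choice is between 301 (when a genuinely equivalent page exists) and 404/410 (when it doesn't), not between 301 and 303.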
$1,500 question
I have $1,500 to spend to promote an 8-year-old website. Almost no SEO work has been done on the site in the past 3-4 years. The site has a couple of hundred (around 300) external backlinks pointing to the homepage, and around 30 backlinks pointing to internal pages. It gets around 60% of its traffic from referring sites, 30% direct, and 10% from search engines. The homepage has PR 4. It ranks around 70th in Google for one of its main keywords. No keyword research has been done for the site. I'm looking for long-term benefits. What would be the best way, in your opinion, to spend this money?
Intermediate & Advanced SEO | _Z_