Implementation advice on fighting international duplicate content
-
Hi All,
Let me start by explaining that I am aware of the rel="canonical" and **rel="alternate" hreflang="x"** tags, but I need advice on implementation.
The situation is that we have 5 sites with similar content. Out of these 5:
- 2 use the same URL structure and have no suffix
- 2 have a different URL structure with a .html suffix
- 1 has an entirely different URL structure with a .asp suffix
The sites are quite big, so it will take a lot of work to go through and add rel="alternate" hreflang="x" tags to every single page (as we know, the tag should be applied at page level, not site level).
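For context, the page-level markup we are adding to the `<head>` of each equivalent page looks roughly like this (the domains and locales below are placeholders, not our real sites):

```html
<!-- The same block, including a self-referencing line, goes on every
     equivalent page across all five sites. Placeholder domains/locales. -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/widgets/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/widgets/" />
<link rel="alternate" hreflang="de-de" href="http://www.example.de/widgets.html" />
<link rel="alternate" hreflang="fr-fr" href="http://www.example.fr/widgets.html" />
<link rel="alternate" hreflang="zh-cn" href="http://www.example.cn/widgets.asp" />
```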
4 of the 5 sites are managed by us and already have the tag implemented, which makes things easier, but the 5th is managed in Asia and we fear the amount of manual work required will put that team off implementing it. The site is due to launch at the end of the month, and we need to sort this issue out before it goes live so that we are not penalised for duplicate content.
Is there an easy way to go about this or is the only way a manual addition?
Has anyone had a similar experience?
Your advice will be greatly appreciated.
Many thanks,
Emeka.
-
Unfortunately yes, the process needs to be rerun with the tool each time.
-
Thanks Gianluca,
Have you had experience using the tool you mentioned? Presumably each time a new page is added to the site the tool would have to be run again?
I agree that an in-house solution would be best, but given the time limit we are open to ideas.
I appreciate your response.
Emeka.
-
When it comes to massive sites and hreflang annotations, the ideal solution is to implement hreflang using the sitemap.xml method.
It is explained here by Google: https://support.google.com/webmasters/answer/2620865?hl=en.
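In short, each `<url>` entry in the sitemap lists every language/country version of that page, including itself. A minimal sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.co.uk/widgets/</loc>
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/widgets/" />
    <xhtml:link rel="alternate" hreflang="en-us" href="http://www.example.com/widgets/" />
    <xhtml:link rel="alternate" hreflang="zh-cn" href="http://www.example.cn/widgets.asp" />
  </url>
  <!-- ...one <url> element per page, each repeating the full set of alternates -->
</urlset>
```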
A tool that makes it easier to implement hreflang in a sitemap file is the one The Mediaflow created:
http://www.themediaflow.com/tool_hreflang.php.
Right now, that is the only tool I know of for that kind of task, so you could also consider creating an in-house solution, if you have internal developers who can be dedicated to this.
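If you do go in-house, the core of such a tool is small. As a rough sketch (assuming you can export a CSV that maps each page across the five sites; the file name and column headers below are placeholders):

```python
import csv

# Rough sketch: url_mapping.csv has one row per page, with columns named after
# hreflang codes (e.g. "en-gb", "en-us", "zh-cn") holding the equivalent URL
# on each of the five sites. File name and column headers are placeholders.
with open("url_mapping.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"')
print('        xmlns:xhtml="http://www.w3.org/1999/xhtml">')
for row in rows:
    # One <url> entry per site version of the page...
    for url in row.values():
        if not url:
            continue  # this page has no equivalent on that site
        print("  <url>")
        print(f"    <loc>{url}</loc>")
        # ...and every entry repeats the full set of alternates.
        for lang, alt in row.items():
            if alt:
                print(f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{alt}" />')
        print("  </url>")
print("</urlset>")
```

Each site's equivalent URL becomes its own `<url>` entry, and every entry repeats the full set of alternates, which is what the sitemap method expects. Rerunning it after new pages are added is then just a re-export of the CSV.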
-
Related Questions
-
International SEO - How do I show correct SERP results in the UK and US?
Hi, Moz community. I hope you're all OK and keeping busy during this difficult period. I have a few questions about international SEO, specifically when it comes to ranking pages in the UK and the US simultaneously. We currently have 2 websites set up which are aimed at their respective countries: a '.com' and a '.com/us'. If anybody could help with the issues below, I would be very grateful. Thank you all.
Issues
- When looking in US Google search with a VPN, the title tag for our UK page appears in the SERP, e.g. I will see: UK [Product Name] | [Brand]
- When checking the Google cache, the UK page version also appears.
- This can cause a problem, especially when I am creating title tags and meta descriptions that are unique from the UK versions.
- However, when clicking through from the SERP link to the actual page, the US page appears as it should do. I find it very bizarre that you see the US page when you click through, but the UK version in the overall search results.
Current Set-Up
- Our UK and US page content is often very similar across our ".com" and ".com/us" websites, and our US pages are canonicalised to their UK page versions to remove potential penalisation.
- We have also added hreflang to our UK and US pages.
Query
How do I show our US SERP as opposed to the UK version in US Google search?
My Theories/Answers
- US page versions have to be completely unique, with content related to US search intent, and be indexed separately - therefore no longer canonicalised to the UK version.
- Ensure hreflang is enabled to point Google to the correct local page versions.
- Ensure local backlinks point to localised pages.
If anyone can help, it will be much appreciated. Many thanks all.
-
Will hreflang eliminate duplicate content issues for a corporate marketing site on 2 different domains?
Basically, I have 2 company websites running. The first resides on a .com and the second resides on a .co.uk domain. The content is simply localized for the UK audience, not necessarily 100% original for the UK. The main website is the .com website but we expanded into the UK, IE and AU markets. However, the .co.uk domain is targeting UK, IE and AU. I am using the hreflang tag for the pages. Will this prevent duplicate content issues? Or should I use 100% new content for the .co.uk website?
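For reference, the sort of page-level markup described here might look like this (placeholder domains; note that a single URL can validly serve as the alternate for several hreflang values):

```html
<!-- Placed on both the .com and .co.uk versions of the page.
     The .co.uk URL is reused for en-gb, en-ie and en-au. -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/services/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/services/" />
<link rel="alternate" hreflang="en-ie" href="http://www.example.co.uk/services/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.co.uk/services/" />
```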
-
Looking for SEO advice
Hi, I have a website that has been built with SEO in mind from the beginning, but it's failing to rank at all. We have checked everything and found nothing that would be negatively affecting its SEO. The link is http://www.fancydoorsedmonton.com/ Any insight on anything you may find would be greatly appreciated! Thanks!
-
Duplicate content, hijacked search console, crawl errors, ACCCK.
My company employed a national marketing company to create their site, which was obviously outsourced to the lowest bidder. It looks beautiful, but the installation includes a staging site that duplicates all of the content. I am not seeing these issues in Search Console, and have had no luck getting the staging site removed from the files. How much should I be banging the drum on this? We have hundreds of high-level crawl errors and over a thousand mid-level ones. Of course I was not around to manage the build, and I also do not have FTP access.
I'm also dealing with major Search Console issues. The account is proprietarily owned by a local SEO company and I cannot remove the owner, who is there by delegation. The site prefers the www version and does not report the same traffic for the non-www version. We also have something like 90,000 backlinks from 13 sites, and a shit ton of ghost spam. Help!
-
Choosing the best domain for international website section
Hi, I asked a question before but the answer didn't clarify things for me: https://moz.com/community/q/which-is-the-best-xx-or-com-xx-in-general-and-for-seo (I marked it as answered inadvertently). I want to know what you think about choosing ".es" or ".com.es" to create a section for Spain, redirecting to it from the homepage, and the same for Mexico. I think this is not so important for SEO, but I am not completely sure about other factors. Big brands differ: Toshiba uses http://www.toshiba.es and http://www.toshiba.com.mx, while Coca-Cola uses http://www.cocacola.es and http://www.coca-cola.com.mx. So ".es" for Spain and ".com.mx" for Mexico?
-
UK website to be duplicated onto 2 ccTLDs - is this duplicate content?
Hi, we have a client who wishes to have a site created and duplicated onto 3 servers hosted in three different countries: the United Kingdom, Australia and the USA, all of which will of course be in the English language. Long story short, the website will give the user 3 options on the homepage asking which "country site" they wish to view (I know I can detect the user's IP and auto-redirect, but this is not what they want). Once they choose an option it will direct them to the appropriate ccTLD. The client wants the same information to appear on all 3 sites, with some slight variations in the products available and in English/US spelling, but for the most part the sites will look the same, with the same content on each page. So my question is: will these 3 sites be seen as duplicates of each other even though they are hosted in different countries and are on ccTLDs? Are there any considerations I should pass on to the client with this approach? Many thanks for reading.
Kris
-
Does Google play fair? Are 'relevant content' and 'usability' enough?
It seems there are 2 opposing views, and as a newbie this is very confusing. One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly. The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well. Which is closer to the truth? No one wants a great website that won't rank because Google wasn't sophisticated enough to see that they weren't being unfair.
Here's an example to illustrate one related concern I have. I've read that Google doesn't like duplicated content, but here are 2 cases in which it is more 'relevant' and 'usable' to the user to have duplicate content. Say a website helps you find restaurants in a city. Restaurants may be listed by city region, and by type of restaurant. The home page may have links to 30 city regions; it may also have links for 20 types of restaurants. The user has a choice. Say the user chooses a region: the resulting new page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city: the resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region.
IOW, there may be a 'mega-menu' at the top of every page in the site which duplicates itself, but is very helpful: instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page. That's duplicate content in the form of a mega-menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site even though arguably it is the most relevant and usable approach for someone who may or may not have a specific region or restaurant type in mind. Thoughts?
-
International Site Geolocation Redirection (best way to redirect and allow Google bots to index sites)
I have a client that has an international website. The website currently has IP detection and redirects you to the subdomain for your country. They have currently only launched the Australian website and are not yet open to the rest of the world: https://au.domain.com/
Google is not indexing the Australian website or its pages; instead, I believe the bots are being blocked by the IP redirection every time they try to visit one of the Australian pages, so only the US 'coming soon' page is being properly indexed. I would therefore like to know the best way to implement a geolocation redirect without creating a splash page for selecting a location. User-friendliness is most important (so we don't want cookies etc.).
I have seen the great Whiteboard Friday video on Where to Host and How to Target, which makes sense, but it doesn't tell me the exact method for redirection, except at about 10:20 where it tells me what I'm doing is incorrect. I have also read a number of other posts on IP redirection, but none tell me the best method, and some use slightly different examples... I need US visitors to see the US coming soon page and Google to index the Australian website. I have seen a lot about JS redirects, IP redirects and .htaccess redirects, but unfortunately my technical knowledge of how these affect Google's bots doesn't really help.
Appreciate your answers. Cheers, Lincoln