Unique content for international SEO?
-
Hi Guys,
We have an e-commerce store on a generic top-level domain with thousands of products in the US.
We are looking to expand to Australia, the UK, and Canada using subfolders.
We are going to implement hreflang tags.
I was told by our SEO agency that we need to make all the content on each page unique.
This should be fine for category/product listing pages, but they said we also need to make the content unique on product pages. If we have 1,000 products, that's 4,000 pages, which is a big job in terms of creating content.
Is this necessary? What is the correct way to approach this? Won't the hreflang tag be sufficient to prevent any duplicate content issues with product pages?
Cheers.
-
Hi There,
This is exactly what the hreflang tag is for: it tells Google that each subfolder and page targets a different country (and possibly language), and so is not duplicate content.
The SEO agency is not giving you the right information, so you should fire them!
If pages exist like this:
yourstore.com/au/category/product-1
yourstore.com/us/category/product-1
yourstore.com/ca/category/product-1
yourstore.com/uk/category/product-1
As long as every page has an hreflang tag pointing to itself and to the other versions, you will be absolutely fine: no duplication, no worries!
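To make "pointing to itself and the other versions" concrete, here is a minimal Python sketch that builds the full reciprocal set of hreflang link tags for one product page. The domain, locale codes, and subfolder layout are taken from the example above and are placeholders, not a definitive implementation:

```python
# Sketch: generate the reciprocal hreflang <link> set for one product page,
# assuming the subfolder layout from the example (yourstore.com is a placeholder).
LOCALES = {
    "en-us": "https://yourstore.com/us",
    "en-au": "https://yourstore.com/au",
    "en-ca": "https://yourstore.com/ca",
    "en-gb": "https://yourstore.com/uk",
}

def hreflang_links(path: str) -> list[str]:
    """Return the alternate links to place in the <head> of EVERY country version.

    The same complete set (self-reference included) goes on all four pages.
    """
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{base}{path}" />'
        for lang, base in LOCALES.items()
    ]

for tag in hreflang_links("/category/product-1"):
    print(tag)
```

Because every version carries the same complete set, each page references itself and all of its alternates, which is what Google requires for the annotations to be honored.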
Read more: https://support.google.com/webmasters/answer/189077?hl=en
I hope that helps,
Regards
Nigel
-
Hi.
It's common for websites to provide similar or identical content in different languages, or to different regions, under different URLs.
Google is okay with this as long as the users are in different countries. Your site will not be penalized when the translation is manual and accurate. Google still prefers unique content for each version, but it understands that producing it can be quite tough. Google clearly states that you don't need to hide such content by blocking Google from crawling it with robots.txt, for example.
The circumstances are entirely different if you're serving the same content to the same audience through two URLs. Imagine you've created yourbusiness.com and yourbusiness.com.au, where one targets the USA and the other targets Australia. Since both are in English, this can cause duplicate content. Luckily, it can be easily solved with an hreflang tag, which is widely supported by major search engines.
The hreflang tag protects international SEO campaigns from duplicate content problems. It's typically needed by businesses that target different languages or countries through subdomains, subfolders, or ccTLDs. The hreflang tag is also important if you serve multiple languages within one single targeted country.
How to implement it:
1. Handle language targeting first. List out the URLs that have equivalents in other languages or countries. Any stand-alone URL with no equivalent doesn't need the hreflang tag, so don't list it.
2. Now set up the tags. A general hreflang tag looks like this:
<link rel="alternate" hreflang="x" href="http://www.mysite.com/" />
Let's envision that the page in question is www.mysite.com/page2.html and you want to point to a Spanish version of it. You'll simply change it to:
<link rel="alternate" hreflang="es" href="http://es.mysite.com/page2.html" />
For a site that targets different countries in the same language, you'll use code like:
<link rel="alternate" hreflang="en-gb" href="http://en-gb.xyz.com/page.html" />
<link rel="alternate" hreflang="en-us" href="http://en-us.xyz.com/page.html" />
3. Please note that the hreflang tags belong in the <head> of the page, before the closing </head> tag, and that Google recommends each page include a self-referencing tag alongside its alternates. So both http://en-gb.xyz.com/page.html and http://en-us.xyz.com/page.html should carry the same pair of tags shown above.
Try this generator, it's very easy: https://www.aleydasolis.com/english/international-seo-tools/hreflang-tags-generator/
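One gotcha worth automating: hreflang annotations only count when they are reciprocal. If page A lists page B as an alternate but B does not link back to A, Google ignores that pair. A rough sketch of a reciprocity check follows; the page data is hypothetical, and a real version would fetch each URL and parse its <link> tags instead of using a hand-built dict:

```python
# Sketch: check that hreflang annotations are reciprocal across a set of pages.
# `pages` maps each URL to the set of alternate URLs it declares; in practice
# you would build this mapping by crawling each page and parsing its <link> tags.
def find_missing_return_links(pages: dict[str, set[str]]) -> list[tuple[str, str]]:
    """Return (source, target) pairs where `source` fails to link back to `target`."""
    errors = []
    for url, alternates in pages.items():
        for alt in alternates:
            # Self-references are fine; every other alternate must point back.
            if alt != url and url not in pages.get(alt, set()):
                errors.append((alt, url))
    return errors

pages = {
    "https://yourstore.com/us/p1": {"https://yourstore.com/uk/p1"},
    "https://yourstore.com/uk/p1": set(),  # forgot the return link
}
print(find_missing_return_links(pages))
```

Running this on the sample data flags the UK page for not linking back to the US page, which is exactly the kind of silent error that makes hreflang stop working.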