Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
US and UK Websites of Same Business with Same Content
-
Hello Community,
I need your help to understand whether I can use the US website's content on my UK website.
US Website's domain: https://www.fortresssecuritystore.com
UK Website's domain: https://www.fortresssecuritystore.co.uk
Both websites have the same content on all pages, including testimonials/reviews.
I am trying to gain business through AdWords and organic SEO marketing.
Thanks.
-
Yup, but it doesn't matter. Hreflang works for this situation whether cross-domain or on a subdirectory/subdomain basis (and in fact it's even more effective cross-domain, as you also get the benefit of the geo-targeted ccTLD).
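To make that concrete, a cross-domain hreflang set for these two homepages would look something like the following, placed identically in the `<head>` of both pages (the choice of the .com as x-default here is just an assumption for illustration):

```html
<!-- Identical reciprocal set on both the .com and .co.uk homepages -->
<link rel="alternate" hreflang="en-us" href="https://www.fortresssecuritystore.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.fortresssecuritystore.co.uk/" />
<link rel="alternate" hreflang="x-default" href="https://www.fortresssecuritystore.com/" />
```

Each pair of equivalent pages across the two sites needs its own reciprocal set; if the annotations don't point back at each other from both sides, Google ignores them.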
P.
-
Hi Paul,
If I understood correctly, we are talking about two different websites, not a website with subdomains.
Hreflang can be used for other languages and countries, although not for masking 100% duplicated content, as I stated above.
site A: https://www.fortresssecuritystore.com
site B: https://www.fortresssecuritystore.co.uk
The recommendations Google gives are aimed at getting pages crawled and indexed, not at succeeding with 100% duplicate content, which makes for a poor UX, drives up the bounce rate, and ultimately drags down overall SEO.
Mª Verónica
-
Unfortunately, your information is incorrect, Veronica.
Hreflang is specifically designed for exactly this situation. As Google engineer Maile Ohye clearly states, one of the primary uses of hreflang markup is:
- Your content has small regional variations with **similar content in a single language**. For example, you might have English-language content targeted to the US, GB, and Ireland.
(https://support.google.com/webmasters/answer/189077?hl=en)
There's no question differentiating similar content in the same language for different regions/countries is more of a challenge than for totally different languages, but it can absolutely be done, and in fact is a very common requirement for tens of thousands of companies.
Paul
-
Hi CommercePundit,
Sadly, there is no "non-painful way to say it".
You cannot gain business from AdWords and organic SEO marketing with 100% duplicated content. The options, canonical and hreflang, would not work in this case.
The only option is language "localization", meaning having the whole content rewritten by a local writer.
Canonical can be used for up to 10% duplication, not for the whole 100%. Hreflang can be used for other languages and countries, although not for masking 100% duplicated content.
Sorry to tell the bad news. Good luck!
Mª Verónica
-
The more you can differentiate these two sites, the better they will each perform in their own specific markets, CP.
First requirement will be a careful, full implementation of hreflang tags for each site.
Next, you'll need to do what you can to regionalise the content - for example, changing to UK spelling for the UK site, making sure prices are quoted in pounds instead of dollars, and changing up the language to use British idioms and locations as examples where possible. It'll also be critical to work towards having reviews/testimonials from each site's own country rather than generic ones. This will help dramatically from a marketing standpoint and also help differentiate the sites for the search engines, so it's a double win.
And finally, you'll want to make certain you've set up each site in its own Google Search Console property and used geographic targeting for the .com site to specify the US as its target. (You won't need to target the UK site, as the .co.uk is already geo-targeted, so you won't get that option in GSC.) If you have an actual physical address/phone number in the UK, it would also help to set up a separate Google My Business profile for the UK branch.
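If editing the page templates to add hreflang link tags is impractical, the same annotations can instead be declared in an XML sitemap. A minimal sketch for the two homepages (the same pattern repeats for every equivalent page pair):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.fortresssecuritystore.com/</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://www.fortresssecuritystore.com/"/>
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="https://www.fortresssecuritystore.co.uk/"/>
  </url>
  <url>
    <loc>https://www.fortresssecuritystore.co.uk/</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://www.fortresssecuritystore.com/"/>
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="https://www.fortresssecuritystore.co.uk/"/>
  </url>
</urlset>
```

Each domain's sitemap lists its own URLs in `<loc>`, but cross-domain alternates in the `xhtml:link` entries are fine as long as both sites are verified in Search Console.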
Bottom line is - you'll need to put in significant work to differentiate the sites and provide as many signals as possible for which site is for which country in order to help the search engines understand which to return in search results.
Hope that all makes sense?
Paul
-
Hi!
Yep, you can target the UK market with the US site version. Just keep in mind that it's possible you might not perform as well there as in your main market (the US).
Also, before making any decisions and/or implementing, take a look at these articles:
Multi-regional and multilingual sites - Google Search Console
International checklist - Moz Blog
Using the correct hreflang tag - Moz Blog
Guide to international website expansion - Moz Blog
Tool for checking hreflang annotations - Moz Blog
Hope it helps.
Best Luck.
GR.