Duplicate Content: Almost the same site on different domains
-
Hi,
I own a couple of casting websites, and I'm currently launching "local" copies of them all over the world.
When I launch my website in a new country, the content is basically always the same, except the language sometimes changes from country to country.
The domains will vary, so the site name would be site.es for Spain, site.sg for Singapore, site.dk for Denmark, and so on.
The websites will also feature different jobs (castings) and different profiles on the search pages, BUT the more static pages share the same content (About Us, The Concept, FAQ, Create User, and so on).
So my questions are:
- Is this something that is bad for Google SEO?
- The sites are currently NOT linking to each other with language flags or anything. Should I do this, basically to tell Google that the business behind all these sites is fairly big?
- Is there a way to inform Google that these sites should NOT be treated as duplicate content? (A canonical tag won't do, since I want the "same" content to be listed on the local Google sites.)
I hope there are some experts here who can help.
/Kasper
-
Thanks a lot. I won't change anything, then.
-
It sounds like you have a few options. Since the content is different and NEEDS to be different in each country (except for the standard About Us, etc.), you are on the right track with geo-targeting. Using ccTLDs automatically tells Google and Bing that you are geo-targeting. All good there.
You should not need hreflang. Each site's main content is different, not just translated, so you're fine not marking that up.
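For reference, this is what hreflang annotations would look like if you did decide to mark up the country versions. The same set of tags would go in the head of every variant, each page listing all variants including itself. A minimal sketch, using the domains from the question; the language codes and homepage paths are assumptions:

```html
<!-- Placed in the <head> of each country homepage -->
<link rel="alternate" hreflang="es-es" href="https://site.es/" />
<link rel="alternate" hreflang="en-sg" href="https://site.sg/" />
<link rel="alternate" hreflang="da-dk" href="https://site.dk/" />
<!-- Fallback version shown to users in countries without a local site -->
<link rel="alternate" hreflang="x-default" href="https://site.dk/" />
```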
For the shared general content, I would recommend a canonical tag pointing to the original content. Those pages won't be very useful to you in the SERPs anyway, since they're mostly branded content, so you shouldn't worry about them appearing for each country. You could simply keep a copy of each page on each ccTLD, but those copies won't perform well. Again, these are not really important pages, so don't fret too much.
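To make that concrete, a canonical tag on a shared static page is a single link element in the head. A minimal sketch, assuming site.dk hosts the original copy and an /about path (both are hypothetical; the question only names the domains):

```html
<!-- In the <head> of https://site.sg/about and https://site.es/about -->
<!-- Tells Google which copy of the shared page to treat as the original -->
<link rel="canonical" href="https://site.dk/about" />
```

Note that a canonicalized page consolidates to the target URL, which is why this only makes sense for the branded pages you don't need ranking locally.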
-
Google has made it clear, time and time again, that if a web page is in a different language (i.e., it's translated), it is not considered duplicate content. So we recommend translating it into the appropriate language; it will (should) do just fine in Google and won't have any duplicate content issues.
If, however, more than one site has the same content in the same language (for example, two English sites targeting different countries), then your content will need to be unique. If it's not unique, we recommend using the canonical tag to specify which version Google should use. Using the canonical tag should be a last resort, though, as unique content is going to perform best.