I have a site that has 65 different versions of itself.
-
I've just started managing a site that serves over 50 different countries and the entire web enterprise is being flagged for duplicate content because there is so much of it. What's the best approach to stop this duplicate content, yet serve all of the countries we need to?
-
Yes sir, I agree it will be a "bit of an effort". Thank you both for some great guidance, and if anybody else has other solutions to these types of issues, I welcome your feedback as well.
-
It may be a bit of an effort, but is it possible for you to work your way through the pages and make the content, titles and descriptions unique so that they don't get flagged as duplicate content?
This has the added advantage of giving you a large number of pages targeted at your various keyphrases, whereas other approaches involving 301 redirects or rel="nofollow" reduce the duplicate content issue but also reduce the number of pages on which to target keyphrases. If the sites are spread across 50 countries, is there a local spin that can be put on the content, so that all the relevant terms are targeted in their regions but Google doesn't see 50+ versions of the same site?
-
Thank you, Flatiron.
Yes, the content is on different servers due to the different countries they serve, as well as the languages. The client's US site is what I am working to improve, and they currently have over 2,500 duplicate title tags and meta descriptions out there. Would modifying the robots.txt file to instruct the search engines to crawl only the one main site and ignore the others be the best solution? My train of thought goes back to a case at a previous company, where their product list pages were seen as duplicates because each of the "sort" parameters was being recognized as a separate URL by the search engines. We had to write an instruction to only crawl the first sorted results.
-
Hi Ken,
Is the content actually exactly the same but running on different domains? That will determine how to approach this issue. If all the content is the same, you can either utilize 301 redirects or rel=canonical tags to help the engines view the multiple sites as a single site and combine any link juice that's associated with each of the 50 sites. If the content isn't an exact duplicate, then it (or the page titles) is probably still extremely similar. In the long run I would recommend localizing your content, not only to help from an SEO perspective but also to improve the user experience and, hopefully, the conversion rates as well.
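As a sketch of the rel=canonical option mentioned above (example.com is a placeholder, not the client's actual domain), each duplicate page on the country sites would point at its preferred counterpart from its `<head>`:

```html
<!-- In the <head> of each duplicate country page; example.com is a placeholder -->
<link rel="canonical" href="https://www.example.com/some-page" />
```

A 301 redirect consolidates more forcefully but takes the country pages out of service entirely, so the canonical tag is the gentler option when those pages still need to be served to visitors.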
Related Questions
-
What is the best way to display user reviews in languages different from the page language? (e.g. English reviews on a page in Spanish)
For the user it would be useful to see these reviews, but I am concerned about a negative SEO impact. I would not want to invest in having them all translated by a human translator. Any suggestions?
-
Multi-regional site indexing problems
Hello there! I have a multi-regional site and am dealing with some indexing problems: Google has only indexed our US site. We have:
- set up hreflang tags
- set up specific subdirectories: https://www.website.com/ (the en-us site and our main site), https://www.website.com/en-gb, https://www.website.com/en-ca, https://www.website.com/fr-ca, https://www.website.com/fr-fr, https://www.website.com/es-es, and so on
- set up automatic geo-IP redirects (301 redirects)
- created a sitemap index and a separate sitemap for each regional site
- created a Google Webmaster Tools property for each country targeted
- created translations for each language, and added some canonicals to the US site where its English content is reused
The problem is that Google is not indexing our regional sites. I think the cause is that Google spiders the site with a US-based bot, so it is always 301-redirected to the US version. I have used Fetch as Google with some of our regional folders and asked for "Indexing requested for URL and linked pages", but I am still waiting. Any ideas? Should I change the 301s to 302s? I really don't know what to do. Thank you so much!
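For reference, a complete bidirectional hreflang set for the subdirectory structure described in this question would look something like the following (website.com is the question's placeholder; every regional version carries the same block, including a reference to itself):

```html
<!-- In the <head> of every regional page (identical block on all versions) -->
<link rel="alternate" hreflang="en-us" href="https://www.website.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.website.com/en-gb/" />
<link rel="alternate" hreflang="en-ca" href="https://www.website.com/en-ca/" />
<link rel="alternate" hreflang="fr-ca" href="https://www.website.com/fr-ca/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.website.com/fr-fr/" />
<link rel="alternate" hreflang="es-es" href="https://www.website.com/es-es/" />
<link rel="alternate" hreflang="x-default" href="https://www.website.com/" />
```

These annotations cannot take effect if the crawler is 301-redirected away from the regional URLs before it can fetch them, which is consistent with the symptom described.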
-
If domain mapping subfolders to TLDs, is each one perceived as a fully separate entity/site thereafter?
Hi. Am I right in thinking that once you have domain-mapped a country-specific subfolder to a country-specific TLD (for better local region targeting), Google perceives it as a completely separate entity, and it no longer shares any of the parent site's domain benefits (such as domain authority)? From that point on, would it require its own dedicated link building? All the best, Dan
-
Duplicate Content - International Sites - AirBNB
Good morning. Just a quick question: why does Airbnb not get penalised for duplicate content on its sites? For example, the following two URLs (and probably more for other countries) both rank appropriately in Google (UK and COM):
https://www.airbnb.co.uk/help/getting-started/how-to-travel
https://www.airbnb.com/help/getting-started/how-to-travel
There are no canonical tags, no alternates, etc. If I look at the following:
https://www.airbnb.co.uk/s/London--United-Kingdom
https://www.airbnb.com/s/London--United-Kingdom
they both have alternate tags pointing to the other language versions, which I would expect. However, they also both point to themselves as canonical. Would this not be duplicate content? Thanks for your insights, Shane
-
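The combination Shane observes, a self-referencing canonical plus hreflang alternates, is the pattern Google's documentation recommends for localized variants: the reciprocal hreflang tags declare the two pages to be alternates of one cluster, so the engine can pick the right version per market instead of treating them as duplicates. A rough sketch of what the .co.uk search page would declare under that pattern (the hreflang values here are illustrative assumptions, not Airbnb's actual markup):

```html
<!-- Sketch of the pattern on the .co.uk page; hreflang values are illustrative -->
<link rel="canonical" href="https://www.airbnb.co.uk/s/London--United-Kingdom" />
<link rel="alternate" hreflang="en-gb" href="https://www.airbnb.co.uk/s/London--United-Kingdom" />
<link rel="alternate" hreflang="en" href="https://www.airbnb.com/s/London--United-Kingdom" />
```

The .com page would carry the mirror image: its own self-referencing canonical, plus the same pair of alternates.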
If I redirect based on IP, will Google still crawl my international sites if I implement hreflang?
We are setting up several international sites. Ideally we wouldn't set up any redirects, but if we have to (for merchandising reasons etc.) I'd like to assess what the next best option would be. A secondary option could be to implement the redirects based on IP. However, Google then wouldn't be able to access the content for all the international sites (we're setting up six in total) and would only index the .com site. I'm wondering whether the hreflang annotations would still allow Google to find the international sites? If not, that's a lot of content we are not fully benefiting from. Another option could be to treat the Googlebot user agent differently, but this would probably be considered cloaking by the G-Man. If there are any other options, please let me know.
-
Redirect the main site to keyword-rich subfolder / specific page for SEO
Hi, I have two questions.
Question 1: is it worthwhile to redirect the main site to a keyword-rich subfolder / specific page for SEO? For example, my company's webpage is www.example.com. Would it make sense to 301-redirect the main site to www.example.com/service-one-in-certain-city? I am asking this as I have learned that it is important for SEO to have keywords in the URL, and I was thinking that we could do this and include the most important keywords in the subfolder / specific URL. What are the pros and cons of this? Should I create folders or pages just for the sake of keywords?
Question 2: most companies have their main URL shown as www.example.com when you access their domain. However, some multi-language sites show e.g. www.example.com/en or www.example.com/en/main when you type the domain into your web browser. I understand that it is common practice to use subdomains or folders to separate different language versions. My question is regarding subfolders: is it better to have only the subfolder shown (www.example.com/en), or should I also include the specific page's URL, with keywords, after the subfolder (www.example.com/en/main or www.example.com/en/service-one-in-certain-city)? I don't really understand why some companies show only the subfolder of a specific language page and some show the page's URL after the subfolder.
Thanks in advance, Sam
-
Site Spider/ Crawler/ Scraper Software
Short of coding up your own web crawler, does anyone know of, or have any experience with, a good bit of software to run through all the pages on a single domain (and potentially on linked domains one hop away)? This could be either server- or desktop-based. Useful capabilities would include:
- scraping (XPath parameters)
- clicks from the homepage (site architecture)
- HTTP headers
- multi-threading
- use of proxies
- a robots.txt compliance option
- CSV output
- anything else you can think of...
Perhaps an opportunity for an additional SEOmoz tool here, since they do it already! Cheers!
Note: I've had a look at Nutch (http://nutch.apache.org/), Heritrix (https://webarchive.jira.com/wiki/display/Heritrix/Heritrix), Scrapy (http://doc.scrapy.org/en/latest/intro/overview.html) and Mozenda (does scraping but doesn't appear extensible). Any experience / preferences with these or others?
-
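As a sketch of the "coding it up yourself" route the question hopes to avoid: the core of a single-domain crawler is resolving each page's links and keeping only the ones on the same host. A minimal, standard-library-only version of that step (the fetching, multi-threading and proxy handling listed above would sit on top of it):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collect same-domain links from one page's HTML."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.domain = urlparse(base_url).netloc
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base_url, href)       # resolve relative URLs
        if urlparse(absolute).netloc == self.domain:  # stay on one domain
            self.links.add(absolute)


def extract_links(base_url, html):
    """Return the sorted same-domain links found in `html`."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return sorted(parser.links)
```

A full crawler would feed the discovered links back into a queue (tracking visited URLs) and, for the robots.txt compliance item on the wish list, check each URL with `urllib.robotparser` before fetching.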
Subdomain hosted in a different country - what are the implications?
Hello,
We are looking at creating an eCommerce section for a website and we are just weighing up the options:
- Magento: host on our own server. Great, but it can often be very slow when hosted on a shared server.
- Shopify: a hosted solution, but the hosting is in the US while we are in the UK, and the shop will be hosted on a subdomain as a result.
- Build our own solution: time-consuming and costly.
Two issues have arisen from this situation. Is it worse for SEO to host your store in a different country, or to host it in your own country but have the store potentially run slower? I'm swaying towards the side of the argument that says give your users a good and fast experience instead of worrying about where you host the store, bearing in mind that the main website will be hosted in the UK anyway and it is just the subdomain that will be hosted in the US. Just wondered if anybody has had experience with this, or if I'm missing something? All feedback greatly appreciated! Thanks, Elias