How to handle international duplicate content?
-
Hi,
We have multiple international e-commerce websites. Usually our content is translated and doesn't overlap, but how do search engines react to duplicate content on different TLDs?
We have copied our Dutch (NL) store for Belgium (BE) and I'm wondering if we could be inflicting damage on ourselves...
Should I use hreflang annotations on every page? Are there other options so we can be sure our websites aren't conflicting? Are they conflicting at all?
Alex
-
Hi Alexander,
The hreflang link in the header is probably the best way to do it. As to whether it is impacting you, that depends to some degree on how much duplicate content there is. If you have set up both sites in GWT with separate sitemaps, you can keep an eye on how well each site is being indexed; a lot of unindexed pages on one or the other might indicate a problem. If in doubt, best practice would be to put in the hreflang annotations as you mention.
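If it helps, here is a minimal sketch of what those annotations might look like in the `<head>` of each page. The domains and path are hypothetical placeholders for the NL and BE stores; note that every alternate version of a page (including the page itself) should carry the same set of links, and the annotations must be reciprocal between the two sites:

```html
<!-- Placed in the <head> of both the NL and the BE version of the same page.
     example.nl / example.be and the path are hypothetical. -->
<link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/some-product/" />
<link rel="alternate" hreflang="nl-be" href="https://www.example.be/some-product/" />
<!-- x-default tells search engines which version to show users who match neither locale -->
<link rel="alternate" hreflang="x-default" href="https://www.example.nl/some-product/" />
```

The same annotations can alternatively go in each site's XML sitemap instead of the page headers, which some people find easier to maintain across two stores.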
Related Questions
-
Need suggestions: What is the best internal linking structure for our website to improve both SEO and UX?
Hi all, We have 3 different editions of our product, with 20 features in total. The 1st and 2nd editions come with 15 features each, of which 10 are common to both. The 3rd edition comes with all 20 features. What's the best way to interlink the pages and structure the navigation menu to highlight the 3 editions and their features? It would be much appreciated if someone could refer me to a website with such a structure. Thanks
Web Design | vtmoz
How to check if a website has duplicate content?
I've been working with websites for a couple of months, and I've always wondered whether there is a legitimate way to find out if a website has duplicate content. I've tried a couple of tools through Google, but nothing worked for me. It would be much appreciated if anyone can help. Thanks
Web Design | rajveer_singh
Should I be using shortcodes for my page content?
Hello, I have a question. Sorry if this has been answered before. Recently I decided to give my main website pages a little facelift. I wanted to make my testimonials prettier, and found a great plugin for testimonials which creates shortcodes. I love how it looks, but I just realised that when I use images in shortcodes, they are not picked up by search engines 😞 only the text is. Image search visibility is pretty important for me, and I'm not sure if I should stick with my plain design and upload images manually with all the alt and title tags, or if there is a way to adjust the shortcode so it exposes images to search engines. You can see an example here: https://a-fotografy.co.uk/maternity-photographer-edinburgh/ Let me know your thoughts, guys. Regards, Armands
Web Design | A_Fotografy
40 percent redundant content on landing pages with 60 percent unique information.
I have searched schema.org for tags to use for the redundant content on 25 unique local landing pages. The redundant content references our services and abilities on each page. Could anyone tell me how to retain this content while directing the search engines to disregard this portion of the landing page? We are a WordPress site; if there is a plugin, I would love to know which one might work, although I have not been able to find one that will protect us from duplicate content issues. Thank you in advance.
Web Design | seant119
Duplicate Titles for Large Lists
Our blog (www.cowleyweb.com/blog) has recently been given topic categories so we can make use of our old blogs. Otherwise, users would only see what's new and never look back (our blogs are organized by the month they were published), and all that hard work would kind of go to waste after a while. So we came up with a few topics (i.e. social media, internet marketing, etc.) and added those as tags to the blogs. Now users can click a topic and get a results page on our blog listing all the previously published blogs related to that topic. Sounds great. BUT it's hurting our SEO crawl report. If the list goes beyond one page of results, the 2nd and subsequent pages get dinged as "duplicate title" because they share the same title (i.e. "Social Media"). How can I fix this? I'm not the web designer, but something tells me a title that says "Page 2" or something would do the trick. We use Drupal, which is good for customization. I assume tons of bloggers and websites have dealt with this problem. Please help. I want to give the web guy some solutions. Thank you.
Web Design | JCunningham
Question re. crawlable textual content
I have a client who is struggling to fit crawlable textual content on their pages. I'm wondering if we can add a "Learn More..." feature that works as a mouse-over pop-up: when a page visitor runs their cursor over the link or button, a bubble pops up showing textual content about the page. Not knowing much about code, can text in this format be crawled by search engines and count as unique, relevant content? Thanks, Dino
Web Design | Dino64
Managing international sites
Hi all, I am trying to figure out the best way to manage our international sites. We have two locations, one in the UK and one in the USA. I currently use GeoIP to identify the location of the browser and redirect visitors, via a cookie, to index.php?country=uk or index.php?country=usa. Once the cookie is set I use a 301 redirect to send them to index.php, so that Google doesn't see each URL as duplicate content, which Webmaster Tools was complaining about. This has been working wonderfully for about a year. It means I have a single PHP language include file, and depending on the browser location I display $ or £, change the odd "ise" to "ize", etc. The problem I am starting to notice is that we are ranking better and better in the USA search results. I am guessing this is because the crawlers must be based in the USA. This is great, but my concern is that I am losing rank in the UK, which is currently where most of our business comes from... So I have done my research, and because I have a .net domain I will go for /uk/ and /us/ subfolders and create two separate Webmaster Tools sites, each set up to target its geographic location. Is this okay? http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192#2 HERE IS THE PROBLEM: I don't want to have to run two separate websites with two separate sets of copy. Also, I don't want to lose all the rank data on URLs like http://www.mysite.net/great-rank-result.html, which now becomes http://www.mysite.net/uk/great-rank-result.html. On top of this I will have two pages, the one just mentioned plus http://www.mysite.net/us/great-rank-result.html, which I presume would be seen as duplicate copy? (Y/n) Can I use rel canonical to overcome this? How can I do this without actually running two separate sets of pages?
Could you actually have one site in the root folder and just use the same GeoIP technology to do a smart mod_rewrite, adding either /uk/ or /us/ to the URL, thereby being able to create two Webmaster Tools accounts targeting each geographic location? Any advice is most welcome.
Web Design | Mediatomcat
Duplicate Content for index.html
In the Crawl Diagnostics Summary, it says that I have two pages with duplicate content, which are: www.mywebsite.com/ and www.mywebsite.com/index.html. I read in a Dreamweaver tutorial that you should name your home page "index.html" and then you can let www.mywebsite.com automatically direct the user to index.html. Is this a bug in SEOmoz's crawler, or is it a real problem with my site? Thank you, Dan
Web Design | superTallDan