How to handle international duplicate content?
-
Hi,
We have multiple international e-commerce websites. Usually our content is translated and the sites don't interfere with each other, but how do search engines react to duplicate content on different TLDs?
We have copied our Dutch (NL) store for Belgium (BE), and I'm wondering if we could be inflicting damage on ourselves.
Should I use hreflang annotations for every page? Are there other options so we can be sure our websites aren't conflicting? Are they conflicting at all?
Alex
-
Hi Alexander,
The hreflang link in the header is probably the best way to do it. As to whether it is impacting you, that depends to some degree on how much duplicate content there is. If you have set up both sites in Google Webmaster Tools with separate sitemaps, you can keep an eye on how well both sites are being indexed; a lot of unindexed pages on one or the other might indicate a problem. If in doubt, best practice would be to put in the hreflang annotations, as you mention.
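For concreteness, here is a minimal sketch of generating those hreflang link tags for pages that exist in both stores. The domains and the nl-NL/nl-BE locale codes are illustrative assumptions, not taken from the question:

```python
# Hypothetical mapping of locale -> site root; replace with the real NL/BE domains.
SITES = {
    "nl-NL": "https://www.example.nl",
    "nl-BE": "https://www.example.be",
}

def hreflang_tags(path):
    """Return the <link rel="alternate" hreflang="..."> tags that should
    appear in the <head> of every localized version of `path`."""
    tags = []
    for locale, root in sorted(SITES.items()):
        tags.append(
            '<link rel="alternate" hreflang="{}" href="{}{}" />'.format(
                locale, root, path
            )
        )
    return tags

for tag in hreflang_tags("/shoes/"):
    print(tag)
```

Note that each localized page needs to list all alternates, including itself, and the annotations must be reciprocal: the NL page points at the BE page and vice versa, or search engines may ignore them.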
Related Questions
-
Bing Indexation and handling of X-ROBOTS tag or AngularJS
Hi Moz Community, I have been tearing my hair out trying to figure out why Bing won't index a test site we're running. We're in the midst of upgrading one of our sites from archaic technology and infrastructure to a fully responsive version.

The new site is fully AngularJS-driven. There are currently over 2 million pages, and as we develop the new site in the backend, we would like to test out the tech with Google and Bing. We're looking at a pre-render option to create static HTML snapshots of the pages we care about most, which will be available in the sitemap.xml.gz.

We set up three completely static HTML control pages: one with no robots meta tag on the page, one with a robots NOINDEX meta tag in the head section, and one with a dynamic header (X-Robots-Tag) carrying the NOINDEX directive. We expected the one without the meta tag to at least get indexed along with the homepage of the test site. In addition to those three control pages, we had three real pages: an internal search results page with the dynamic NOINDEX header, a listing page with no such header, and the homepage with no such header.

With Google, indexation was correct: only three pages were indexed, being the homepage, the listing page, and the control page without the meta tag. With Bing, however, there's nothing. No page indexed at all, not even the flat static HTML page without any robots directive. I have a valid sitemap.xml file and a robots.txt open to all engines across all pages, yet nothing. I used the Fetch as Bingbot tool, the SEO Analyzer tool, and the Page Preview tool within Bing Webmaster Tools, and they all show a preview of the requested pages, including the ones with the dynamic header asking Bing not to index them. I'm stumped. I don't know what to do next to understand whether Bing can accurately process dynamic headers or AngularJS content. Upon checking Bing Webmaster Tools, there's definitely been crawl activity, since it marked the XML sitemap as successful and shows 4 crawled pages. Still no result when running a site: command, though.

Google responded perfectly and understood exactly which pages to index and crawl. Has anyone else used dynamic headers or AngularJS and run similar tests? Thanks in advance for your assistance.
Web Design | AU-SEO
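The indexability logic the control pages are testing can be sketched as a small function: a page is excluded if a noindex directive appears in either the robots meta tag or the X-Robots-Tag response header. Function and parameter names here are illustrative, not from any crawler's actual code:

```python
def is_indexable(meta_robots=None, x_robots_header=None):
    """Return False if either the <meta name="robots"> content or the
    X-Robots-Tag header contains a noindex directive."""
    for directive in (meta_robots, x_robots_header):
        if directive and "noindex" in directive.lower():
            return False
    return True

# The three control pages described above:
print(is_indexable())                                     # no directive at all
print(is_indexable(meta_robots="noindex"))                # meta noindex
print(is_indexable(x_robots_header="noindex, nofollow"))  # header noindex
```

A crawler behaving correctly should index only the first case. Since Bing's tools render all the pages but index none of them, the robots directives are unlikely to be the cause; the crawl or render pipeline itself is the more probable suspect.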
Duplicate Content Issue: Mobile vs. Desktop View
Setting aside my personal issue with Google's favoritism for responsive websites, which I believe don't always provide the best user experience, I have a question regarding duplicate content. I created a section of a WordPress web page (using Visual Composer) that displays differently on mobile than on desktop. The section has the same content in both views but is formatted differently to give a better user experience on mobile devices. I did this by creating two different text elements, formatted differently but containing the same content. The problem is that both elements appear in the source code of the page. According to Google, does that mean I have duplicate content on this page?
Web Design | Dino64
Requirements for mobile menu design have created a duplicated menu in the text/cache view.
Hi, Upon checking the text-only cache view of our home page, I noticed the main menu has been duplicated. Please see: http://webcache.googleusercontent.com/search?q=cache:http://www.trinitypower.com&strip=1 Our coder tells me he created one version for the desktop and one for the mobile version. Duplicating the menu can't be good for on-page SEO. That said, I have had no warnings reported back from Moz; maybe the Moz bots are not tuned to look for this kind of duplication. Anyway, the coder created a different menu for mobile to support the design requirements: I did not like the look and feel of the responsive version based on the desktop menu. His solution to this problem is to convert the mobile menu into Ajax. What do you guys think? Thanks, Jarrett
Web Design | TrinityPower
Duplicate Content Home Page http Status Code Query
Hi All, We have just done a site-wide URL migration (from the old URL structure to the new one) and set up our 301s, but we have one issue where I don't know if it's a problem or not. We have one URL, www.domain.co.uk/ (with a trailing slash), which has been set up to 301 redirect back to www.domain.co.uk. However, when I check the server response code, it comes back as 200. So although it appears to visually redirect if I put the URL in the address bar, the status code says otherwise. Could this be seen as a potential duplicate home page, and if so, any idea how I could get around it if we can't solve the root cause? This is on a CakePHP framework. Thanks, Pete
Web Design | PeteC12
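A visible redirect that still returns 200 usually means the redirect is happening client-side (a meta refresh or JavaScript) rather than as a server-side 301. A sketch of the root-cause fix at the application layer, framework-agnostic rather than actual CakePHP code, with the host taken from the question:

```python
CANONICAL_HOST = "www.domain.co.uk"

def canonical_redirect(path):
    """Return (status_code, location) for a requested path: answer
    trailing-slash duplicates with a real 301 so the status code and
    the visible redirect agree."""
    if path != "/" and path.endswith("/"):
        return (301, "http://{}{}".format(CANONICAL_HOST, path.rstrip("/")))
    return (200, None)
```

You can confirm what the server is really sending with `curl -I http://www.domain.co.uk/` and checking the first status line; if it reads `200 OK` rather than `301 Moved Permanently`, search engines will treat both URLs as separate pages.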
Internal Linking
Hi, I have a site with a PR 5 index page; however, the level-2 pages only have a PR of 3. Is this a sign of a poor internal linking structure, or maybe the result of too many on-page links? I would appreciate any ideas you might have! Kyle
Web Design | kyleNeedham
Duplicate Content Problem on Our Site?
Hi, Having read the SEOmoz guide and already worried about this, I have decided to look further into it. Our site is 4-5 years old and was poorly built by a rogue firm, so we have to stick with what we have for now.

Where I think we might be getting punished is duplicate content across various pages. We have a Brands page (linked at the top of the page) where we enter each brand we stock along with a short write-up on that brand. Each write-up is then reused on that brand's item pages: when we click a brand name in the left nav bar, or when we click a product type (e.g. Footwear) and then a brand filter on the left. So this is, in theory, duplicate content. The SEO title and meta description for each brand are likewise used on the Brands page and on every page featuring that brand's products.

Because of this, the page www.designerboutique-online.com/all-clothing/armani-jeans/ has the same brand description in the scroll box at the top as www.designerboutique-online.com/shirts/armani-jeans/ and all the other product-type pages, plus the same SEO title and meta description. Only the products change between them. This then applies to each brand we have (at least 15) across about 8 pages: different URLs, but the same text.

I'm not sure how a 301 or rel=canonical would work here, as each URL needs to point at specific pages (e.g. shirts, shorts, etc.). Some brands, such as Creative Recreation and Cruyff, only sell footwear, so technically we could 301 to the footwear/ URL rather than having both the all-clothing and footwear paths. Surely this must come down to the bad design? Could we be losing valuable rank and juice because of this issue? And how would I go about fixing it?

I want a new site, but funds are tight. If this issue is so big that only a new site would fix it, then maybe the money would need to come forward. What do people make of this? Cheers, Will
Web Design | YNWA
How do I identify what is causing my Duplicate Page Content problem?
Hello, I'm trying to put my finger on what exactly is causing my duplicate page content problem. For example, SEOmoz is picking up these four pages as having the same content:
http://www.penncare.net/ambulancedivision/braunambulances/express.aspx
http://www.penncare.net/ambulancedivision/recentdeliveries/millcreekparamedicservice.aspx
http://www.penncare.net/ambulancedivision/recentdeliveries/monongaliaems.aspx
http://www.penncare.net/softwaredivision/emschartssoftware/emschartsvideos.aspx
As you can tell, they really aren't serving the same content in the body of the page. Does anybody have an idea what might be causing these pages to show up as duplicate page content? At first I thought it was the photo gallery module, but that only exists on two of the pages. Thanks in advance!
Web Design | BGroup
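When pages with different bodies get flagged as duplicates, the usual culprit is shared boilerplate (navigation, footers, sidebars) dominating a thin unique body. One rough way to check what a crawler may be comparing is to measure text overlap directly; this sketch uses word shingles and Jaccard similarity, with the shingle size being an arbitrary choice:

```python
def shingles(text, n=3):
    """Break extracted page text into overlapping n-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Running this on the full extracted text of two flagged pages (template included), versus just their body copy, would show whether the template is what's pushing the similarity score over the duplicate threshold.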
Real Estate and Duplicate Content
Currently we use an MLS feed, which is an iframe of property listings. We plan to pay an extra fee to get the crawlable version. But one problem is that many real estate firms have access to the same data, which makes our content a duplicate of theirs. Is there any way around this? Thanks
Web Design | SGMan