    The Moz Q&A Forum


    Moz Q&A is closed.

    After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

    Duplicate content on subdomains.

    Intermediate & Advanced SEO
    • HiteshBharucha
      HiteshBharucha last edited by

Hi Mozers,

I have a site, www.xyz.com, and also geo-targeted subdomains: www.uk.xyz.com, www.india.xyz.com, and so on. All the subdomains have the same content as the main domain, www.xyz.com.

So I want to know how I can avoid content duplication.

Many thanks!

      • Dr-Pete
        Dr-Pete Staff last edited by

        It would probably be better (and more likely to get you responses) if you started a new question - this one is three years old. Generally, I think it depends on your scope. If you need some kind of separation (corporate, legal, technical), then separate domains or sub-domains may make sense. They're also easier to target, in some ways. However, you're right that authority may be diluted and you'll need more marketing effort against each one.

If resources are limited and you don't need each country to be a fully separate entity, then you'll probably have fewer headaches with sub-folders. I'm speaking in broad generalities, though - this is a big decision that depends a lot on the details.

        • UpMedio_SEO
          UpMedio_SEO last edited by

          Dear all,

I have bought 30 geo top-level domains. This is for an ecommerce project that has not launched yet (and isn't indexed by Google).

I am now at a point where I can change/consolidate all the domains into subdomains or subfolders, or keep things as they are.

I just worry that link building would be scattered rather than focused, and that it might be better to concentrate the effort on one domain.

          What are your views on this?

          Many thanks.

          • Dr-Pete
            Dr-Pete Staff @HiteshBharucha last edited by

            Yeah - I'm really afraid that stacking all those sub-domains is going to cause you long-term issues with your link-building, and that some of those sub-domains could fragment. If the country needs to be in a sub-domain, then I think the hybrid approach (with "/shop" as a sub-folder) may cause you less trouble.

            I will warn, though, that any change like this carries some risk. You'll have to put proper 301-redirects in place.
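As a sketch only - using the hypothetical hostnames from this thread, and assuming Apache with mod_rewrite - a 301 of the kind described might look like:

```apache
# Permanently redirect the old shop subdomain into a /shop/
# subfolder on the country subdomain (hypothetical hostnames).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^shop\.uk\.xyz\.com$ [NC]
RewriteRule ^(.*)$ http://uk.xyz.com/shop/$1 [R=301,L]
```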

I might try the hreflang tags first, though, and see if they help the current problem (it may take a few weeks). Changing too many aspects of the on-page SEO at once could cause you a lot of grief.

            • HiteshBharucha
              HiteshBharucha @HiteshBharucha last edited by

The shop. pages are simply new pages added so products can be sold with ease. I think I might move the shop.uk.xyz.com pages to uk.xyz.com/shop/product, i.e. into a subfolder. Do you think this will help pass link juice to those pages after the change, and would it be easy for me to include them in the sitemap as well?

              • Dr-Pete
                Dr-Pete Staff @HiteshBharucha last edited by

                If you have separate GWT profiles, then I think the XML sitemap may have to be under the sub-domain - Google has to be able to access it from a sub-domain URL. It doesn't have to be in the root of the sub-domain.
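As a sketch, one conventional arrangement (hypothetical URLs) is for each subdomain to serve its own robots.txt at its root, referencing a sitemap hosted on that same subdomain:

```text
# http://uk.xyz.com/robots.txt (hypothetical)
User-agent: *
Sitemap: http://uk.xyz.com/sitemap.xml
```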

                I'm not clear on what the "shop." pages are, but stacking sub-domains like that sounds like it's getting pretty messy. Why the separation?

                • HiteshBharucha
                  HiteshBharucha @HiteshBharucha last edited by

I have already created separate profiles for the subdomains, but my only worry is where to place the sitemap on the server, e.g. in the root directory of the root domain or in the root directory of the subdomain.

Coming to (2): the pages I want to include in the sitemap are my product pages. So I want to know if shop.uk.xyz.com can be included in the sitemap for uk.xyz.com, and whether that counts as an internal page of uk.xyz.com.

                  • Dr-Pete
                    Dr-Pete Staff @HiteshBharucha last edited by

                    It is probably best to create separate profiles in Google Webmaster Tools, because then you can target the sub-domains to the countries in question. At that point, you could also set up separate sitemaps. It'll give you a cleaner view of how each sub-domain is indexed and ranking.

                    I'm not sure I understand (2) - why wouldn't you include those pages in the sitemap?

                    • HiteshBharucha
                      HiteshBharucha @Dr-Pete last edited by

Thank you for your inputs. It has really helped me understand the situation.

I will try to implement this and let you know how it goes. I also had a few more questions on this:

1. Do I require a separate sitemap and robots file for each of the subdomains, and where shall I place them on the server?

2. In the subdomain there are pages like shop.uk.xyz.com/product1. Can I include those in the sitemaps, as they are the pages I really want to rank for?

                      • Dr-Pete
                        Dr-Pete Staff last edited by

                        There's no perfect answer. Canonical tags would keep the sub-domains from ranking, in many cases. The cross-TLD stuff is weird, though - Google can, in some cases, ignore the canonical if they think that one sub-domain is more appropriate for the country/ccTLD the searcher is using.

                        Sub-domains can be tricky in and of themselves, unfortunately, because they sometimes fragment and don't pass link "juice" fully to the root domain. I generally still think sub-folders are better for cases like this, but obviously that would be a big change (and potentially risky).

                        You could try the rel="alternate" hreflang tags. They're similar to canonical (a bit weaker), but basically are designed to handle the same content in different languages and regions:

                        http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077

They're basically designed for exactly this problem. You can set the root domain to "en-US", the UK sub-domain to "en-GB", etc. I've heard generally good things, and they're low-risk, but you have to try it and see. They can be a little tricky to implement properly.
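A minimal sketch of that markup, using the hypothetical domains from this thread (note that Google's region code for the UK is "GB"):

```html
<!-- Placed in the <head> of every variant, each page listing all
     variants, including itself (hypothetical URLs) -->
<link rel="alternate" hreflang="en-us" href="http://www.xyz.com/product" />
<link rel="alternate" hreflang="en-gb" href="http://uk.xyz.com/product" />
<link rel="alternate" hreflang="en-in" href="http://india.xyz.com/product" />
```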

                        • DarinPirkey
                          DarinPirkey @HiteshBharucha last edited by

No, 301s and canonicals are completely different.

A 301 will redirect a page, while a canonical sets the preferred version of the page. For example:

301 - you have an old version of a page that looks like www.example.com/p?=153 and you want it to look like www.example.com/red-apples. You would use a 301 from the old page (www.example.com/p?=153) to the new page (www.example.com/red-apples).

Canonical - let's go back to the red-apples example. Say you have an ecommerce site with different ways to search for products: one is to search by fruit and the other by color. What you'll have is two versions of the end result, e.g. www.example.com/fruit/red-apples and www.example.com/red/red-apples. Since both of those pages show the same information, you don't want the engines to think it's duplicate content, so you can add a rel=canonical link element on both pages pointing to the preferred version of the two (e.g. you might want the canonical to be www.example.com/red-apples). That's all it does: it tells the engines your preferred version of pages that may be the same.
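As a sketch, the tag on both of those hypothetical pages would be:

```html
<!-- In the <head> of both /fruit/red-apples and /red/red-apples -->
<link rel="canonical" href="http://www.example.com/red-apples" />
```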

Back to your original post: you don't really need to "noindex", but I thought you were having a duplicate content issue and that would solve it. (Generally, Google won't penalize you for this sort of duplicate content.)

                          Here is what I would do.

If you don't have Google Webmaster Tools already set up, then do so. Verify each version of your subdomain (i.e. india.xyz.com, uk.xyz.com, etc. - let me know if you need help) and then set your geographic target for each of them manually. (You'll have to set this up manually because you have a gTLD and not a ccTLD.)

How to set your geographic target manually:

Go to a particular version of your site in WMT (i.e. india.xyz.com) and click on "Configuration", then "Settings". Under "Settings", the first section says "Geographical target". Check the box and then use the drop-down to select "India".

Repeat this for each subdomain and its specific country.

This will let Google know that you are trying to target users in a specific country.

If you have the money to invest in it, I would also try to have those subdomains hosted on a server in each particular country (a strong signal for Google).

Hope it helps.

                          • HiteshBharucha
                            HiteshBharucha @DarinPirkey last edited by

Thanks, Darin!

I have a few doubts on this:

1. Is rel=canonical like a 301 redirect? My concern is that if a user goes to www.uk.xyz.com/productx, will he be redirected to www.xyz.com/product?

2. My subdomain pages are ranking in the country-specific search engines. For example, www.uk.xyz.com is ranking for keywords in google.co.uk. So if I noindex, I will lose my search engine presence in the country-specific search engines.

PS: the content on the pages is all the same apart from the product currency.

                            • DarinPirkey
                              DarinPirkey @gmk1567 last edited by

I disagree. I said "noindex", not "nofollow". Link juice will be passed, but the pages won't show up in the SERPs. I do agree with you, though, that the strategy as a whole, if there is in fact exact/duplicate content, seems to be a waste. Unless these pages are in another language, I don't see the point of this subdomain strategy.

                              • gmk1567
                                gmk1567 last edited by

A canonical will help remove the duplicate issues and also consolidate your link value. I don't see any issue with cross-domain implementation.

If you add "noindex" to any of these pages, you won't get any link credit.

                                • DarinPirkey
                                  DarinPirkey last edited by

Short answer: set a canonical URL on the pages pointing to the root-domain version, and noindex the subdomain pages.

What this does is avoid the duplicate content problem. Generally, those subdomain pages won't rank anyway because the same information is on the "main" site. You can still build links to those subdomain pages and use a strong internal link structure to help the "main" site's rankings.

The only negative is that the pages on your subdomain won't rank. That's not necessarily a bad thing, but just know they won't. Then again, if the pages are truly duplicate content, they wouldn't rank anyway.



                                  Related Questions

                                  • GhillC

                                    Same site serving multiple countries and duplicated content

Hello! Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) about this topic, as there are a few specific variants each time: I have a site serving content (and products) to different countries, built using subfolders (one subfolder per country). Basically, it looks like this:
                                    site.com/us/
                                    site.com/gb/
                                    site.com/fr/
                                    site.com/it/
                                    etc. The first problem was fairly easy to solve:
                                    Avoid duplicated content issues across the board considering that both the ecommerce part of the site and the blog bit are being replicated for each subfolders in their own language. Correct me if I'm wrong but using our copywriters to translate the content and adding the right hreflang tags should do. But then comes the second problem: how to deal with duplicated content when it's written in the same language? E.g. /us/, /gb/, /au/ and so on.
                                    Given the following requirements/constraints, I can't see any positive resolution to this issue:
                                    1. Need for such structure to be maintained (it's not possible to consolidate same language within one single subfolders for example),
                                    2. Articles from one subfolder to another can't be canonicalized as it would mess up with our internal tracking tools,
                                    3. The amount of content being published prevents us to get bespoke content for each region of the world with the same spoken language. Given those constraints, I can't see a way to solve that out and it seems that I'm cursed to live with those duplicated content red flags right up my nose.
                                    Am I right or can you think about anything to sort that out? Many thanks,
                                    Ghill

                                    Intermediate & Advanced SEO | | GhillC
                                    0
                                  • nchlondon

                                    Directory with Duplicate content? what to do?

Moz keeps finding loads of pages with duplicate content on my website. The problem is it's a directory page for different locations. E.g. if we were a clothes shop, we would be listing our locations: www.sitename.com/locations/london, www.sitename.com/locations/rome, www.sitename.com/locations/germany. The content on these pages is all the same, except for an embedded Google map that shows the location of the place. The problem is that Google thinks all these pages are duplicated content. Should I set a canonical link on every single page saying that www.sitename.com/locations/london is the main page? I don't know if I can use canonical links, because the page content isn't identical due to the embedded map. Help would be appreciated. Thanks.

                                    Intermediate & Advanced SEO | | nchlondon
                                    0
                                  • iamgreenminded

                                    Trailing Slashes for Magento CMS pages - 2 URLS - Duplicate content

Hello, can anyone help me find a solution so Magento CMS pages only use one URL, not two? I.e. www.domain.com/testpage and www.domain.com/testpage/. I found a previous article that applies to my issue, which uses .htaccess to 301-redirect requests from the non-slash URL to the slash URL. I don't fully understand the syntax in .htaccess, but I used the code below. It fixed the CMS page redirection but caused issues on other pages, like all my categories and products, with this error: "This webpage has a redirect loop (ERR_TOO_MANY_REDIRECTS)".

    # Assuming you're running at domain root; change RewriteBase if needed.
    RewriteBase /

    # www check. If you're running in a subdirectory, you'll need to add
    # that into the redirected URL (http://www.mydomain.com/subdirectory/$1).
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]

    # Trailing slash check - don't fix direct file links.
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_URI} !(.*)/$
    RewriteRule ^(.*)$ $1/ [L,R=301]

    # Finally, forward everything to the front controller (index.php).
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule .* index.php [QSA,L]

                                    Intermediate & Advanced SEO | | iamgreenminded
                                    0
                                  • browndoginteractive

                                    Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)

Hi guys, we have developed a plugin that allows us to display used-vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:

1. Vehicle Listings pages: where the user can use various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details pages: where the user actually views the details about said vehicle. These are served up via Ajax, in a dialog box on the Vehicle Listings pages. Example functionality: http://screencast.com/t/kArKm4tBo

The Vehicle Listings pages (#1) we do want indexed and ranking. These pages have additional content besides the vehicle listings themselves, those results are randomized or sliced/diced in different and unique ways, and they're updated twice per day.

We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results.

We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.

Robots.txt advantages: super easy to implement; conserves crawl budget for large sites; ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.

Robots.txt disadvantages: doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would lead to 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).

Noindex advantages: does prevent Vehicle Details pages from being indexed; allows ALL pages to be crawled (advantage?).

Noindex disadvantages: difficult to implement (the Vehicle Details pages are served using Ajax, so they have no head tag; the solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex directive based on querystring variables, similar to a Stack Overflow solution I found). This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it). It also forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required: the crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed. It also cannot be used in conjunction with robots.txt; after all, the crawler never reads the noindex meta tag if it's blocked by robots.txt.

Hash (#) URL advantages: by using hash URLs for links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with Javascript, the crawler won't be able to follow/crawl these links. Best of both worlds: the crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links that got robots.txt-disallowed pages indexed are gone. It accomplishes the same thing as nofollowing these links, but without looking like PageRank sculpting (?), and it does not require complex Apache configuration.

Hash (#) URL disadvantages: is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?

Initially, we implemented robots.txt - the "sledgehammer solution". We figured we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be as if these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that. If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallow in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed; it could easily get stuck/lost, it seems like a waste of resources, and in some shadowy way bad for SEO.

My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and conserves crawl budget while keeping the Vehicle Details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these. Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
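For the X-Robots-Tag approach mentioned above, a minimal sketch (assuming Apache 2.4+ with mod_headers, and a hypothetical querystring parameter name) might be:

```apache
# Send a noindex header whenever the vehicle-details querystring
# parameter is present (hypothetical parameter name "vehicle_id").
<If "%{QUERY_STRING} =~ /vehicle_id=/">
    Header set X-Robots-Tag "noindex, follow"
</If>
```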

                                    Intermediate & Advanced SEO | | browndoginteractive
                                    0
                                  • team_tic

                                    International SEO - cannibalisation and duplicate content

                                    Hello all, I look after (in house) 3 domains for one niche travel business across three TLDs — .com, .com.au and .co.uk — plus a fourth domain on a .co.nz TLD which was recently removed from Google's index. Symptoms: For the past 12 months we have been experiencing cannibalisation in the SERPs (namely .com.au being rendered in .com) and Panda-related ranking devaluations between our .com site and .com.au site. Around 12 months ago the .com TLD was hit hard (80% drop in target KWs) by Panda (probably) and we began to action the changes below. Around 6 weeks ago our .com TLD saw big overnight increases in rankings (to date a 70% averaged increase). However, almost to the same percentage, we suffered significant drops in our .com.au rankings. Basically Google seemed to switch its attention from the .com TLD to the .com.au TLD. Note: Each TLD is over 6 years old, we've never proactively gone after links (Penguin) and have always aimed for quality in an often spammy industry. **Have done:**
                                    Added hreflang markup to all pages on all domains
                                    Each TLD uses local vernacular, e.g. the .com site is American English
                                    Each TLD has pricing in the regional currency
                                    Each TLD has details of the respective local offices, the copy references the location, and we have significant press coverage in each country, e.g. The Guardian for our .co.uk site and the Sydney Morning Herald for our Australian site
                                    Targeted each site to its respective market in WMT
                                    Each TLD's core pages (within 3 clicks of the primary nav) are 100% unique
                                    We're continuing to rewrite and publish unique content to each TLD on a weekly basis
                                    As the .co.nz site drove so little traffic, rather than rewriting it we added noindex, and the TLD has almost completely disappeared (16% of pages remain) from the SERPs
                                    XML sitemaps
                                    Google+ profile for each TLD **Have not done:**
                                    Hosted each TLD on a local server
                                    Around 600 pages per TLD are duplicated across all TLDs (roughly 50% of all content). These are way down the IA but still duplicated.
                                    Served images/video from local servers
                                    Added address and contact details using schema markup Any help, advice or just validation on this subject would be appreciated! Kian
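The hreflang setup described in the "have done" list can be sketched roughly as below. This is a minimal, hedged example — the URLs and page path are hypothetical, not the poster's actual site — showing the shape Google expects: every regional variant lists itself and all of its equivalents, and the annotations must be reciprocal across all three TLDs.

```html
<!-- Placed in the <head> of the equivalent page on ALL three TLDs.
     URLs are hypothetical examples; each page's set must include itself
     and be mirrored exactly on the other domains (reciprocity). -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/tours/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/tours/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/tours/" />
<!-- x-default: the version served when no language/region variant matches -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/tours/" />
```

Missing or one-way (non-reciprocal) annotations are a common reason Google keeps swapping one TLD for another in the SERPs, which matches the cannibalisation symptom described above.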

                                    Intermediate & Advanced SEO | | team_tic
                                    1
                                  • AxialDev

                                    How do I geo-target continents & avoid duplicate content?

                                    Hi everyone, We have a website which will have content tailored for a few locations: USA: www.site.com
                                    Europe EN: www.site.com/eu
                                    Canada FR: www.site.com/fr-ca
                                    Link hreflang and the GWT geo-targeting option are designed for countries. I expect a fair amount of duplicate content; the only differences will be in product selection and prices. What are my options to tell Google that it should serve www.site.com/eu in Europe instead of www.site.com? We are not targeting a particular country on that continent. Thanks!
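For what it's worth, hreflang has no continent codes, but the closest approximation is usually a language-only value: a bare `en` acts as a catch-all for English speakers everywhere that a more specific language-country value doesn't claim. A hedged sketch for the URL structure above (treating /eu as the English fallback is an assumption, not the only option):

```html
<!-- hreflang cannot target "Europe" directly. A language-only "en" value
     catches English users in any country not claimed by a more specific code,
     so /eu is served to Europe (and everywhere else) while en-us overrides it
     for the USA. URLs match the question; the pairing logic is a suggestion. -->
<link rel="alternate" hreflang="en-us" href="http://www.site.com/" />
<link rel="alternate" hreflang="en" href="http://www.site.com/eu" />
<link rel="alternate" hreflang="fr-ca" href="http://www.site.com/fr-ca" />
<link rel="alternate" hreflang="x-default" href="http://www.site.com/" />
```

The trade-off: `en` users outside both Europe and the USA also get /eu, since hreflang offers no way to fence off a continent.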

                                    Intermediate & Advanced SEO | | AxialDev
                                    0
                                  • Creode

                                    Duplicate content on ecommerce sites

                                    duplicate content

                                    I just want to confirm something about duplicate content. On an eCommerce site, if the meta titles, meta descriptions and product descriptions are all unique, yet a big chunk at the bottom (featuring "why buy with us" etc.) is copied across all product pages, would each page be penalised, or not indexed, for duplicate content? Does the whole page need to be a duplicate for this to be a worry, or would this large chunk of text, bigger than the product description, have an effect on the page? If this would be a problem, what are some ways around it? Because the content is quite powerful, and is relevant to all products... Cheers,
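One common workaround — offered here as a suggestion, not the only fix — is to shrink the repeated chunk to a short summary on each product page and move the full copy to a single dedicated page, so the unique product description remains the dominant text. The class name, copy and URL below are hypothetical:

```html
<!-- Hypothetical sketch: a short shared teaser on every product page,
     with the full "why buy with us" copy living on one dedicated URL.
     The unique product description then outweighs the boilerplate. -->
<div class="why-buy-summary">
  <p>Free returns, 24/7 support and a price-match promise on every order.</p>
  <a href="/why-buy-with-us/">See all the reasons to buy with us</a>
</div>
```

A shared block like this generally won't get pages de-indexed on its own, but keeping it smaller than the unique content reduces the risk of pages being filtered as near-duplicates.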

                                    Intermediate & Advanced SEO | | Creode
                                    0
                                  • WSOT

                                    How get rid of duplicate content, titles, etc on php cartweaver site?

                                    My website http://www.bartramgallery.com was created using PHP and Cartweaver 2.0 about five years ago by a web developer. I was really happy with the results of the design, was inspired to get into web development, and have been studying ever since. My biggest problem at this time is that I am not knowledgeable with PHP and the Cartweaver product, but am learning as I read more. The issue is that the SEOmoz tools are reporting tons of duplicate content, duplicate page titles, etc. This is likely from the dynamic URLs and the same pages appearing with secondary results etc. I just made a new sitemap with AuditMyPC (I think it was called) in an attempt to get rid of all the duplicate page titles, but is that going to solve anything, or do I need to find another way to configure the site? There are many pages with the same content competing for page rank and it is a bit frustrating to say the least. If anyone has any advice it would be greatly appreciated, even pointing me in the right direction. Thank you, Jesse
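A sitemap alone won't consolidate dynamic-URL duplicates; the usual remedy is a rel=canonical tag on each product page pointing every parameterised variant (sort orders, secondary result pages, session parameters) at one preferred URL. A hedged sketch — the script name and parameter are hypothetical, not Cartweaver's actual URL scheme:

```html
<!-- Emitted in the <head> of every variant of the same product page,
     always pointing at the ONE preferred URL for that product.
     "detail.php?id=123" is a hypothetical example, not Cartweaver's real scheme. -->
<link rel="canonical" href="http://www.bartramgallery.com/detail.php?id=123" />
```

Since Cartweaver builds pages in PHP, the tag would normally be echoed from the page template using the product's canonical ID, so each duplicate variant self-identifies its preferred version.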

                                    Intermediate & Advanced SEO | | WSOT
                                    0
