Solve duplicate content issues by using robots.txt
-
Hi,
I have a primary website, and besides that I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are many URLs with duplicate content, I want to use the robots.txt file to prevent Google from indexing the secondary websites and fix the duplicate content issue. Is that OK?
Thanks for any help!
-
Yes, robots.txt is a bad way to do it; I will try to use the canonical tag instead. Thanks for your help!
-
Using robots.txt is perhaps not the best way of doing it. Using a canonical tag or a noindex meta tag would likely be best. I think the reasons for this are best summed up in this article, which explains, probably better than I could, why robots.txt is not the best way of dealing with duplicate content. Hope this helps.
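As a generic sketch (not specific to your sites), the noindex option would mean adding a tag like this to each duplicate page on the secondary sites:

```html
<!-- In the <head> of each duplicate page on the secondary site -->
<!-- "noindex" keeps the page out of Google's index; "follow" still lets links be crawled -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt block, this lets Google crawl the page and actually see the directive; a page blocked in robots.txt can still appear in results if other sites link to it.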
-
I have tried to use a cross-domain canonical, but it is too difficult for me. So I want to confirm: is using the robots.txt file OK or not?
Thanks
-
Why not use a cross-domain canonical, whereby you reference the pages on your primary website as the canonical version on your secondary websites, thereby eliminating the duplication?
For example, on each page that is duplicated on your secondary website, you would add the following to the head to reference the primary page:
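A sketch of that tag (www.primarysite.com and /page-name/ are placeholders; substitute your actual primary domain and the matching URL):

```html
<!-- In the <head> of the duplicate page on the secondary site -->
<link rel="canonical" href="http://www.primarysite.com/page-name/">
```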
Related Questions
-
How to fix thin content issue?
Hello! I've checked my website via Moz and received a "thin content" issue: "Your page is considered to have "thin content" if it has less than 50 words." But I know for certain that we have 5 text blocks with unique content, and each block consists of more than 50 words. Do you have any ideas about what may cause this issue? Thanks in advance, Yana
On-Page Optimization | yanamazault
-
Content for the Home Page
Hi All, I have a Videos website which contains Videos of all types, including family-safe content... The home page has sections and Videos listed. Now, for SEO purposes, I need to have content? This is what I read in most places. What kind of content can I place on a Videos website home page? I could write about a movie or actor, but would that content be of any use on the home page? We have an About Us page etc. so people know who we are.. Any ideas please..
On-Page Optimization | Nettv
-
Duplicate Content, http vs https
Hi All! I just discovered that a client of ours has a duplicate content issue. Essentially they have approximately 20 pages that have both an http and an https version. Is there a better way to handle this than a simple 301? Regards, Frank
On-Page Optimization | FrankSweeney
-
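For what it's worth, a sitewide 301 is the standard fix for the http/https duplication above. A minimal sketch, assuming an Apache server with mod_rewrite enabled:

```apache
# Redirect all http requests to their https equivalents with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

If redirects aren't possible, a rel="canonical" on each page pointing at the https version is an alternative.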
Duplicate content effects on overall rankings
Hi guys, I have a website that has 23 pages with duplicate content. These pages serve the same function, which is to enable customers to upload their images. There is not much content on each one, but we require a different page for each of our products. Here is an example page: http://www.point101.com/giclee_printing/upload#/upload I don't think it makes sense to use a canonical tag, as each page is for a different product and I think it's going to be difficult to differentiate each page. I was wondering: 1. Does this have a negative effect on the ranking of our homepage and other main product pages, or is it an issue we do not need to worry too much about? 2. Does anyone have any other ideas as to how we can resolve this issue? Thanks,
On-Page Optimization | KerryK
Kerry
-
I'm looking to put a quite lengthy FAQs tab on product pages on an ecommerce site. Am I likely to have duplicate content issues?
On an ecommerce site, we have unique content on the product pages (i.e. descriptions), as well as the usual delivery and returns tabs for customer convenience. So far we haven't had any duplicate content issues or warnings, which seems to be the case industry-wide. However, we're looking to add a lengthier FAQs tab which is still highly relevant to the customer but contains a lot more text than the other tabs. The product descriptions are also relatively small. Do you think this will cause potential duplicate content issues, or should it be treated the same as a delivery tab, for instance?
On-Page Optimization | creativemay
-
Duplicate content "/"
Hi all, I ran my website through the SEOmoz campaigns, and the crawl diagnostics give me a duplicate error for these URLs: http://www.mysite.com/cat1/article http://www.mysite.com/cat1/article/ So the URL with the "/" is a duplicate of the one without the "/". Can someone point me to a solution for this? Regards, Frederik
On-Page Optimization | frdrik123
-
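One common fix for the trailing-slash duplicates above is to pick one version of the URL and 301-redirect the other to it. A minimal sketch, assuming Apache with mod_rewrite, redirecting the slash version to the non-slash version:

```apache
# 301-redirect URLs ending in "/" to the version without it,
# but leave real directories (which need the slash) alone
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```

A rel="canonical" tag on both versions pointing at the preferred URL achieves a similar result if redirects aren't an option.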
Can I use the first sentence of my page content as the meta description tag as well?
I just want to copy the content on the page and use the first (or perhaps also the second) sentence of the content itself for my meta description tag. Is that OK? Or should the meta description tag be different?
On-Page Optimization | paulinap1983
-
Crawl Diagnostics - Duplicate Content and Duplicate Page Title Errors
I am getting a lot of duplicate content and duplicate page title errors from my crawl analysis. I am using Volusion, and it looks like the photo gallery is causing the duplicate content errors. Both are sitting at 231, which shows I have done something wrong... Example URL: Duplicate Page Content http://www.racquetsource.com/PhotoGallery.asp?ProductCode=001.KA601 Duplicate Page Title http://www.racquetsource.com/PhotoGallery.asp?ProductCode=001.KA601 Would anyone know how to properly disallow this? Would this be as simple as a robots.txt entry, or something a little more involved within Volusion? Any help is appreciated. Cheers Geoff B. (a.k.a. newbie)
On-Page Optimization | GeoffBatterham
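If the robots.txt route is the right one here, a sketch of the entry for the gallery URLs above would be (Disallow matches by prefix, so this also covers the ?ProductCode= variants):

```
User-agent: *
Disallow: /PhotoGallery.asp
```

Note this blocks crawling only; as discussed in the thread above, a noindex meta tag or a canonical pointing at the product page is often the safer fix for duplicate content.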