Solve duplicate content issues by using robots.txt
-
Hi,
I have a primary website, and besides that I also have some secondary websites that carry the same content as the primary website. This leads to duplicate content errors. Because there are many URLs with duplicate content, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is this OK?
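To be clear, what I have in mind is putting a robots.txt on each secondary website that blocks crawling of the whole site, something like this:

User-agent: *
Disallow: /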
Thanks for any help!
-
Yes, robots.txt is a bad way to do it; I will try to use the canonical tag instead. Thanks for your help!
-
Using robots.txt is perhaps not the best way of doing it. Using a canonical tag or a noindex meta tag would likely be best. The reasons for this are summed up in this article, which explains, probably better than I could, why robots.txt is not the best way of dealing with duplicate content. Hope this helps.
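For example, a minimal sketch of the noindex approach would be to add this meta tag to the head of each duplicate page on the secondary sites:

<meta name="robots" content="noindex">

Unlike a robots.txt block, this still lets Google crawl the page, so it can actually see the instruction and drop the page from its index.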
-
I have tried to use a cross-domain canonical, but it is too difficult for me. So I want to confirm: is using the robots.txt file OK or not?
Thanks
-
Why not use a cross-domain canonical, whereby you reference the pages on your primary website as the canonical versions on your secondary websites, thereby eliminating the duplication?
For example, on each duplicate page on your secondary website, you would add the following to the head to reference the primary page:
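(The URL below is a placeholder; each tag should point to the matching page on your primary website.)

<link rel="canonical" href="https://www.primary-site.com/example-page/" />

Google will then treat the primary URL as the canonical version to index, consolidating the duplicate signals onto it.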
Related Questions
-
Duplicate Content in Order Pages of Multiple Products
Hi, I have a website containing 30 software products. Each product has an order page. The problem is that the layout and content of these 30 order pages are very similar, except for the product name, for example:
https://www.datanumen.com/access-repair-order/
https://www.datanumen.com/outlook-repair-order/
https://www.datanumen.com/word-repair-order/
Siteliner reports these pages as duplicate content. I am thinking of noindexing these pages. However, in that case, if a user searches for "DataNumen Outlook Repair order page", he will not be able to see the order page of our product, which would drive that revenue away. So, how should I deal with such a case? Thank you.
On-Page Optimization | ccw
-
To avoid the duplicate content issue, I have created new URLs for the specific site I am posting to and redirected those URLs to the originals on my site. Is this the right way to do it?
I am trying to avoid the duplicate content issue by creating new URLs and redirecting them to the original URL. Is this the proper way of going about it?
On-Page Optimization | yagobi21
-
Content Mismatch
Hi, I've added my app to Search Console, and 480 content mismatch pages are reported. How can I solve this problem?
On-Page Optimization | Silviu
-
Do permanent redirects solve the issue of duplicate content?
Hi, I have a product page on my site, as below: www.mysite.com/Main-category/SubCatagory/product-page.html This page was accessible in both of the following ways: 1. www.mysite.com/Main-category/SubCatagory/product-page.html 2. www.mysite.com/Main-category/product-page.html This was causing a duplicate title issue, so I permanently redirected one to the other. But after more than a month and many crawls, the Webmaster Tools HTML Improvements report still shows the duplicate title issue. My question is: does a permanent redirect solve the duplicate content issue, or am I missing something here?
On-Page Optimization | Kashif-Amin
-
Can we list URLs on a website sitemap page which are blocked by robots.txt?
Hi, I need your help here. I have a website, and a few pages are created as country-specific (www.example.com/uk). I have blocked many country-specific pages via the robots.txt file. Is it advisable to list those URLs (blocked by robots.txt) on my website's sitemap (HTML sitemap page)? I really appreciate your help. Thanks, Nilay
On-Page Optimization | Internet-Marketing-Profs
-
Meta Descriptions - Duplicate Content?
I have created a meta description for a page that is optimized for SERPs. If I also put this exact content on the page for my readers, would it be considered duplicate content? The meta description and content would be listed on the same page with the same URL. Thanks for your help.
On-Page Optimization | tuckjames
-
Duplicate product URLs
Our site automatically creates shorter URLs for the products. There is a rel canonical tag in place, but Webmaster Tools shows these URLs have duplicate title tags. Here is an example: http://www.colemanfurniture.com/holden-desk.htm http://www.colemanfurniture.com/writing-desks-secretary-desks/holden-desk.htm Should the longer URL be redirected to the shorter one?
On-Page Optimization | thappe
-
Why does SEOmoz use /blog/content-title vs /category/content-title? Any difference?
Assume a brand-new blog being designed, and all other things equal. What are the pros and cons of using the URL structure /blog/content-title vs. /category/content-title? Note: both scenarios would be using categorical archiving.
On-Page Optimization | JasonJackson