Solve duplicate content issues by using robots.txt
-
Hi,
I have a primary website, and besides that I also have some secondary websites with the same content as the primary website. This leads to duplicate content errors. Because there are so many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK?
Thanks for any help!
-
Yes, robots.txt is the wrong way to go; I will try to use the canonical tag instead. Thanks for your help!
-
Using robots.txt is perhaps not the best way of doing it. Using a canonical tag or a noindex meta tag would likely be best. I think the reasons for this are summed up well in this article, which explains, probably better than I could, why robots.txt is not the best way of dealing with duplicate content. Hope this helps.
-
I have tried to use a cross-domain canonical, but it is too difficult for me. So I want to confirm whether using the robots.txt file is OK or not.
Thanks
-
Why not use a cross-domain canonical, whereby you reference the pages on your primary website as the canonical version on your secondary websites, thereby eliminating the duplication?
For example, on each page of your secondary website that duplicates a primary page, you would add the following to the head to reference the primary page:
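The snippet referred to would be a cross-domain canonical link element along these lines (the URL here is a made-up placeholder; substitute the matching page on your primary website):

```html
<!-- In the <head> of the duplicate page on the secondary site -->
<link rel="canonical" href="https://www.primarysite.com/example-page/" />
```

Google treats this as a strong hint to consolidate ranking signals onto the primary URL, while the secondary page remains visible to your visitors.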
Related Questions
-
Does a permanent redirect solve the issue of duplicate content?
Hi, I have a product page on my site as below:
www.mysite.com/Main-category/SubCatagory/product-page.html
This page was accessible in both of the following ways:
1. www.mysite.com/Main-category/SubCatagory/product-page.html
2. www.mysite.com/Main-category/product-page.html
This was causing a duplicate title issue, so I permanently redirected one to the other. But after more than a month and many crawls, Webmaster Tools' HTML Improvements still shows the duplicate title issue. My question is: does a permanent redirect solve the duplicate content issue, or am I missing something here?
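For reference, a server-side 301 on Apache might look like the sketch below (the paths are illustrative, and this assumes the nested URL was kept as the canonical one). Note also that the HTML Improvements report is often slow to refresh, so an already-fixed duplicate can linger there for a while even after a correct redirect:

```apache
# .htaccess at the site root: 301 the short URL to the nested version
RewriteEngine On
RewriteRule ^Main-category/product-page\.html$ /Main-category/SubCatagory/product-page.html [R=301,L]
```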
On-Page Optimization | Kashif-Amin
-
Where to add new content
I run a vBulletin website, and vBulletin isn't very SEO friendly. I do fairly well in Google for most of my keywords, but forums don't necessarily build strong page authority, etc. My site deals with fishing reports across the state of VA and drives 15-18k sessions and close to 100,000 page views a month, based on Google Analytics. I want to start targeting new keywords, and I am concerned about vBulletin's inability to be SEO friendly. Many of my new keywords aren't dynamic like the fishing reports that are added by members daily; these are more like campgrounds, marinas, etc. My thought is to install a WordPress blog and build out this content so I can efficiently deal with on-page SEO. The vBulletin software is installed in the root, so I would install WordPress in something like mydomain/lake123/. Is this the right thing to do, and will Google see multiple sitemaps (one for vBulletin and another for WordPress) and index appropriately? Am I missing something major here? Thanks ~ Brian
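On the multiple-sitemaps point: the sitemaps protocol allows more than one Sitemap directive in robots.txt, so both sitemaps can be declared at the domain root and search engines will discover each one. A sketch, with illustrative file names:

```text
# robots.txt at the domain root; both sitemaps can be listed
Sitemap: https://www.mydomain.com/sitemap_vbulletin.xml
Sitemap: https://www.mydomain.com/lake123/sitemap.xml
```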
On-Page Optimization | FCBCO
-
Duplicate title tags: how to solve them?
We currently run the "yellow pages." The problem is that Google Webmaster Tools reports a lot of duplicate title tags. It's because we have three languages and the title consists of the company name, for example: FCR Media Lietuva, UAB (the same in all languages). Of course, we write different meta descriptions and so on. How should we solve this problem, or should we just leave it as it is?
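If changing the titles is an option, one common workaround (a sketch with hypothetical names, not an official recommendation) is to template a differentiator such as a language label and category into the title, so the three language versions no longer collide:

```python
def build_title(company, category, lang_label, site_name="YellowPages"):
    """Build a per-language page title so each language version is unique.

    All names here are illustrative; substitute your real site and
    category names.
    """
    return f"{company} - {category} ({lang_label}) | {site_name}"

# The Lithuanian and English listings now get distinct titles:
lt = build_title("FCR Media Lietuva, UAB", "Advertising", "LT")
en = build_title("FCR Media Lietuva, UAB", "Advertising", "EN")
print(lt)
print(en)
```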
On-Page Optimization | FCRMediaLietuva
-
Duplicate title tag
Hello,
My site has problems with duplicate titles, as reported by Google Webmaster Tools. For example:
/extra-services/car-pick-up-service-146.html (1)
/extra-services/transportation--car-rails--146.html (2)
According to my sitemap, the first URL (1) is the right one, but I don't know where the second URL comes from.
On-Page Optimization | JohnHuynh
-
Does duplicate content harm individual pages or the whole site?
Hi, One section of my site is a selection of art and design books. I have about 200 individual posts, each with a book image and a description retrieved from Amazon (using their API). For several reasons not worth mentioning, I decided to use the Amazon description. I don't mind whether those pages rank well or not, but I need them as additional content for my visitors as they browse my site. The value lies in the selection of books. My question is whether the duplicate content taken from Amazon harms only each book page or the whole site. The rest of the site has unique content. Thanks! Enrique
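Since ranking those pages isn't a goal, one low-effort option is a robots noindex meta tag on each book page: the copied description stays out of the index while visitors can still browse the selection. A sketch:

```html
<!-- In the <head> of each book page that reuses the Amazon description -->
<meta name="robots" content="noindex, follow" />
```

The `follow` value keeps the page's internal links crawlable even though the page itself is excluded from the index.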
On-Page Optimization | enriquef
-
Duplicate Page Content Question
This article was published on fastcompany.com on March 19th. http://www.fastcompany.com/magazine/164/designing-facebook It did not receive much traffic, so it was re-posted on Co.Design today (March 27th), where it has received significantly more traffic. http://www.fastcodesign.com/1669366/facebook-agrees-the-secret-to-its-future-success-is-design My question is whether Google will dock us for reprinting/reusing content on another site (even if it is a sister site within the same company). If they do frown on that, is there a proper way to attribute the content to the source material/site (fastcompany.com)?
On-Page Optimization | DanAsadorian
-
What is the best duplicate content checker that will check by phrase?
I create a lot of landing pages for individual keywords on my site. An issue I've run into is that I unknowingly reuse some common phrases on different pages and therefore sometimes get dinged by Google. I'm basically looking for a tool that would check the content of a new page against all the other pages on my site, phrase by phrase. Most of the tools I've found make you put in two URLs to check against; I need to check against hundreds.
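Lacking a ready-made tool, a rough in-house check is not hard to script: break each page into word n-grams ("phrases") and report any phrase a new page shares with the existing ones, however many there are. A minimal sketch (all page text below is made up for illustration):

```python
from collections import Counter

def ngrams(text, n=5):
    """Return the set of lowercase word n-grams (n-word phrases) in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_phrases(new_page, existing_pages, n=5):
    """Map each phrase the new page shares with existing pages to the
    number of existing pages that contain it."""
    new_grams = ngrams(new_page, n)
    counts = Counter()
    for page in existing_pages:
        for phrase in ngrams(page, n) & new_grams:
            counts[phrase] += 1
    return counts

# Hypothetical page texts; in practice, load the text of every landing page.
existing = [
    "our landing pages are built to convert visitors into loyal customers today",
    "we help you convert visitors into loyal customers with proven tactics",
]
new = "this page will convert visitors into loyal customers in no time"
print(shared_phrases(new, existing, n=5))
```

Tune `n` to taste: shorter n-grams flag more incidental overlap, longer ones only catch near-verbatim reuse.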
On-Page Optimization | davegr
-
Best practice for franchise sites with duplicated content
I know that duplicate content is a touchy subject, but I work with multiple franchise groups, and each franchisee wants their own site; however, almost all of the sites use the same content. I want to make sure that Google sees each one of these sites as a unique site and does not penalize them for the following issues:
1. All sites are hosted on the same server, and therefore share the same IP address.
2. All sites use generally the same content across their product pages (which are very, very important pages): templated content approved by corporate.
3. Almost all sites have the same design (a few of the groups we work with have multiple design options).
Any suggestions would be greatly appreciated. Thanks again, Aaron
On-Page Optimization | Shipyard_Agency