Solve duplicate content issues by using robots.txt
-
Hi,
I have a primary website, and besides that I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are so many duplicate URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK?
Thanks for any help!
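For context, what I mean is blocking the secondary sites entirely, something like this (a sketch, assuming each secondary site serves its own robots.txt at its document root; the domain is a hypothetical placeholder):

```text
# robots.txt at the root of each secondary site,
# e.g. https://secondary-example.com/robots.txt (hypothetical domain)
# Blocks all crawlers from the entire site
User-agent: *
Disallow: /
```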
-
Yes, robots.txt is the wrong way to do it; I will try to use the canonical tag instead. Thanks for your help!
-
Using robots.txt is perhaps not the best way of doing it. Using a canonical or a noindex meta tag would likely be best. The reasons for this are best summed up in this article, which explains, probably better than I could, why robots.txt is not the best way of dealing with duplicate content. Hope this helps.
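For illustration, a noindex meta tag is a one-line sketch like the following. Unlike a robots.txt block, it lets Google crawl the page and actually see the instruction (if the page were blocked in robots.txt, crawlers could never read the tag):

```html
<!-- In the <head> of each duplicate page on a secondary site -->
<!-- Tells crawlers not to index this page; they must be able to crawl it to see this -->
<meta name="robots" content="noindex">
```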
-
I have tried to use a cross-domain canonical, but it is too difficult for me. So I want to confirm: is it OK or not if I use the robots.txt file instead?
Thanks
-
Why not use a cross-domain canonical, whereby you reference the pages on your primary website as the canonical version on your secondary websites, thereby eliminating the duplication?
For example on each page that is duplicate on your secondary website you would add the following to the head to reference the primary pages:
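Something along these lines, where `primary-example.com` is a hypothetical placeholder for the matching page on your primary site:

```html
<!-- In the <head> of the duplicate page on the secondary site -->
<!-- href points at the equivalent page on the primary site (hypothetical URL) -->
<link rel="canonical" href="https://primary-example.com/some-page/">
```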
Related Questions
-
Blog on server or embedded? Duplicate content?
Wondering what would be best in terms of SEO. Should I install some blog software on the website itself, or can I just embed, say, a blogger.com blog? If I did that, would it be considered duplicate content?
On-Page Optimization | Superflys0
-
My competitors are using black hat. What should I do?
My competitors are using on-page black hat methods, like keyword stuffing. What should I do?
On-Page Optimization | aman1231
-
Is it better to create more pages of content or expand on current pages of content?
I am assuming that one way of improving the rankings of current pages is to create more content around the keywords used. Should this be an expansion of the content on the current pages I am optimising for a keyword, or is it better to keep creating new pages? And if we are creating new pages, is it best to use an extension of the keyword on the new page? For example, if we are optimising one page for 'does voltage optimisation work', would it then be worth creating a page optimised for 'does voltage optimisation work in hotels', and so on? I am guessing both might help; this is just a question I have had from one of my clients.
On-Page Optimization | TWSI1
-
Duplicate Content - Blog Rewriting
I have a client who has requested a rewrite of 250 blog articles for his IT company. The blogs are dispersed across a variety of platforms: his own website's blog, a business innovation website, and an IT website. He wants to have each article optimised with keyword phrases and then posted onto his new website thrice weekly. All of this is in an effort to attract potential customers to his new site and to establish his company as a leader in its field. To what extent would I need to rewrite each article so as to avoid duplicating the content? Would there even be an issue if I did not rewrite the articles and merely optimised them with keywords? Would the articles need to be completely taken down by all current publishers? Any advice would be greatly appreciated.
On-Page Optimization | StoryScout0
-
Duplicate Content when Using "Visibility Classes" in Responsive Design Layouts - an SEO Problem?
I have text in the right column of my responsive layout which shows up below the principal content on small devices. To do this I use visibility classes for DIVs: one DIV with unique text that is visible only at large screen sizes, and a copy of the same text in another DIV that shows up only on small devices, while the first DIV is hidden. Technically I have the same text twice on my page, so might this be detected as duplicate content or spam? I'm concerned because hidden text on a page (e.g. in expandable/collapsible text blocks) is read by bots, and in my case they would detect it twice. Does anybody have experience with this issue?
Best,
Holger
On-Page Optimization | inlinear0
-
How do you avoid duplicate content when you sell products that are produced by other manufacturers?
I have a packaging product site, and they sell products from various manufacturers. What can we do with the product detail pages? As of now, the client has copy-pasted content straight from the "About" sections on the manufacturers' sites. Obviously, those manufacturers want my client to sell their products, and the products need to be described. How much of a no-no is this copy-pasting, and how can I fix it?
On-Page Optimization | lhc670
-
Duplicate Content
We offer wellness programs for dogs and cats. A lot of the information is the same, except for specifics that relate to young vs. senior pets. I have these different pages: Senior Wellness, Kitten Wellness, Puppy Wellness, Adult Wellness. Can each page have approx. 75% of the same text? Or should I rewrite each page so the information (though the same) appears unique?
On-Page Optimization | PMC-3120870
-
Duplicate Page Title Issue
Hi. We have 25 pages with a download form on them. People arrive at each page through a link with optimised anchor text which sits on the information pages. As there is no information on these pages, we do not need them to be optimised, so the developer has given all the download pages exactly the same page title. Although the pages in themselves are not significant, would this affect the way Google views the whole site, and would it pay to make each title unique, or doesn't it really matter? Alternatively, is there a better way to handle this, and if so, would that negate the benefit of the anchor text? Thanks
On-Page Optimization | PH2920