Using robots.txt to deal with duplicate content
-
I have 2 sites with duplicate content issues.
One is a WordPress blog.
The other is a store (Pinnacle Cart).
I cannot edit the canonical tag on either site. In this case, should I use robots.txt to eliminate the duplicate content?
-
The parameters to ignore will be any part of the URL that doesn't handle navigation, so look at what you can remove from the URL without breaking the link to the product page.
Take a look at this: http://googlewebmastercentral.blogspot.com/2009/10/new-parameter-handling-tool-helps-with.html
Remember, this will only work with Google!
This is another interesting video from Matt Cutts about removing content from Google: http://googlewebmastercentral.blogspot.com/2008/01/remove-your-content-from-google.html
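To answer the original question directly: robots.txt can block crawling of parameterized URLs (Googlebot supports wildcard patterns, though not every crawler does), but be aware that it only stops crawling. It does not consolidate link signals the way a canonical tag does, and a blocked URL can still appear in the index if other pages link to it. A minimal sketch, assuming the duplicates all carry a `p=catalog` query parameter like the example URLs elsewhere in this thread:

```
User-agent: *
Disallow: /*?p=catalog
```

Test a pattern like this in Google Webmaster Tools before deploying it, to make sure it doesn't block pages you actually want crawled.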
-
If the URLs look like this...
Would I tell Google to ignore p, mode, parent, or CatalogSetSortBy? Just one of those or all of those?
Thanks!!!
-
For WordPress, try: http://wordpress.org/extend/plugins/canonical/
Also look at Yoast's WordPress SEO plugin referenced on that page; I love it!
For the duplicate content caused by the dynamic content on the Pinnacle Cart site, you can use Google Webmaster Tools to tell Google to ignore certain parameters: go to Site configuration > Settings > Parameter handling and add the variables you wish to ignore to the list.
-
Hi,
The two sites are unrelated to each other, so my concern is not duplicate content between the two; there is none.
However, each site has its own duplicate content issues. I do have admin privileges to both sites.
If there is a WordPress plugin, that would be great. Do you have one that you would recommend?
For my ecommerce site using Pinnacle Cart, I have duplicates because of the way people can sort and search on the site. For example:
http://www.domain.com/accessories/
http://www.domain.com/accessories/?p=catalog&mode=catalog&parent=17&pg=1&CatalogSetSortBy=date
http://www.domain.com/accessories/?p=catalog&mode=catalog&parent=17&pg=1&CatalogSetSortBy=name
http://www.domain.com/accessories/?p=catalog&mode=catalog&parent=17&pg=1&CatalogSetSortBy=price
These all show as duplicate content in my Webmaster Tools reports. I don't have the ability to edit the head tag of each page to add a canonical link on this site.
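To illustrate why those URLs count as duplicates: they differ only by sort/paging parameters, so stripping those parameters collapses every variant down to one URL. The sketch below (the parameter names are taken from the example URLs above; adjust for your own cart) shows the normalization that Google's parameter-handling setting effectively performs:

```python
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

# Parameters that only change presentation (sorting/paging), not the content.
# Names taken from the example Pinnacle Cart URLs above.
IGNORABLE = {"p", "mode", "parent", "pg", "CatalogSetSortBy"}

def canonicalize(url):
    """Strip presentation-only query parameters, leaving the canonical URL."""
    parts = urlparse(url)
    kept = [(k, vs[0]) for k, vs in parse_qs(parts.query).items()
            if k not in IGNORABLE]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "http://www.domain.com/accessories/",
    "http://www.domain.com/accessories/?p=catalog&mode=catalog&parent=17&pg=1&CatalogSetSortBy=date",
    "http://www.domain.com/accessories/?p=catalog&mode=catalog&parent=17&pg=1&CatalogSetSortBy=name",
]
# All three variants collapse to http://www.domain.com/accessories/
print({canonicalize(u) for u in urls})
```

If removing a parameter this way still returns the same page in your browser, that parameter is a safe candidate for the "ignore" list in Webmaster Tools.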
-
What are your intentions here? Do you intend to leave both sites running? Can you give us more information on the sites? Are they aged domains? Is either of them currently attracting inbound links? Are they ranking? What is the purpose of the duplicate content?
Are you looking to redirect traffic from one of the sites to the other using 301 redirect?
Or do you want both sites visible - using the Canonical link tag?
(I am concerned that you say you 'cannot edit the tag'; do you not have full admin access to either site?)
There are dedicated canonical-management plugins for WordPress (if you have access to the wp-admin area).
You are going to need some admin privileges to make any alterations to the site so that you can correct this.
Let us know a bit more please!
These articles may be useful, as they provide detailed best-practice info on duplicate content and redirects:
http://www.google.com/support/webmasters/bin/answer.py?answer=66359
http://www.seomoz.org/blog/duplicate-content-block-redirect-or-canonical
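For reference, once you do have template access, the canonical tag discussed above is a single line in each page's head section; the URL below is just a placeholder for your preferred version of the page:

```html
<link rel="canonical" href="http://www.example.com/accessories/" />
```

Every duplicate variant of a page should carry a canonical tag pointing at the one preferred URL, which tells Google to consolidate ranking signals there.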