Can duplicate content issues be solved with a noindex robots meta tag?
-
Hi all
I have a number of duplicate content issues arising from a recent crawl diagnostics report.
Would using a robots meta tag (like below) on the pages I don't necessarily mind not being indexed be an effective way to solve the problem?
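For reference, a typical noindex robots meta tag of the kind being described goes in the page's head:

```html
<!-- In the <head> of each page that should stay out of the index -->
<meta name="robots" content="noindex">
```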
Thanks for any and all replies!
-
This is an old question... and the answer is yes. In fact, a page blocked in robots.txt can still be indexed if that same page is linked from an external site. Check this old Webmaster Help thread: http://www.google.com/support/forum/p/Webmasters/thread?tid=3747447eb512f886&hl=en That is why it is always better to use the meta robots noindex tag, to be really sure a page we don't want indexed stays out of the index.
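To illustrate the distinction this answer draws (the path below is a made-up example): a robots.txt rule only blocks crawling, not indexing, so a blocked URL can still appear in the index if other sites link to it, whereas a meta noindex on the page itself keeps it out.

```text
# robots.txt — blocks crawling, but NOT indexing;
# an externally linked URL can still end up in the index
User-agent: *
Disallow: /duplicate-page.html
```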
-
Yes it would, but I would rather use the canonical tag. All pages have PageRank, and even weak pages help your site rank better. Google once released their PageRank formula; since then they have changed it many times, but from testing we know that the main idea still holds true. Pages not in the index cannot add to your site's PageRank. Take a look at this page, it explains it very well: http://www.webworkshop.net/pagerank.html Use the calculator, it is very intuitive.
-
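The canonical tag suggested above is a single line in the head of each duplicate page, pointing at the version you want indexed (URLs here are hypothetical):

```html
<!-- On every duplicate variant of the page -->
<link rel="canonical" href="http://www.example.com/preferred-page/">
```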
Using a noindex meta tag is one way to resolve duplicate content issues. If you take this approach, you most likely want to use only "noindex" and not "nofollow": you don't want to prevent Google from following the links on the page, but simply stop the content from being viewed as duplicate.
If you wish to explicitly include "follow" you can, but it is unnecessary since it is the default setting.
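The two equivalent forms described in this answer ("follow" is the default, so stating it is optional):

```html
<meta name="robots" content="noindex">
<!-- identical in effect, with the default made explicit: -->
<meta name="robots" content="noindex, follow">
```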
Related Questions
-
Duplicate Issue
Hello Mozzers! We have a client going through a website revamp. The client is The Michelangelo Hotel, and they are part of Star Hotels. Star Hotels plans to create a section on their site for The Michelangelo, as opposed to maintaining a standalone site. They will then take the michelangelohotel.com domain and point it to the corresponding pages on the Star site. The guest will key in www.michelangelohotel.com, and will see the same content that can be found at www.starhotels.com/en/michelangelo-hotel-new-york. The problem we have is this: essentially the same content will be indexed twice, once on starhotels.com and once on michelangelohotel.com. This would seem to cause a duplicate content issue. What are your thoughts? Edit: I apologize, because I was not nearly clear enough here. The Star Hotels site will have 5 pages dedicated to The Michelangelo Hotel. The content will sit solely on that server as those 5 pages. Those 5 pages will each be indexed as 2 URLs. www.michelangelohotel.com <-> www.starhotels.com/en/michelangelo/ www.michelangelohotel.com/accommodations <-> www.starhotels.com/en/michelangelo/accommodations And so on. Thanks!
On-Page Optimization | FrankSweeney
-
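If both hosts do end up serving the same pages, one commonly discussed mitigation (not necessarily what this client chose) is a cross-domain canonical from each michelangelohotel.com page to its starhotels.com counterpart:

```html
<!-- On www.michelangelohotel.com/accommodations -->
<link rel="canonical" href="http://www.starhotels.com/en/michelangelo/accommodations">
```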
Duplicate Content aka 301 redirect from .com to .com/index.html
Moz reports are telling me that I have duplicate content on the home page because .com and .com/index.html are being seen as two pages. I have implemented a 301 redirect using various codes I found online, but nothing seems to work. Currently I'm using this code:

RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^jacksonvilleacservice.com
RewriteRule ^index.html$ http://www.jacksonvilleacservice.com/ [L,R=301]

Nothing is changing. I have given it several weeks but the report stays the same. Also, according to Webmaster Tools they can't see this as duplicate content. What am I doing wrong?
On-Page Optimization | omakad
-
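For comparison, a frequently suggested variant of this redirect matches against the original request line (THE_REQUEST) so it fires only on external requests for /index.html, which avoids redirect loops caused by DirectoryIndex subrequests. Whether this resolves the poster's specific case depends on their server configuration:

```apache
RewriteEngine On
# Only redirect when the browser literally asked for /index.html
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP
RewriteRule ^index\.html$ http://www.jacksonvilleacservice.com/ [R=301,L]
```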
Events in Wordpress Creating Duplicate Content Canonical Issues
Hi, I have a site which uses Event Manager Pro within WordPress to create events (as custom post types) on my blog. I use it to advertise cookery classes. In a given month I might run one type of class 4 times. The event page I have made for each class is the same; I duplicate it 4 times and just change the dates to promote it. The problem is that with over 10 different classes, which are then duplicated up to 4 times each per month, I get loads of duplicate content errors. How can I fix this without redirecting people away from the correct page for the date they are interested in? Is it best just to use a nofollow for ALL events and rely on the other parts of my site for SEO? Thanks, T23
On-Page Optimization | tekton23
-
Description tag/ duplicate content.
Quick question - will Google deem it duplicate content if I use the description tag text anywhere else in the on-page copy? Thanks, David
On-Page Optimization | newstd100
-
How can I make it so that the various iterations (pages) do not come up as duplicate content ?
Hello, I wondered if somebody could give me some advice on the problem of various iterations of the calendar page coming up as duplicate content. There is a large calendar on my site for events, and each time the page is viewed it is seen as duplicate content. How can I make it so that the various iterations (pages) do not come up as duplicate content? Regards
On-Page Optimization | Tony14Aug
-
How Should I Fix Duplicate Content in Wordpress Pages
In GWMT I see Google found 41 duplicate content issues in my WordPress blog. I am using the Yoast SEO plugin to avoid those types of duplicates, but the problem persists. You can check the screenshot here - http://prntscr.com/dxfjq Please help.
On-Page Optimization | mamuti
-
Meta Data definition for multiple pages. Potential duplicate content risk?
Hi all, One of our clients needs to redefine their meta title and description tags. They publish very similar information almost every day, so the structure they propose is the following: Structure 1: Type of Analysis + periodicity + date + brand name Examples 1: Monthly Market Analysis, 1/5/2012 - Brand Name Weekly Technical Analysis, 7/5/2012 - Brand Name Structure 2: Company Name + investment recommendation + periodicity Example 2: Iberdrola + investment recommendation (this text doesn't vary) + 2T12 (which means 2012, 2nd trimester) Regarding meta descriptions they want to follow a similar approach, replicating the same info every time with a slight variation for each publication. I'm afraid this may cause a duplicate content problem because of the resemblance of every "Market Analysis" or every "investment recommendation" published in the future. My initial suggestion is to define specific and unique meta data for each page, but this is not possible for them given the time it takes to do it for every page. Finally, I asked them to specify the date in each meta title of content published, in order to add something different each time and avoid a duplicate content penalty. Will this be enough to avoid duplicate content issues? Thanks in advance for your help folks! Alex
On-Page Optimization | elisainteractive
-
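Applied to the structures proposed in the question, the resulting tags would look something like this (the description text is a placeholder):

```html
<title>Monthly Market Analysis, 1/5/2012 - Brand Name</title>
<meta name="description" content="Our monthly market analysis for May 2012 ...">

<title>Iberdrola investment recommendation 2T12</title>
```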
How to avoid duplicate content on ecommerce pages?
I am currently building the site architecture for a very large ecommerce site. I am wondering how I should build it out if I have products that I want to include in multiple categories within my site. For example: let's say I sell fitness equipment and I have categories for things such as: Treadmills, Exercise Bikes, Stair Steppers, Weight Benches, etc. But then I also have specific brand category pages such as: Precor, Life Fitness, Hammer, Body Solid. So my question is, how do I structure this so I am building it correctly? If I sell a Precor treadmill I will want to include that product under the "Treadmill" category page as well as under the "Precor Equipment" category page. Can I get some advice on the best way to structure this? It's obviously something I want to avoid doing improperly at all costs and having to fix later. Thank you Jake
On-Page Optimization | PEnterprises
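One common pattern for this situation (URLs below are hypothetical) is to let the product appear under both category paths but have every variant declare a canonical to one preferred URL, so only one version is indexed:

```html
<!-- On /precor/precor-treadmill-x100/ (brand category path) -->
<link rel="canonical" href="http://www.example.com/treadmills/precor-treadmill-x100/">
```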