Issue with duplicate content in blog
-
I have a blog where all of the pages get indexed and contain rich content, but the blog's tag and category URLs are also getting indexed. I have just added my blog to SEOmoz Pro, and when I checked my Crawl Diagnostics Summary it showed that some of my blog content is duplicated.
For example, www.abcdef.com/watches/cool-watches-of-2012/ is already indexed, but I have also assigned tags and categories to this post, and those archive pages have been indexed with the same content. How can I stop search engines from crawling these tag and category pages?
If I have a lot of nofollow tags on my blog, does that have a negative impact with search engines? Is there an alternative way to tell search engines to stop crawling these category and tag pages?
-
You should use "noindex, follow", not "noindex, nofollow". In my opinion, you don't need to use "nofollow" at all here.
Also, some people say it is OK to index your category pages, but if you do there should be some unique content describing what the category is about.
-
If I have a lot of nofollow meta tags on my blog, does that have a negative impact with search engines?
My tag and category URLs are already indexed, so I will add the recommended robots tags to those pages. Should I also use Google's page removal tool to remove those category and tag pages?
Also, these category and tag pages are listed in my XML sitemap, so do I have to remove them from the XML sitemap as well?
-
If you're using WordPress, you can use Yoast's WordPress SEO plugin to set your tag and/or category pages to "noindex, follow". This will tell search engines not to index those pages (which will take care of the duplicate content), but will still allow the links on those pages to be crawled.
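If the plugin isn't an option, a rough equivalent can be added by hand. This is only a minimal, hypothetical sketch using standard WordPress template tags (is_category(), is_tag(), the wp_head hook), placed in a theme's functions.php; Yoast's plugin does the same job with more care:
<?php
// Hypothetical sketch: mark category and tag archives as "noindex, follow"
// so they drop out of the index while the links on them are still crawled.
add_action( 'wp_head', function () {
    if ( is_category() || is_tag() ) {
        echo '<meta name="robots" content="noindex, follow">' . "\n";
    }
}, 1 );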
Hope this helps!
Related Questions
-
How to solve this issue and avoid duplicated content?
My marketing team would like to serve up 3 pages of similar content: www.example.com/one, www.example.com/two and www.example.com/three. However, the challenge is that they'd like to have only one page, with three different titles and images based on the user's entry point (one, two, or three). To avoid duplicate pages, how would you suggest this best be handled?
Intermediate & Advanced SEO | JoelHer0
-
Duplicate URL Parameters for Blog Articles
Hi there, I'm working on a site which is using parameter URLs for category pages that list blog articles. The content on these pages constantly changes as new posts are frequently added; the category may be, say, 'Health Articles' and list 10 blog posts (snippets from the blog). The URL could appear like so with filtering:
www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general
www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general&year=2016
www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general&year=2016&page=1
All pages currently have the same meta title and description due to limitations with the CMS, and they are not in our XML sitemap. I don't believe we should be focusing on ranking these pages, since their content comes from blog posts (which we do want to rank for on the individual post), but there are 3,000 duplicates and they need to be fixed. Below are the options we have so far:
Canonical URLs: have all parameter pages within the category canonicalize to www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general and generate dynamic page titles (I know it's a good idea to use parameter pages in canonical URLs).
WMT parameter tool: tell Google all extra parameter tags belong to the main pages (e.g. www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general&year=2016&page=3 belongs to www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general).
Noindex: remove all the blog category pages. I don't know how Google would react if we were to remove 3,000 pages from our index (we have roughly 1,700 unique pages).
We are very limited with what we can do to these pages; if anyone has any feedback or suggestions it would be much appreciated. Thanks!
Intermediate & Advanced SEO | Xtend-Life0
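For the canonical option described in the question above, a rough sketch of what the filtered variants would carry in their head section (the URLs are the hypothetical ones from the question, with an assumed http:// prefix; whether a parameterized URL makes a good canonical target is the open question raised there):
<!-- On .../?taxonomy=health-articles&taxon=general&year=2016&page=1 and similar filtered variants -->
<link rel="canonical" href="http://www.domain.com/blog/articles/?taxonomy=health-articles&amp;taxon=general">
-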
Blog Content In different language not indexed - HELP PLEASE!
I have an ecommerce site in English and a blog that is in the Malay language. We started the blog 3 weeks ago and have written about 20-30 articles. The ecommerce site uses the Magento CMS and the blog runs on WordPress. URL structure:
Ecommerce: www.example.com
Blog: www.example.com/blog
Blog category: www.example.com/blog/category/
However, Google is indexing all pages, including the blog category pages, but not the individual posts that are in Malay. What could be the issue here? Please help me!
Intermediate & Advanced SEO | WayneRooney0
-
Best way to remove duplicate content with categories?
I have duplicate content for all of the products I sell on my website due to categories and subcategories. For example:
http://www.shopgearinc.com/products/product/stockfeeder-af38.php
http://www.shopgearinc.com/products/co-matic-power-feeders/stockfeeder-af38.php
http://www.shopgearinc.com/products/co-matic-power-feeders/heavy-duty-feeders/stockfeeder-af38.php
The three URLs above lead to the same title and content. I use a third-party developer's backend system, so adding canonicalization seems difficult as I don't have full access. What is the best way to get rid of this duplicate content? Can I do it through Webmaster Tools, or should I pay the developer to implement canonicalization or a 301 redirect? Any suggestions? Thanks
Intermediate & Advanced SEO | kysizzle60
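If the developer goes the 301-redirect route mentioned in the question above instead of canonical tags, the rule might look roughly like this in the site root's .htaccess. This is a hypothetical sketch built from the URLs in the question; it assumes product filenames are unique across categories and that /products/product/... is the preferred version:
RewriteEngine On
# Leave the preferred /products/product/... URLs alone
RewriteCond %{REQUEST_URI} !^/products/product/
# Redirect any category- or subcategory-nested copy of a product page to the preferred URL
RewriteRule ^products/.+/([^/]+\.php)$ /products/product/$1 [R=301,L]
-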
About robots.txt to resolve duplicate content
I have a problem with duplicate content and titles. I have tried many ways to resolve it, but because of the site's code I am still stuck, so I have decided to use robots.txt to block the duplicated content.
The first question: how do I write a rule in robots.txt to block all URLs like these?
http://vietnamfoodtour.com/foodcourses/Cooking-School/
http://vietnamfoodtour.com/foodcourses/Cooking-Class/
.......
User-agent: *
Disallow: /foodcourses
(Is that right?)
And the parameter URLs:
http://vietnamfoodtour.com/?mod=vietnamfood&page=2
http://vietnamfoodtour.com/?mod=vietnamfood&page=3
http://vietnamfoodtour.com/?mod=vietnamfood&page=4
User-agent: *
Disallow: /?mod=vietnamfood
(Is that right? I have a folder containing the module; could I use Disallow: /module/* instead?)
The second question: which takes priority, robots.txt or the meta robots tag? What happens if I block a URL in robots.txt but that page's meta robots tag is "index, follow"?
Intermediate & Advanced SEO | magician
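For reference, a hedged sketch of the combined robots.txt the question above is asking about: both Disallow lines can live under one User-agent group, and because robots.txt rules are prefix matches, /foodcourses also covers everything beneath it. One caveat on the second question: if a URL is blocked in robots.txt, crawlers never fetch it, so any meta robots tag on that page is never seen.
User-agent: *
# Prefix match: blocks /foodcourses/Cooking-School/, /foodcourses/Cooking-Class/, etc.
Disallow: /foodcourses
# Prefix match including the query string: blocks /?mod=vietnamfood&page=2, &page=3, ...
Disallow: /?mod=vietnamfood
-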
Dynamic 301's causing duplicate content
Hi, I wonder if anyone can help? We have just changed our site, which was hosted on IIS, where the page URLs were like this: example.co.uk/Default.aspx?pagename=About-Us. The new page URL is example.co.uk/About-Us/ and the site now runs on Apache. The 301s our developer told us to use were in this format:
RewriteCond %{REQUEST_URI} ^/Default.aspx$
RewriteCond %{QUERY_STRING} ^pagename=About-Us$
RewriteRule ^(.*)$ http://www.domain.co.uk/About-Us/ [R=301,L]
This seemed to work from a 301 point of view; however, it also seems to allow both of the URLs below to return the same page:
example.co.uk/About-Us/?pagename=About-Us
example.co.uk/About-Us/
Webmaster Tools has now picked up on this and is flagging it as duplicate content. Can anyone explain why this is happening? I'm not totally clued up and our host/developer can't understand it either. Many thanks
Intermediate & Advanced SEO | GoGroup51
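One detail worth flagging in the rules quoted above: unless the substitution ends in "?" (or the QSD flag is used on Apache 2.4+), mod_rewrite re-appends the original query string to the redirect target, which would produce exactly the /About-Us/?pagename=About-Us variant described. A minimal sketch of the same rule with the query string discarded:
RewriteCond %{REQUEST_URI} ^/Default\.aspx$
RewriteCond %{QUERY_STRING} ^pagename=About-Us$
# The trailing "?" drops the old query string from the redirect target
RewriteRule ^(.*)$ http://www.domain.co.uk/About-Us/? [R=301,L]
-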
Duplicate content even with 301 redirects
I know this isn't a developer forum, but I figure someone will know the answer to this. My site is http://www.stadriemblems.com and I have a 301 redirect in my .htaccess file to redirect all non-www URLs to www, and it works great. But SEOmoz seems to think this doesn't apply to my blog, which is located at http://www.stadriemblems.com/blog. It doesn't seem to make sense that I'd need to place code in the .htaccess file of every sub-folder. If I do, what code can I use? The weirdest part is that the redirecting works just fine; it's just SEOmoz's crawler that doesn't seem to be with the program here. Does this happen to you?
Intermediate & Advanced SEO | UnderRugSwept0
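For reference, a single host-level rule of roughly this shape in the site root's .htaccess normally covers subdirectories such as /blog as well, since the rule matches on the requested hostname for any path (a minimal sketch, not the poster's actual file; a deeper .htaccess that enables its own RewriteEngine can override it):
RewriteEngine On
# Any request for the bare domain, whatever the path, goes to the www hostname
RewriteCond %{HTTP_HOST} ^stadriemblems\.com$ [NC]
RewriteRule ^(.*)$ http://www.stadriemblems.com/$1 [R=301,L]
-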
Diagnosing duplicate content issues
We recently made some updates to our site, one of which involved launching a bunch of new pages. Shortly afterwards we saw a significant drop in organic traffic. Some of the new pages list similar content as previously existed on our site, but in different orders. So our question is, what's the best way to diagnose whether this was the cause of our ranking drop? My current thought is to block the new directories via robots.txt for a couple days and see if traffic improves. Is this a good approach? Any other suggestions?
Intermediate & Advanced SEO | jamesti0