Should I noindex my categories?
-
Hello! I have created a directory website with a pretty active blog. I probably messed this up, but I pretty much have categories (for my blog) and a custom taxonomy (for different categories of services) that are very similar. For example, I have the blog category "anxiety therapists" and the custom taxonomy "anxiety".
1- Is this a problem for Google? Can it tell the difference between archive pages in these different categories even though the names are similar?
2- Should I noindex my blog categories, since the main purpose of my site is to help people find therapists, i.e. my custom taxonomy?
-
Kind of exciting, though. Every time Google picks up on a couple of URLs, my rankings shoot up. It's exciting to see ^_^
-
That was part of my apprehension about deindexing my blog categories: they are ranking right now... but I also pulled a dumb move and set all of my listing categories to noindex in Yoast a couple of months ago. I fixed this a month ago but am still waiting on Google to pick up on it. That's part of why I'm not sure about all of this. I don't know if things will change when Google starts noticing my listing categories.
-
"Instead of having "/anxiety" and also "/anxiety-counseling" on the same level, why not have "/conditions/anxiety" and also "/practitioners/anxiety" as well? That way the URLs are different but there's also a hierarchical structure which helps Google to work out which is which"
I've currently got it set up so that blog posts are
/category/anxiety
and listings are under
/listing-category/anxiety
Would you say this is sufficient to indicate to Google that these two are different?
-
If your categories get organic traffic, you can keep them indexed. If they don't bring in any traffic from the SERPs, don't index them.
-
It's unlikely that, if two pages are both very useful for a query, Google would de-list one purely because it's from the same domain. If neither page is very high value in terms of content or popularity, what you are suggesting can happen. But instead of taking the 'easy' way out and de-indexing one, your end goal should be to make every page as useful as possible!
You will rarely ever benefit in the SERPs by doing a 'quick easy thing' which adds no value to your site, pages or the wider web. Always ask how you could be informing, educating or entertaining the web in a fresh new way which hasn't previously been done. If you're doing what has been done before, you need to do it at least 3-4x better to steal that audience and exceed the historic popularity of other information sources
If your categories really are all on the same level, you might want to address that by having architectural (URL) layers to distinguish the categories. Whenever you say to yourself "I can't do better", that is a big problem, as not all of your competitors will share that same mindset. Do you want to be the one who gets ahead? Then you need to push on!
Instead of having "/anxiety" and also "/anxiety-counseling" on the same level, why not have "/conditions/anxiety" and also "/practitioners/anxiety" as well? That way the URLs are different but there's also a hierarchical structure which helps Google to work out which is which
I think you're right that your blogs may contain content that is more relevant to the queries which you have specified. That being said, de-indexing them doesn't magically make your commercial pages more relevant. It's not necessarily going to make your commercial pages rank better, or at all. As such - maybe doing heavier CRO on the non-commercial pages would be the most advisable solution!
If you ever find yourself thinking "aha I can do this quick clever thing to make Google do what I want instead of putting their users first" it's almost certainly the wrong tactic
-
Awesome! Thank you for your response ^_^. I'm not so concerned about getting one to rank over the other as much as I'm concerned that having one will cause the other not to rank at all, or to be significantly dampened.
2 problems lol
-
I really couldn't come up with a good category structure, so I have 30-40 categories all on the same level. It's a therapist directory, so all of the categories in question are pretty much diagnoses/therapeutic issues. I don't think I could create any better hierarchy... is that really bad?
-
I did something weird :-p. My blog categories are pretty much duplicates of my custom taxonomy but with "therapy" or "counseling" tacked on the end. I think it would be better to have my custom taxonomy set up this way, because it's about therapists and counselors, whereas my blog is about the subject in question... but my theme is set up in a way that would have made that look bad, resulting in long lists like this:
anxiety counseling
depression counseling
couples counseling
etc.
Do you think this is a problem? Should I go through all of the coding work to change it, or would something like this make little difference to Google? I.e., if someone searches for "anxiety therapy", would the blog archive "anxiety therapy" be more likely to come up than the archive of actual therapists who work with anxiety (called "anxiety"), because the names suggest the blogs are more relevant to the search query?
-
This is an interesting question and I can see why, with many modern agencies focusing on 'keyword cannibalisation' you would consider this action. What you have to realise is that Google still largely sees the web as a mass of interconnected pages. If your blog categories supply decent enough content to rank for those related terms, there's no guarantee that if you turn them off - Google will make the same evaluation of your business-aimed (service-level) categories instead
That being the case, I'd actually let time and data lead the way. In Google Analytics you will probably find that some service-level categories gain more traffic, whilst for some categories their contextual blog iterations bring in more
You might consider learning more about CRO (Conversion Rate Optimisation). In my opinion, there's rarely a time where turning traffic off is beneficial. But could those blog category URLs be re-designed to point users more easily (and more often) to their commercial counterparts? Probably
I do tend to no-index 'tag' URLs as they are messy and non-hierarchical, they can fudge up your equity flow from A to B. But actual categories with a hierarchical structure? Those are pages which you do want to rank
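In practice, no-indexing a tag archive while leaving category archives indexable usually comes down to a single robots meta tag in the tag template's head (Yoast can emit an equivalent tag from its taxonomy settings; the fragment below is a generic sketch, not Yoast's exact output):

```html
<!-- On tag archive pages only: keep them out of the index,
     but let crawlers still follow the links they contain. -->
<meta name="robots" content="noindex, follow">
<!-- Category archive pages carry no such tag, so they stay indexable. -->
```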
You might also consider whether there's some clever way to just have one category which lists posts and also commercial offerings on a given thematic basis. Really, architectural unification should be your end goal!
Remember: there's absolutely no guarantee that de-listing one category type would cause the other to rank. They're very different pages with contextually different content. Keep an eye on both and strategize to one day, eventually bring them together. That's what I would do!
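If you do decide to flip archives between indexable and noindex, one low-effort way to monitor the change yourself, rather than waiting on Search Console, is to check each archive URL's robots meta tag. A minimal sketch in Python (the example URL is a placeholder, and the regex assumes the `name` attribute precedes `content`, which is the common case but not guaranteed):

```python
import re

def robots_directives(html: str) -> set:
    """Extract directives from a <meta name="robots"> tag, lowercased."""
    match = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    if not match:
        return set()  # no robots meta tag: the page is indexable by default
    return {d.strip().lower() for d in match.group(1).split(",")}

def is_noindexed(html: str) -> bool:
    return "noindex" in robots_directives(html)

# Usage against a live archive page (placeholder URL):
#   import urllib.request
#   html = urllib.request.urlopen(
#       "https://example.com/listing-category/anxiety").read().decode()
#   print(is_noindexed(html))
```

Running this across both archive types on a schedule makes it obvious the moment a template starts (or stops) emitting noindex.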