Tags on WordPress Sites: Good or Bad?
-
My main concern is the overall tagging strategy. I first came across the concept on WordPress, where it seems to bring positive results to these sites, and now there are even plugins that auto-generate tags.
Can someone detail the pros and cons of tags? I was under the impression that Google does not want thousands of pages auto-generated just because of a simple tag keyword, each one showing the content relevant to that tag. These pages are usually just like search results pages... how are tag pages beneficial?
Is there something going on behind the scenes with WordPress tags that actually benefits these WP blogs? Building a custom-coded tag feature on a custom site just seems to create numerous spammy pages. I understand these pages may be good from a user perspective, but what about from an SEO perspective, in terms of getting indexed and driving traffic?
Getting indexed and driving traffic is my main concern here, so to recap: I'd like to understand the pros and cons of tags on WordPress vs. custom-coded sites, and the correct way to set them up for SEO purposes.
-
I approve of this comment
-
Hey There
For the most part, it is not a good idea to use tags just as a way to try to gain search traffic. They can be beneficial to users navigating within your site: someone may read an article and want to read other similar articles, so having a few tags at the bottom of the post can be useful. Putting tags in your sidebar for navigation is rarely useful, but if done in a reasonably user-friendly way it could work. I generally avoid "tag clouds" or having dozens of tag links in one spot.
In terms of the tag archives themselves (like mysite.com/blog/tag/tag-name/), they rarely look any different from other archives or from the posts themselves. Unless you have a giant site with so many posts that tags add a genuinely useful way to browse archives on a very specific topic, categories do that job well enough.
As for indexation: if it's a new site, or a site that has NOT ever had its tags indexed, I would advise not indexing them going forward either. The exception is the rare 0.5% of cases where this is done in an extremely intentional and strategic way, not for SEO but for users and site architecture. (Think of a site like Smashing Magazine or Search Engine Land with LOTS of content; that's a rare edge case where using tags for navigation and architecture might make sense.)
If you HAVE indexed tags already, I wrote an article on how to safely evaluate and noindex them.
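To illustrate the noindex approach in practice, here's a minimal sketch (not the method from that article; it assumes WordPress 5.7+, where the wp_robots filter is available, and would live in a theme's functions.php or a small plugin). It marks tag archives as noindex while leaving categories, posts, and pages untouched:

```php
<?php
// Minimal sketch: flag tag archives as noindex,follow via the wp_robots filter
// (WordPress 5.7+). Categories, single posts, and other pages are left alone.
add_filter( 'wp_robots', function ( $robots ) {
    if ( is_tag() ) {
        $robots['noindex'] = true; // keep tag pages out of the index
        $robots['follow']  = true; // still let crawlers follow links to the posts
    }
    return $robots;
} );
```

Most SEO plugins (Yoast, Rank Math, etc.) expose the same behaviour as a per-taxonomy setting, so hand-rolled code like this is only needed if you want to avoid a plugin.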
In general, I would avoid tagging a post with more than 4-5 tags. Tags should always be different from categories (more specific topics, rather than broad ones).
-Dan
-
So much depends on how you've implemented tags on your site and who your audience is.
It can be tempting to implement tags to try to make up for broken categorisation, and it's tempting to tag a page because it mentions a topic rather than because it's actually relevant to that tag.
It's worth taking a look at your analytics to see if (and how) your visitors are using your tag pages. I've seen many sites where visitors just don't use the tags (there are too many, they're meaningless, or they're not even obviously links!), and a lot of this depends on just how many tags you're using, how meaningful those tags are to people, and the relevance and quality of the articles associated with each tag.
Have you got internal search set up on your site, and are you capturing the search data in your analytics? This can provide some great insight into what people are struggling to find on your site and what they expect to find. It can also highlight areas where your information architecture (IA) isn't working.
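For instance, on a WordPress site one rough way to capture internal searches might look like the sketch below. It rests on assumptions the thread doesn't state: a GA4 property with gtag.js already loaded on every page, using GA4's recommended "search" event and "search_term" parameter.

```php
<?php
// Minimal sketch: report WordPress internal searches to GA4 so site-search
// terms show up in analytics. Assumes gtag.js is already loaded on the page.
add_action( 'wp_footer', function () {
    if ( ! is_search() ) {
        return; // only fire on internal search results pages
    }
    $term = wp_json_encode( get_search_query() ); // safely quoted for JS
    echo "<script>gtag('event', 'search', { search_term: {$term} });</script>\n";
} );
```

Many analytics setups can also pick up WordPress's default ?s= query parameter directly as site search, in which case no code is needed at all.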
(As James mentioned) If your tag pages are indexed and getting inbound search traffic, segment your non-paid search traffic and look at the bounce rate and other engagement metrics. How valuable is this traffic to you, and how relevant do those visitors find your tag pages as an answer to their query?
-
Also check how much traffic the tags are currently getting. One site I looked at in the past was getting around 16k unique visitors a month from some of its tags, so proceed with caution. I agree with the advice above as well.
-
Hey there
Dan Shure wrote a fantastic WordPress optimisation guide here on the Moz blog a year or so ago, and it is still very relevant today. In that post, he goes into depth about the problems with tags and what your best practice should be. Usually, you want to noindex the tags on your WP site: keep them for navigation purposes if you want, but letting them be indexed can lead to duplicate content issues.
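The same principle carries over to the custom-coded sites mentioned in the question. As a rough sketch (the /tag/ URL pattern here is purely an illustrative assumption, not something from the thread), the template that renders a tag listing could emit the equivalent robots meta tag in its head:

```php
<?php
// Minimal sketch for a hand-built (non-WordPress) site: if the requested URL
// looks like a tag listing page, print a noindex,follow robots meta tag.
// The /tag/ (or /blog/tag/) pattern is an illustrative assumption only.
$path = parse_url($_SERVER['REQUEST_URI'] ?? '/', PHP_URL_PATH);
$isTagPage = (bool) preg_match('#^/(?:blog/)?tag/[^/]+/?$#', (string) $path);

if ($isTagPage) {
    echo '<meta name="robots" content="noindex,follow">' . "\n";
}
```

That keeps the tag pages usable for visitors while telling search engines not to index the near-duplicate listings.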
Related Questions
-
Client Wants To Use A .io Domain Name - How Bad For Organic?
Hi, I have a U.S. client who is stuck on a name that he wants to register as a .io (British Indian Ocean Territory) domain for a new site. Aside from the user confusion/weirdness, how much harder do you think a .io domain makes this site's organic performance in the U.S. in the future? FYI, the other part of the domain name he wants to use is short, meaningless, and implies nothing in and of itself. Thanks!
White Hat / Black Hat SEO | | 945012 -
Should I delete older posts on my site that are lower quality?
Hey guys! Thanks in advance for thinking through this with me. You're appreciated! I have 350 pieces of Cornerstone Content that has been a large focus of mine over the last couple of years. They're incredibly important to my business. That said, less experienced me did what I thought was best by hiring a freelance writer to create extra content to interlink them and add relevancy to the overall site. Looking back through everything, I am starting to realize that this extra content, which now makes up a third of my site, is at about 65%-70% quality AND only gets a total of about 250 visitors per month combined, across all 384 articles. Rather than spending the next 9 months and investing in a higher-quality content creator to revamp them, I see the next best option as removing them. From a pro's perspective, do you think removing these 384 lower-quality articles and focusing my efforts on a better UX, a faster site, and continual upgrading of the 350 pieces of Cornerstone Content is my best option? I'm honestly at a point where I am ready to cut my losses, admit my mistakes, and swear to publish nothing but gold moving forward. I'd love to hear how you would approach this situation! Thanks 🙂
White Hat / Black Hat SEO | | ryj0 -
Good vs Bad Web directories
Hi, in this blog post Rand mentions a list of bad web directories. I asked a couple of years ago whether there is an updated list, as some of these (Alive Directory, for example) do not seem to be blacklisted anymore and are coming up in Google searches, etc. It seems that due to the age of the blog post (7 years ago) the comments are no longer responded to. Would anyone be able to advise which of these directories are still good to use? https://moz.com/blog/what-makes-a-good-web-directory-and-why-google-penalized-dozens-of-bad-ones
White Hat / Black Hat SEO | | IsaCleanse0 -
How to make second site in same niche and do white hat SEO
Hello, As much as we would like, there's a possibility that our site will never recover from its Google penalties. Our team has decided to launch a new site in the same niche. What do we need to do so that Google will not mind us having 2 sites in the same niche? (Menu differences, coding differences, content differences, etc.) We won't have duplicate content, but it's hard to make the sites not similar. Thanks
White Hat / Black Hat SEO | | BobGW0 -
Preventing CNAME Site Duplications
Hello fellow mozzers! Let me see if I can explain this properly. First, our server admin is out of contact at the moment, so we are having to take this project on somewhat blind (forgive the ignorance of terms). We have a client that needs a CNAME record set up, as they need sales.DOMAIN.com to point to a different data provider. They have a "store" platform that is hosted elsewhere, and they require a CNAME pointed at a custom subdomain they set up on their end. My question is, how do we prevent the CNAME subdomain from being indexed along with the main domain? If we process a redirect for the subdomain, then the site will not be able to go out and grab the other provider's info and display it. Currently, if you type in sales.DOMAIN.com it shows the main site's homepage. That cannot be allowed to happen; as we all know, having more than one domain with the exact same content = very bad for SEO. I'd rather not rely on Google to figure it out. Should we just have the CNAME host (where it's pointing) add a robots rule and have it set to not index the subdomain? The store does not need to be indexed, as the items change almost daily. Lastly, is an A record required for this type of situation in any way? Forgive my ignorance of subdomains, CNAME records, and related terms. Our server admin being unavailable is not helping this project move along. Any advice on the best way to handle this would be very helpful!
White Hat / Black Hat SEO | | David-Kley0 -
Finding and Removing bad backlinks
Ok here goes. Over the past 2 years our traffic and rankings have slowly declined, most importantly for keywords that we ranked #1 and #2 on for years. With the new Penguin updates this year we never saw a huge drop, just a constant slow loss. My boss has tasked me with cleaning up our bad links and reshaping our link profile so that it is cleaner and more natural. I currently have access to Google Analytics and Webmaster Tools, SEOMoz, and Link Builder.
1) What is the best program or process for identifying bad backlinks? What exactly am I looking for? Too many links from one domain? Links from low-PR or low "Trust URL" sites? I have gotten conflicting information reading about all this on the net, with some saying that too many good (high-PR) links can look unnatural without some lower-PR links, so I just want to make sure that I am not asking for links to be removed that we need to create or maintain our link profile.
2) What is the best program or process for viewing our link profile, and what exactly am I looking for? What constitutes a healthy link profile after the new Google algorithm updates? What is the best way to change it?
3) Where do I start with this task? Remove spammy links first, or figure out our profile first and then go after bad links?
4) We have some backlinks to our old .aspx pages from before we moved to our new platform 2 years ago; there are quite a few (1000+). Some of these pages were redirected, and some of the redirects were broken at some point. Is there any residual juice in these backlinks still? Should we fix the broken redirects, or does it do nothing? My boss says the redirects won't do anything now that Google no longer indexes the old pages, but other people have said differently. What's the deal: should we still fix the redirects even though the pages are no longer indexed?
I really appreciate any advice, as basically if we can't get our site and sales turned around, my job is at stake. Our site is www.k9electronics.com if you want to take a look. We just moved hosts, so there are some redirect issues and other things going on that we know about.
White Hat / Black Hat SEO | | k9byron0 -
Title Tag - Best Practices
I'm pretty new to SEO but think I'm starting to get a decent grasp on it. One thing I'm really struggling with is how to organize the meta title tags on my website. I work in real estate, and I'm noticing that a lot of my local competitors who rank for the top keywords seem to be using that particular keyword in every title tag within their website. An example would be www.paranych.com. Many of his internal pages have the phrase "Edmonton Real Estate" in the meta title tag, yet his home page is the page that ranks for that particular keyword. It doesn't seem logical to have every one of my pages featuring the same keyword, but there are many examples within my industry of this working. Is the best practice with meta title tags to have your keyword in every title tag of your site, or just on the home page? Thx, Barry
White Hat / Black Hat SEO | | patrickmilligan0 -
Subdomains vs. Subfolders WordPress Multisite
I am in the process of redesigning my organization's website using WordPress multisite. I am currently planning on creating subdomains for each of the locations, as I thought that having a keyword-saturated domain name would provide the best rankings. So the Omaha office would look like this: omaha.example.com. Would it be better to go with example.com/omaha? Things to consider: Google AdWords is currently a huge source of our traffic. Despite having very good organic rankings, we receive most of our traffic from pay-per-click sources. The "display URL" has a dramatic effect on our CTR, so I want to avoid subfolders if possible (for example, OmahaEmergencyDental.com receives far more click-throughs than EmergencyDental.com). Each location currently has its own domain and website (omahaemergencydental.com), and these sites/pages have been in place for several years. Thanks in advance!
White Hat / Black Hat SEO | | LoganYard0