Will deleting WordPress tags result in 404 errors or anything?
-
I want to clean up my tags, and I'm worried I'm going to look in my Webmaster Tools the next day and see hundreds of errors. What's the best way of doing this?
-
Hey Dan
Ha! Yes, a lot changes, but this is one thing that has stayed consistent: it's still totally fine to delete old tags. They will result in 404s, but those 404s are generally harmless!
-
8 years later, I learned something. Solid. Very solid. Thank you for sharing the knowledge, Dan!
8 years, however, is an eternity in the world of SEO. Are there any updates to your stance on this problem in particular?
Following your guide and generating reports to put them in the "Nay" bag, I want to delete them. I'm going to remove them from the index first by noindexing the tag taxonomy and removing it from the XML sitemap.
So, the big "What if" ... What if I simply delete all of them after doing so?
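For anyone who wants a concrete starting point, here's a rough sketch of noindexing the whole tag taxonomy in code. It's an example built on assumptions, not something from the original thread: it assumes WordPress 5.7+ (which added the wp_robots filter) and no SEO plugin already managing robots output; if you run Yoast, its per-taxonomy "show in search results" setting does the same job.

```php
<?php
// Rough sketch (not from the original post): send "noindex, follow" on every
// tag archive so the archives drop out of the index while their links stay
// crawlable. Requires WordPress 5.7+ for the wp_robots filter; place it in
// functions.php or a small plugin.
add_filter( 'wp_robots', function ( $robots ) {
    if ( is_tag() ) {
        $robots['noindex'] = true;
        $robots['follow']  = true;
    }
    return $robots;
} );
```

SEO plugins such as Yoast also drop noindexed archives from the XML sitemaps they generate, which covers the sitemap half of the plan above.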
-
Maybe 35 of them?
You can just leave those 35 indexed then.
I would just really like to get rid of most of them. It's so cluttered! Tags make up nearly 60% of my site. That's no good.
Bear in mind you can remove tags from being visible in site navigation (remove tag clouds, tags from footers, tags at the bottom of posts), but the tag archives will still exist. So you can remove the links on-site without actually deleting the pages.
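In practice, hiding the tag links without deleting the archives might look roughly like this in a theme's functions.php (a sketch only; the exact hooks depend on the theme, and block themes handle this in the Site Editor instead):

```php
<?php
// Sketch: hide tag links site-wide while leaving the tag archives in place.

// Remove the classic "Tag Cloud" widget so it can't appear in sidebars or footers.
add_action( 'widgets_init', function () {
    unregister_widget( 'WP_Widget_Tag_Cloud' );
} );

// Empty the tag links most themes print under each post via get_the_term_list().
// Themes that print their own "Tags:" label may still show the bare label.
add_filter( 'term_links-post_tag', '__return_empty_array' );
```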
Like you said in your posts, tag archives can be thin, and Google is not down with that. I want to merge the tags with that Yoast Term Optimizer tool, but it isn't available anymore.
I'm worried about doing redirects because that is a lot of redirects!
If you remove tags from on-page links and noindex the ones that aren't bringing traffic, that should solve things. 35 redirects for the rest that are bringing traffic is not too many.
I also have a question on categories. I currently have three categories, each with 12 subcategories. The subcategories are city names. (I work for a bar that has 12 locations.) Is that bad form? Would you set it up differently? And if I have a category named nightlife and a subcategory under it named Baltimore, should I check them both for a Baltimore nightlife post, or just the Baltimore one?
I would think about how often you'll post in each category. What you don't want is a category with only a post or two sitting in it for months or years. If you think that, after a year, each category and subcategory will have, say, 7-10+ posts in it, then it should be fine. It's all about having each category archive be full of content and unique!
-
Hey There
The post suggests keeping specific tags that are receiving traffic indexed. You can do this on a tag-by-tag basis with Yoast. The post also does not recommend deleting tags, just noindexing them.
I would suggest keeping the tags that are bringing the most traffic indexed, while just noindexing the rest. Do not delete them. Out of your 1,000+ tags, how many are responsible for the 700+ visits?
There's also the option to 301 redirect tags to the most relevant post or category. But again, I would only do this with tags that aren't bringing substantial traffic.
-Dan
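For the handful of tags that do earn traffic or links, the 301 option mentioned above could be sketched like this (the slugs and destinations are made-up placeholders, and a redirect plugin or server-level rules would work just as well):

```php
<?php
// Sketch: 301 a short, hand-picked list of retired tag URLs to more relevant
// pages. Assumes the default /tag/<slug>/ permalink base; adjust the pattern
// for a custom tag base. Slugs and targets below are placeholders.
add_action( 'template_redirect', function () {
    $map = array(
        'baltimore-nightlife' => '/category/nightlife/baltimore/',
        'happy-hour'          => '/category/nightlife/',
    );

    if ( ! is_404() ) {
        return; // only step in on requests that would otherwise 404
    }

    $path = trim( (string) parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH ), '/' );
    if ( preg_match( '#^tag/([^/]+)$#', $path, $m ) && isset( $map[ $m[1] ] ) ) {
        wp_safe_redirect( home_url( $map[ $m[1] ] ), 301 );
        exit;
    }
} );
```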
-
Dan,
I read your post and did all the spreadsheet stuff and I am still hesitant to noindex my tags. My blog has 321 posts with 1,240 tags. I understand that this is way too many and that I need to cut down. However, I am worried about losing the traffic I am getting from these tags, especially if I noindex them as you suggest.
About 3% of my monthly non-paid search traffic is coming from these tags. The bounce rate is 55%, which is only slightly higher than the site average. So, on average, about 715 people a month come in from the tags without bouncing, which is a little over 2% of my total non-bounce traffic. While it is only 2%, it is still a good number of people, and I don't want to lose this traffic. If I delete these tags and noindex the remainder, how can I expect to recover this traffic?
-
Fortunately, I predicted this question a year and nineteen days ago: http://www.evolvingseo.com/2012/08/10/clean-sweep-yo-tag-archives-now/
No, seriously: you just want to run that analysis to make sure you're not killing a random tag or two that has some traffic or links, in which case you can selectively delete and/or noindex.
-Dan
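If it helps, the audit step can be rough-counted right inside WordPress before touching analytics. A hypothetical, read-only helper (run it with WP-CLI's eval-file command or on a throwaway admin page; it is not part of the linked post):

```php
<?php
// Sketch: list every post tag with its post count and archive URL, lowest
// counts first, so the thin archives are easy to spot. Read-only; deletes nothing.
$tags = get_terms( array(
    'taxonomy'   => 'post_tag',
    'hide_empty' => false,
    'orderby'    => 'count',
    'order'      => 'ASC',
) );

if ( is_wp_error( $tags ) ) {
    $tags = array();
}

foreach ( $tags as $tag ) {
    $link = get_term_link( $tag );
    printf(
        "%-40s %5d  %s\n",
        $tag->slug,
        $tag->count,
        is_wp_error( $link ) ? '(no archive URL)' : $link
    );
}
```

Cross-reference the low-count slugs with landing-page traffic and backlinks before deciding which tags get kept, noindexed, redirected, or deleted.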
-
Wissam
Thanks for the most excellent link to the John Mueller post. I learned something new today thanks to you.
Best,
Robert
-
I wouldn't worry about them; it's totally fine to have 404 errors. Just make sure you don't have any quality links pointing to these pages (I don't think there will be), but other than that, don't worry about it.
On another note, please check out this Google+ post from John Mueller (Googler):
https://plus.google.com/+JohnMueller/posts/RMjFPCSs5fm
Hope it helps.
Cheers
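On the "make sure nothing good links to them" point: backlinks are best checked in a link tool, but a quick, crude way to spot internal links that still point at tag archives is something like the sketch below (it assumes the default /tag/ permalink base, and WP_Query's 's' parameter is a simple LIKE match, so expect a few false positives):

```php
<?php
// Sketch: list published posts and pages whose content still contains "/tag/",
// so internal links can be cleaned up before the tag archives go away.
$q = new WP_Query( array(
    'post_type'      => 'any',
    'post_status'    => 'publish',
    'posts_per_page' => -1,
    's'              => '/tag/',
    'fields'         => 'ids',
) );

foreach ( $q->posts as $post_id ) {
    echo get_permalink( $post_id ) . "\n";
}
```
-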
Related Questions
-
Google Search Console Showing 404 errors for product pages not in sitemap?
We have some products with URL changes over the past several months. Google is showing these as having 404 errors even though they are not in the sitemap (the sitemap shows the correct new URL). Is this expected? Will these errors eventually go away / stop being monitored by Google?
Technical SEO | woshea0
-
Should we set up redirects for all deleted TAGS?
We recently found our site had 65,000 tags (yes, 65K). In an effort to consolidate these, we've started deleting them. Moz is now reporting a heap of 404 errors for tag pages. These tag pages shouldn't have links to them, so we're not sure how they're still being crawled. Any suggestions from experience in this area would be useful.
Technical SEO | wearehappymedia0
-
Deleting Tags Properly - Advice Needed
I have over 18,000 tags. Needless to say, most of them are relatively useless to the user and generate no traffic, while cluttering the site. (I use WordPress.) My plan is to delete tags, but I want to do so safely so as not to accumulate website errors. (Tag pages are noindexed.) What process should I take here? Here was my basic plan (any help is appreciated): 1. Find irrelevant tags that are connected to hardly any posts. 2. Go into the post and remove said tag. 3. Now, with the tag having a count of 0, go into Tags and delete it. Safe, right? But now it seems those tag pages just turned into 404s ("Uh-oh... Page not found!"). Where do I go from here? Create 410s? Thanks, Mike
Technical SEO | naturalsociety0
-
Schema Markup Errors - Priority or Not?
Greetings all... I've been digging through Search Console on a few of my sites, and I've been noticing quite a few structured data errors. Most of the errors are related to hcard, hentry, and hatom. Most of them are missing author and entry-title, while the other one is missing fn. I recently saw an article on SEL about Google's focus on spammy markup. The sites I use are built and managed by vendors, so I would have to impress upon them the impact of these errors and have them prioritize, then fix them. My question is whether or not this should be prioritized. Should I have them correct these errors sooner rather than later, or can I take a phased approach? I haven't noticed any loss in traffic or anything like that; I'm more focused on what negative impact a "phased approach" could have. Any thoughts?
Technical SEO | AfroSEO0
-
Removing Media from Wordpress
I've run the SEOmoz on-page report and found an interesting issue. I'm using WordPress, and it seems that every picture I add to my articles gets added as a separate page on the site. I'm having to go to each and every picture and create a meta tag and description for it. I still get duplicate content issues all the same. On my Disqus system, the same pictures get added just as a page or article would appear. What can I do to avoid this?
Technical SEO | emasaa0
-
Authorship and Publisher on WordPress
I successfully enabled rel=publisher on our WordPress blog, and as a test I also enabled rel=authorship for a set of blog posts. (Tested both in Google's Rich Snippets Tester.) However, on the individual blog posts the publisher credit disappears. Is there a way to enable both to appear on blog posts?
Technical SEO | ufmedia0
-
Are 404 Errors a bad thing?
Good morning... I am trying to clean up my e-commerce site, and I created a lot of new categories for my parts. I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use or if, for some reason, one of them is still being picked up by Google's crawler. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them so that even if you type the old URL you will get a 404 page? In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you guys think I should be 404'ing them or adding them to robots.txt. Thanks
Technical SEO | Prime850
-
No Search Results Found - Should this return status code 404?
A question came up today on how to correctly serve the right status code on pages where no search results are found. I did a couple of searches on some major ecommerce and news sites, and they were ALL serving status code 200 for "No Search Results Found": http://www.zappos.com/dsfasdgasdgadsg http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=sdafasdklgjasdklgjsjdjkl http://www.ebay.com/sch/i.html?_trksid=p5197.m570.l1313&_nkw=dfjakljgdkslagklasd&_sacat=0 http://www.cnn.com/search/?query=sdgadgdsagas&x=0&y=0&primaryType=mixed&sortBy=date&intl=false http://www.seomoz.org/pages/search_results?q=sdagasdgasdgasg I thought I read somewhere that it was recommended to serve a status code 404 on these types of pages. Based on what I found above, all sites were serving a 200, so it appears this may not be the best practice. Any thoughts?
Technical SEO | WEB-IRS0