Pointless WordPress Tagging: Keep or unindex?
-
Simple as that.
Pointless random tags that serve no purpose other than adding apparent bulk to the website. The tag pages just show duplicate content, and the tags themselves are random keywords that serve almost no purpose. Most of them are used on only one page.
If I remove them, however, the site will probably drop from around 650 pages to 450 (assuming I keep any tags that were used more than once).
I have read through some of the other posts on here and I know that Google will do some work as far as duplicate content is concerned. Now as far as UX is concerned, all these tags are worthless. Thoughts?
-
I think that it would be nice to see more.... "This was my problem, and here is how I fixed it" posts on YouMoz.
Keep up the great work!
-
YES! I was thinking about turning this whole thing into a blog post about what I did to fix it. I'm trying to come up with a catchy title.
"The webs we weave, when we practice to deceive"
That may be a little too on the nose. But I think you catch the drift!
Thanks again to both of you!
-
Nice work!
-
Sounds like you've been busy! It will be interesting to see how all of this plays out over the next few months. Keep notes of what you've done; it could make for a good blog post.
-
Update
I have officially removed ALL TAGS. I also found out that our previous web guy had placed some rules in the .htaccess that altered how URLs were displayed, so no "tag" pages were showing in the URL. Which was news to me! Previously I had been using a quick-redirect plugin, but with the number of 301s involved I figured it was time to move them into the .htaccess instead.
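For the curious, the rules I plugged in look roughly like this (the paths here are made-up examples, not our real URLs):

```apacheconf
# One 301 per removed tag page (mod_alias):
Redirect 301 /tag/some-random-keyword/ /our-main-category/
Redirect 301 /tag/another-one-off-tag/ /our-main-category/

# Or, to sweep every remaining /tag/ URL to one place (mod_rewrite):
RewriteEngine On
RewriteRule ^tag/ /our-main-category/ [R=301,L]
```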
It has been a few weeks since the changes.
- Pages Crawled: 194
- High Priority Issues: 9
- Medium Priority Issues: 324 (although I know what caused this, and it won't be in the next audit)
And for the first time in the little over a month that I have been working here, we finally saw green in our traffic numbers! We are up 4%.
Thanks again for all your help!
-
UPDATE
So as of 7/9 I have removed 423 tags and redirected them to two of our main tags.
I isolated the two most effective tags and spread them across the 160 posts that we have. I limited it to one tag per post so as not to create duplicate content, and because I am leaning towards removing tags altogether.
Part of the other reason that I am doing this is because our categories seem to be gaining more traction than our tags.
My thought moving forward would be to take all of these, and fold them into the categories.
My Audit reads as follows:
- Pages Crawled: dropped from 624 to 240
- High Priority Issues: dropped from 42 to 4
- Medium Priority Issues: dropped from 141 to 102
It is much too early to tell how things are going with traffic. Our visits are down 2% (and nobody checks into rehab until at least a week or two after the holiday weekend), but our keywords sending visits are up 9.
I will keep you all informed!
-
As much as I wish these pages were not a problem (given that I could probably write a blogella, a short-story blog, about just how messy this website I inherited actually is), I am inclined to think they are doing more harm than good. Our numbers are staggering:
We have 660 indexed pages, and 440 of them are from WordPress tags.
From our links, we have 145 different root domains accounting for 11,000 inbound links.
We have something like 18K internal links.
Things are not good over here.
Almost none of the pages have meta tags, alt tags, or proper H1/H2/H3 structure.
-
For example we have 27 articles, and 440 tags.... that should give you some insight as to the website I'm trying to clean up....
With this information, I can say that I would delete all 440 tags if this were my website. I would not need to think about it. These pages are going to be duplicate content and dangerous to the health of the site. They will also be a power sink.
-
That's skewed any way you look at it. But still: put them all in a secondary sitemap.html so they are not orphans, remove them from sitemap.xml, place noindex in the HTML HEAD, and still try to consolidate where possible.
In general, we do not want to get rid of pages that are not a problem, as they can receive organic traffic for keywords we did not target and have no real other way of discovering. The web is an organic momentum flux, not a solid-state structure; it needs some degree of unintended, uncalculated behavior, including in website structure. Otherwise the sum of the parts of all pages in Google would equal the value of Google, which is not the case. Google connects dots; we interpret and find new meaning and relations, translating into traffic we did not expect.
The momentum flux is a joke, of course. It's a quantum state, of course.
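For what it's worth, the noindex suggestion above is just a one-line meta tag in each tag page's HEAD. The exact robots value is my assumption; "noindex, follow" drops the page from the index while still letting bots follow its links:

```html
<head>
  <!-- keep this tag page out of the index, but let bots follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```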
-
I don't want to speak for EGOL here, but I don't think he is suggesting you CUT everything. What I got from his post was: pull what's worthless and redirect (or, as you say, consolidate) to what's worthwhile.
The webmaster before me was writing articles with a spinner (at least I believe he was), so we end up with a title and then 30 WP tags. Of the 30 keywords, maybe 5 will be tagged on 5 other posts, 5 will be tagged on at least one other post, 2 will be branded, and the rest are one-off keywords that are very random and almost partial sentences.
For example we have 27 articles, and 440 tags.... that should give you some insight as to the website I'm trying to clean up....
-
No, do not use a 301, and certainly do not remove any pages from the index as mentioned here. That's foolish and uncalled for, and potentially harmful against little to no risk if you let them be and simply make them less prominent to users of the site. And if you really feel you need to cut the number of tag pages drastically, then use rel=canonical instead of a 301!
Consolidate, don't decimate! When we 301, we assimilate pages into one page. We say the old page is gone for good and the new one is the new page for the old link. This diffuses the keywords the old pages were found for, as it melts all the different pages that 301 to a page into one. When we use a canonical URL, however, we consolidate the pages into one new page that bundles the old ones. A page with a canonical to another page still ranks next to the new page for a while; only its title in search becomes that of the page the canonical refers to. With a 301 it disappears completely from the index, and the Google cache along with the internal keyword bindings it had before. So use a canonical, not a 301! And my advice: consolidate to one useful tag page with a real body of work, optimize it for a primary keyword like 'seo news' or something, and leave the old pages be, but don't link to them anymore from then on.
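A sketch of what that looks like in practice (the URL here is a placeholder, not a real one): each old tag page keeps its URL but declares the consolidated page as canonical in its HEAD.

```html
<!-- on each old tag page, pointing at the one consolidated tag page -->
<link rel="canonical" href="https://www.example.com/tag/seo-news/">
```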
Hope this is helpful.
Gr Daniel
-
Powerful quote regarding Google / Search Engine dependency!
-
"I don't trust having Google do stuff for me that I could do myself, because plenty of times Google says how they are gonna do things and then change their mind without tellin' anybody." -EGOL
That may be one of the best industry quotes I have ever read....
-
I would do a 301.
If you use the URL removal tool, that only works for Google.
If you do a 301, that happens on your server, and every attempt to access the page goes where you want it.
I don't trust having Google do stuff for me that I could do myself, because plenty of times Google says how they are gonna do things and then change their mind without tellin' anybody.
-
EGOL strikes again! That was my thought.
It would be better to remove the tags and do a 301, as opposed to removing them from the index with the URL Removal tool? Or are you saying add a 301 to them?
-
Go into your analytics and see if they are pulling any traffic from search. See if they are pulling in any traffic from referrals or social media.
I am willing to bet that those pages are dead weight.
Pages of this type do not exist on any of my sites.
So, if you find in the analytics that these pages are dead weight, then delete them, use 301 redirects, and turn them off in your content manager.
If you have content that you want to promote or that lots of people are lookin' at, then give those pages links in obvious locations on every page of your website. People will look at that... they will rarely click a tag.
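If you're on WordPress, "turn them off in your content manager" can be as simple as a few lines in the theme's functions.php. This is just a sketch, and the redirect target is my assumption (an SEO plugin's settings can do the same job):

```php
// sketch: 301 every tag archive to the homepage so the dead-weight
// tag pages stop resolving; adjust the target URL as needed
add_action( 'template_redirect', function () {
    if ( is_tag() ) {
        wp_redirect( home_url( '/' ), 301 ); // permanent redirect
        exit;
    }
} );
```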