Is there an SEO benefit to using tags in WordPress for my blog posts?
-
We have locations across the US and are trying to develop content so that we rank well for specific keywords at a local level, for instance "long tail keyword search in state" or "long tail keyword search near 76244", etc. The goal is to develop blog content pages that rank for those keywords. We are using Yoast and will be optimizing each post with that tool. My questions are:
1. Are there any benefits to adding a long list of tags to each post?
2. If yes, do I need to limit the number of tags?
3. Do we need to block indexing of those tags and categories in Yoast to avoid duplicate content issues?
Any insight on the best way to optimize these blog posts with the use of tags or other avenues would be greatly appreciated.
-
I agree with what Chris said: tags are primarily there for users. They should not be indexed; they exist to let people navigate your site more easily. If you index them, you will get duplicate content issues.
-
I appreciate the response, Chris. It makes sense that the tags would be more of an internal IA piece. I have come across some blogs that mention SEO in association with tags, so I wanted to make sure. In my mind it would seem conflicting to have Yoast providing the meta data and the tags providing the same.
Do others out there agree with all of this as well?
-
You will want to be selective with the categories and tags that you use. If you have multiple tag index pages with the exact same content you may be at some risk of serving nearly identical pages. It's still not a huge SEO issue as Google will likely just prefer one over the other.
If it does become an issue where multiple tags or categories are identical you can block indexing as you described.
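For reference, Yoast can do this blocking without any custom code: depending on the version, there is a per-taxonomy toggle (e.g. under Search Appearance → Taxonomies) that stops tag archives from being indexed. The end result is a robots meta tag on each tag archive page, along these lines:

```html
<!-- What a noindexed tag archive emits in its <head>; "follow" still
     lets crawlers pass through the links listed on the archive. -->
<meta name="robots" content="noindex, follow" />
```

This keeps the archives usable for visitors while taking them out of the running for the near-duplicate issue described above.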
Tags are primarily a feature for users to navigate subjects on your blog rather than a method to improve SEO.
The content quality and title of your posts is much more important in terms of optimization. We recently discussed ways to optimize content for search rankings just yesterday on our blog. I would recommend you focus on your content and keyword targeting as described there.
Hope this helps. Feel free to reach out if you have any other questions.
Related Questions
-
Weird SEO Problem - No Longer Ranking in Some Areas
Hi Everyone, I've got a weird SEO issue that I hope you'll be able to help with. I've broken it down into the key points below:
- Impressions for our primary and secondary keywords dropped dramatically on 02.10.17.
- Impressions have only dropped on non-geographical keywords. "UK" variants are still ranking well.
- Investigation shows we're not ranking outside of London at all for primary and secondary keywords.
- Primary and secondary keywords are still ranking well in London, the city where we're based.
- We've looked at our competition, who do rank for the primary keyword both in and outside London. We noticed we have our "postaladdress" in our schema; the competition don't have their address in their schema.
- We updated our schema 2 weeks ago and now use the Yoast schema, which is the same as our competitors use.
- Approx 1 week after removing the schema we started showing up for primary and secondary keywords again, but very low, fluctuating between page 15 and page 24. It's been 2 weeks now with no improvement.
- Ahrefs and Google Webmaster Tools both incorrectly report that we rank top 5, which is true to a degree, but only in London.

Thank you in advance!
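As a point of reference, the "postaladdress" mentioned above is normally a nested PostalAddress object inside LocalBusiness or Organization JSON-LD. A generic sketch (the business details here are placeholders, not the poster's actual markup):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Company",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "addressCountry": "GB"
  }
}
```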
Local Website Optimization | rswhtn0
-
Does having an embedded Google Map still count as a positive SEO signal?
I know this was true a few years ago; however, is there still an advantage to having an embedded map vs. a pop-up map in 2017?
Local Website Optimization | BigChad21
-
Subdomain vs. Separate Domain for SEO & Google AdWords
We have a client who carries 4 product lines from different manufacturers under a single domain name (www.companyname.com). Last fall, one of their manufacturers indicated that they needed to separate one of those product lines from the rest, so we redesigned and relaunched as two separate sites: www.companyname.com and www.companynameseparateproduct.com (a newly-purchased domain).

Since that time, the manufacturer has reneged on their requirement to separate the product lines, but the client has been running both sites separately since they launched at the beginning of December 2016. They have since cannibalized their content strategy (effective February 2017) and hacked apart their PPC budget from both sites (effective April 2017), and are upset that organic and paid traffic has correspondingly dropped on the original domain, and that the new domain hasn't continued to grow at the rate they would like (we did warn them, and they made the decision to move forward with the changes anyway).

This past week, they hired an in-house marketing manager, who is insisting that we move the newer domain (www.companynameseparateproduct.com) to become a subdomain of their original site (separateproduct.companyname.com). Our team has argued that making this change 6 months into the life of the new site will hurt their SEO (especially if we have to 301 redirect all of the old content back again, without any new content regularly being added), which was corroborated with this article. We'd also have to kill the separate AdWords account and the quality score associated with the ads in that account.

We're currently looking for any extra insight or literature that might help explain this to the client better, even if it is a little technical. (We're also open to finding out if this way of thinking is incorrect if things have changed!)
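For what it's worth, if the consolidation does go ahead, the usual mechanics are site-wide 301s mapping every URL on the standalone domain to its counterpart on the subdomain. A minimal sketch, assuming an nginx front end and the hostnames discussed above:

```nginx
# Hypothetical sketch: permanently redirect every path on the standalone
# product domain to the same path on the proposed subdomain.
server {
    listen 80;
    server_name www.companynameseparateproduct.com;
    return 301 https://separateproduct.companyname.com$request_uri;
}
```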
Local Website Optimization | mkbeesto0
-
Hreflang "no return tag" errors in sitemap.xml, and local search landing pages in the wrong language
Really need help. When our site is searched in Google (US) for keywords like "asus" or "asus zenfone 3", Google serves the global page instead of the local one, and Search Console reports "no return tags" hreflang errors. Another odd thing: when Googlebot crawls our sitemap.xml, it gets through less than a quarter of the file. Can you please advise on what needs to be edited or changed to make sure my implementation is correct and not returning errors?
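For context on the "no return tags" error: every hreflang annotation must be reciprocated by the page it points to. In a sitemap.xml implementation, that means each `<url>` entry lists all language variants, including itself, and each variant's entry lists the same set back. A sketch with placeholder URLs (not the site in question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/us/</loc>
    <!-- Every variant, including this page itself, is listed; the /tw/
         entry must list the same set back, or Search Console reports
         "no return tags". -->
    <xhtml:link rel="alternate" hreflang="en-us" href="https://example.com/us/"/>
    <xhtml:link rel="alternate" hreflang="zh-tw" href="https://example.com/tw/"/>
  </url>
</urlset>
```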
Local Website Optimization | June01270
-
International SEO - How to rank similar keywords for different countries
Hello MOZ friends.
Local Website Optimization | NachoRetta
I work at a digital marketing agency in Argentina, and since we have a lot of traffic from other Spanish-speaking countries like Mexico and Spain, we want to rank specific keywords for those countries.
We were thinking of putting new versions of the homepage in subfolders, for example /es/ for Spain, /mx/ for Mexico, etc. In these new subfolders we would place a very similar version of the homepage with a few minor modifications to target specific keywords in each country. For example, "marketing online" is more searched in Spain, while "marketing digital" is more used in Mexico and Argentina.
I understand that to implement this we would place an hreflang tag on the homepage directing visitors and crawlers to the correct version for each country. Is that right?
Another concern is whether, since they are very similar pages, Google would take them as duplicate content.
I read this:
https://moz.com/blog/the-international-seo-checklist
I am not completely sure about using subfolders for each country, but I don't know how else to position different keywords for different countries.
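The subfolder-plus-hreflang setup described above would carry the same complete set of annotations in the head of every homepage variant, roughly like this sketch (example.com is a placeholder domain):

```html
<!-- Placed in the <head> of each variant; the identical set appears on
     /, /es/, and /mx/ so the tags reciprocate, with x-default as the
     fallback for unmatched locales. -->
<link rel="alternate" hreflang="es-ar" href="https://example.com/" />
<link rel="alternate" hreflang="es-es" href="https://example.com/es/" />
<link rel="alternate" hreflang="es-mx" href="https://example.com/mx/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

With correct hreflang in place, Google generally treats the variants as localized alternates rather than duplicates, which speaks to the duplicate content concern above.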
Regards,
Juan Ignacio Retta
-
How to approach SEO for a national website that has multiple chapter/location websites all under different URLs
We are currently working with a client who has one national site - let's call it CompanyName.net - and multiple, independent chapter sites listed under different URLs that are structured, for example, as CompanyNamechicago.org, and sometimes specific to neighborhoods, as in CompanyNamechicago.org/lakeview.org. The national umbrella site is .net, while all others are .orgs. These are not subdomains or subfolders, as far as we can tell. You can use a search function on the .net site to find a location near you and click through to that specific local website.

They are looking for help optimizing and increasing traffic to certain landing pages on the .net site, but similar landing pages also exist at the local level and appear to be competing with the national site. (Example: there is a landing page on the national .net umbrella site for a "dog safety" campaign, but that campaign has also led to a landing page created independently on the local CompanyNameChicago.org website, which seems to rank higher when a user searches for this info while located in Chicago.)

We are wondering if our hands are tied here since they appear to be competing for traffic with all their localized sites, or if there are best practices to handle a situation like this. Thanks!
Local Website Optimization | timfrick0
-
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol and strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.

Say we have a restoration service franchise with over 40 franchises we perform SEO for:
- They are all over the US, and every franchise has their own local website, for example restorationcompanylosangeles.com.
- Every franchise purchases territories in which they want to rank; some service over 100 cities.
- Most franchises also have PPC campaigns. As part of our strategy we use the location reach data from AdWords to focus on their high reach locations first.
- We have 'power pages' which include 5 high reach branch preferences (areas in which the owners prefer to target) and 5 non branch preference high reach locations.
- We are working heavily on our national brand presence, working with PR and local news companies to build relationships for natural backlinks.
- We are developing a social media strategy for national brand outlets and local outlets.
- We use major aggregators to distribute local citations for our branch offices, and we make sure all NAP is consistent across all citations.
- We are partners with Google, so we work with them on new branches to create their Google listings (My Business & G+).
- We use local business schema markup for all pages.
- Our content protocol encompasses all the needed onsite optimization tactics: meta, titles, schema, placement of keywords, semantic Q&A, internal linking strategies, etc.
- Our leads are calls and form submissions. We use several call tracking services to monitor calls and callers' locations, and we are testing CallRail to start monitoring landing pages and keywords that generate our leads.

Parts that I want to change:
- Some of the local sites have over 100 pages targeted for 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words all talking about services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages, adding about 100 words about the city location; this is the only unique variable. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month. Traffic to the local sites is very scarce.
- Content protocol/strategy is only tested based on ranking. We have a tool that monitors ranking on all domains, but it does not account for mobile, local, or user-based preference searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic behind this is that if there is no local city page for a targeted location, there is less likelihood of ranking for that location, and if you are not seen then you will not get traffic nor leads.
- Ranking for power locations is poor, while less competitive low reach locations rank OK.
- We are updating the content protocol by tweaking small things (multiple variants at a time), then checking ranking every day for about a week to determine whether that experiment was a success or not.

What I need:
- An internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content.
- Unique content for 'power pages'. I know, based on dozens of chats here in the community and in Moz blogs, that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up.
- Blog content for non 'power' locations.
- A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc.
- To dig deeper into call metrics and their sources.

Now I am at a roadblock because I cannot develop valid content experiment parameters based on ranking. I know that A/B testing requires two pages that are the same except for one variable. We'd either noindex these or canonicalize; neither is in favor of testing ranking for the same term.

Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
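The noindex-or-canonicalize options mentioned just above would look like this on a test variant page (the URL is a hypothetical example, not one of the poster's actual pages):

```html
<!-- Option A: keep the test variant out of the index entirely. -->
<meta name="robots" content="noindex, follow" />

<!-- Option B: leave it crawlable but consolidate ranking signals to
     the original city page, which removes it from ranking comparisons. -->
<link rel="canonical" href="https://restorationcompanylosangeles.com/water-damage-los-angeles/" />
```

Either way the variant stops competing for the target term, which is exactly why ranking-based A/B tests are hard to construct.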
Local Website Optimization | MilestoneSEO_LA1