Ranking a Category Page vs. a Blog Post - Which Is Best for CTR?
-
Hi,
I am not sure whether I should try to rank with a category page or create a new post. Let me explain...
If I google 'basic SEO', I see an article from Rand with Authorship markup. That's cool, so I go straight to that result because I know there might be some good insight. BUT: 'basic SEO' is also a category at Moz, and it is not ranking.
On the other hand, if I google 'advanced SEO', the Moz category for 'advanced SEO' is ranking. But there is no authorship image, so users are much less likely to click on that result.
Now, I want to rank for a keyword that is very important to me (a content keyword, not a transactional one). For it, I have a category called 'yoga exercises'. But should I instead create a post about them just to increase CTR through Google Authorship?
I read in Google's guidelines that Authorship on homepages and category pages is discouraged.
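(For reference, the Authorship markup I'm talking about is just the standard rel="author" link on an individual post pointing at a Google+ profile - roughly like the sketch below, where the profile URL, author name, and post title are only placeholders.)

```html
<!-- Rough sketch of Google Authorship markup on a single blog post.
     The Google+ profile URL and author name are placeholders. -->
<article>
  <h1>5 Yoga Exercises for Beginners</h1>
  <p>Written by
    <a rel="author" href="https://plus.google.com/000000000000000000000">Jane Doe</a>
  </p>
  <!-- post content... -->
</article>
```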
Hope you have some insights that can help me out.
-
The basic theory is that as you go down a website's hierarchy, you move from short-tail, general themes to long-tail, specific topics. Here's the rough idea:
- Home Page -- very general short-tail
- Blog (or other main section page) -- general and short-tail
- Post Category (or a section subpage) -- specific and long-tail
- Blog Post -- very specific and long-tail
Basically, individual blog posts should ideally be the best sources of authoritative information on very specific topics, such as a post each on "international SEO," "e-commerce SEO," and "b2b SEO." All of these posts could sit within a category of "SEO strategy," for which the category page would aim to rank.
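To make that concrete for your case, the hierarchy might map to URLs roughly like this (hypothetical URLs, just to illustrate the idea):

```
example.com/                                    -> home page (very general, short-tail)
example.com/blog/                               -> blog index (general)
example.com/blog/category/yoga-exercises/       -> category page ("yoga exercises", specific)
example.com/blog/yoga-exercises-for-back-pain/  -> single post (very specific, long-tail)
```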
As far as click-through rate goes -- it depends on the keyword. You should aim to serve whatever best addresses the user intent behind a given search query. The more specific the query, the more likely it should be targeted with a dedicated post. The more general and informational the query, the more likely users will want to be taken to a collection of posts.
Now, I wouldn't know what to suggest for your website because I have not seen it. But I hope this helps!
Related Questions
-
Moz is showing that I have non-indexed blog tag pages - are they supposed to be non-indexed? My articles are indexed, just not the blog tags that take you to other similar articles. Do I need to fix this, or is it OK?
Moz is showing that my blog post tags are not indexed. My question is: should they be indexed? My articles are indexed, just not the tag pages that take you to similar posts. Do I need to fix this or not? Thank you
Intermediate & Advanced SEO | Tyler58910
-
Does your page need to be unique to rank?
What I mean by unique is: let's imagine I want to rank for "seo ranking factors." In order to compete, do I need to have a page (in terms of design) that is totally different from everything out there, or can I rank with a page that is presented in a very similar way to everything else out there but with different content? Thank you,
Intermediate & Advanced SEO | seoanalytics0
-
Tool to help find blog / news pages?
Do you guys know of any tools where, if I have a list of URLs, they can identify which ones are blog or news pages?
Intermediate & Advanced SEO | BobAnderson0
-
Best practice to prevent pages from being indexed?
Generally speaking, is it better to use robots.txt or a noindex meta tag to prevent duplicate pages from being indexed?
Intermediate & Advanced SEO | TheaterMania0
-
Article falls after maintaining rankings for years. Page penalty?
Hello, I have had an article consistently rank between positions 3-5 for the last two-plus years. Recently it dropped down to 11-13. All I did was add my Google+ picture to it. I have been hearing things along the lines of content rewrites. I am well aware of the fact that there are many duplicates of my article out there. Is that the legitimate problem, though? Those articles have links to my sites. I have even found other articles that link to my article that have been duplicated. So there's all sorts of duplicate syndication out there. I'm wondering if I should start asking people to take down my article. Any info on recent Google activity on this subject?
Intermediate & Advanced SEO | imageworks-2612901
-
Removing hundreds of old product pages - Best process
Hi guys, I've got a site about discounts/specials etc. A few months ago we decided it might be useful to have shop specials in PDF documents "pulled" and put on the site individually so that people could find the specials easily. This resulted in over 2,000 new pages being added to the site over a few weeks (there are lots of specials). However, two things have happened: 1 - we have decided to go in another direction with the site and are no longer doing this; 2 - the specials that were uploaded have now ended, but the pages are still live. Google has indexed these pages already. What would be the best way to "deal" with these pages? Do I just delete them, or do I 301 them to the home page? PS: the site is built on WordPress. Any ideas? I am at a complete loss. Thanks, Marc
Intermediate & Advanced SEO | cashchampion
-
I try to apply best duplicate content practices, but my rankings drop!
Hey, An audit of a client's site revealed that due to their shopping cart, all their product pages were being duplicated: http://www.domain.com.au/digital-inverter-generator-3300w/ and http://www.domain.com.au/shop/digital-inverter-generator-3300w/ The easiest solution was to just block all /shop/ pages in Google Webmaster Tools (redirects were not an easy option). This was about 3 months ago, and in months 1 and 2 we undertook some great marketing (soft social bookmarking, updating the page content, Flickr profiles with product images, product manuals onto SlideShare etc). Rankings went up and so did traffic. In month 3, the changes in robots.txt finally hit and rankings decreased quite steadily over the last 3 weeks. I'm so tempted to take off the robots restriction on the duplicate content... I know I shouldn't, but it was working so well without it. Ideas, suggestions?
Intermediate & Advanced SEO | LukeyJamo0
-
How do Google Site Search pages rank?
We have started using Google Site Search (via an XML feed from Google) to power our site search. So we have a whole load of pages we could link to of the format /search?q=keyword, and we are considering doing away with our more traditional category listing pages (e.g. /biology - not powered by GSS), which account for much of our current natural search landing traffic. My question is: would Googlebot treat these search pages any differently? My fear is that it would somehow see them as duplicate search results and downgrade their links. However, since we are coding the XML from GSS into our own HTML format, it may not even be able to tell.
Intermediate & Advanced SEO | EdwardUpton610