Value in adding rel=next/prev when pages 2-n are "noindex, follow"?
-
Category A spans over 20 pages (a "view all" page isn't possible because it would get too long), so I have pages 1-20. Page 1 has unique content, whereas pages 2-20 of the series do not. I have "noindex, follow" on pages 2-20, and rel=next/prev across the series.
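As a sketch of the setup described above (the URLs are hypothetical placeholders, not the asker's actual site), the head of a middle page such as page 2 would carry both directives side by side:

```html
<head>
  <!-- Keep the page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
  <!-- Declare the page's position in the paginated series -->
  <link rel="prev" href="https://example.com/category-a/page/1/">
  <link rel="next" href="https://example.com/category-a/page/3/">
</head>
```

Page 1 would carry only the rel=next link (and no noindex), and page 20 only the rel=prev link.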
Question: Since pages 2-20 are "noindex, follow", doesn't that defeat the purpose of rel=next/prev? Don't I run the risk of Google thinking, "Hmm, this is odd. This website has noindexed pages 2-20, yet it is using rel=next/prev"?
Even if I don't run that risk, what is my upside in keeping rel=next/prev when, again, pages 2-20 are "noindex, follow"?
Thank you.
-
I don't see a downside to keeping rel=next/prev; I see only an upside.
Google doesn't have to obey the noindex tag, although it almost always does. Furthermore, adding rel=next/prev only makes your markup more correct, and since Google isn't the only service that looks for these tags, I would feel more comfortable with them present.
From an accessibility standpoint, it also makes sense to keep the rel=next/prev tags.