Can using nofollow on Magento layered navigation hurt?
-
Howdy Mozzers!
We would like to use nofollow, noindex on our Magento layered navigation pages once any two filters are selected. (We are using single-filter pages as landing pages, so we would like those indexed.)
Is it OK to use nofollow, noindex on these filter pages? Are there disadvantages to using nofollow on internal pages?
Matt Cutts mentioned refraining from using nofollow internally: https://www.youtube.com/watch?v=4SAPUx4Beh8
But we would like to conserve crawl bandwidth and PR flow across potentially hundreds of thousands of irrelevant/duplicate filter pages.
-
I understand I might be a little late, but I experienced this issue first-hand with a Magento site. Once I added a wildcard exclusion to the robots.txt file, my impressions and clicks improved noticeably.
-
Hi,
That is quite a few pages!
If the main issue is crawl-related, then robots.txt is probably the best way to go; the meta tags will still allow the pages to be crawled (they have to be crawled for the tag to be read). Check out the comments in this and this post for wildcard matching in robots.txt, which should do what you need. If the pages are already indexed, it might be wise to leave a bit of time so that the noindex tags are picked up, and then implement the crawl blocking in robots.txt (and test in GWT to make sure you are not accidentally blocking more than you think). In this case I think you could still leave out the nofollow meta tag, but this might just be personal opinion now - I'm not sure that in practice it would make much difference once you have noindexed the pages and blocked crawling!
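A minimal sketch of the wildcard approach discussed above. It assumes Magento's layered navigation appends each additional filter as an extra query parameter, so any URL combining two or more filters contains an "&" - adjust the pattern to your actual URL structure before relying on it:

```
User-agent: *
# Single-filter pages have one query parameter (no "&") and stay crawlable.
# Any URL whose query string contains "&" (i.e. two or more parameters
# combined) is blocked from crawling.
Disallow: /*&
```

Note this would also block non-filter URLs that happen to carry two parameters (e.g. sort plus pagination), so test the rule against real URLs in GWT's robots.txt tester first.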
-
Hi Lynn,
Thank you for your valuable input on the matter. Yes, we are using meta tags in the header. We are currently submitting the filter pages that we want indexed through the sitemap, so Googlebot should be able to reach those pages. Also, we are adding noindex, nofollow tags only on filter pages with a combination of more than two filters selected, as we do not need to go any deeper than that.
I understand your point about using noindex, follow instead of noindex, nofollow to prevent unexpected crawl issues. But on the other hand, don't you think we could conserve crawl bandwidth by using noindex, nofollow tags on filter pages that serve no purpose being crawled and probably won't be externally linked to either?
We currently have around 7 filters, some with many values. This can create combinations adding up to more than 500,000 filter pages...
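To illustrate how quickly filter combinations blow up, here is a short sketch that counts the distinct filter pages generated by a set of filters. The value counts per filter are made-up numbers for illustration, not the site's actual configuration:

```python
from itertools import combinations
from math import prod

# Hypothetical number of selectable values for each of the 7 filters
filter_values = [10, 8, 12, 6, 15, 5, 20]

def count_filter_pages(values, min_filters=1):
    """Count distinct filter-combination pages: for every subset of
    filters with at least min_filters members, each combination of one
    value per chosen filter is its own URL, so multiply the value counts."""
    total = 0
    for k in range(min_filters, len(values) + 1):
        for combo in combinations(values, k):
            total += prod(combo)
    return total

all_pages = count_filter_pages(filter_values)       # every filter page
deep_pages = count_filter_pages(filter_values, 3)   # 3+ filters: noindex candidates
```

Even with modest value counts, the 3-plus-filter combinations dominate the total, which is why blocking crawling of the deep combinations saves so much crawl budget.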
Thanks
-
Hi,
I assume you mean in a meta tag in the header of these pages? As a general rule I would avoid using nofollow and simply noindex the pages in question. If you are implementing this with a meta tag, then the pages will be reached from the layered-navigation links anyway, so they would become a dead end for both PR and the crawler - with the potential to cause unexpected crawl problems rather than optimising crawling.
As long as you are addressing, as best you can, any duplicate-content issues caused by the layered navigation (check out this post for a good rundown of the various solutions), I would leave the noindex in place and let the crawler follow the links as normal.
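For reference, the tag this suggests placing in the head of the deep filter pages would look something like this (keeping follow so the pages pass PR rather than becoming dead ends):

```html
<!-- In the <head> of filter pages that should drop out of the index
     but still pass link equity through their internal links -->
<meta name="robots" content="noindex, follow" />
```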