Robots.txt advice
-
Hey guys,
Have you ever seen code like this in a robots.txt file? I have never seen a Noindex rule in robots.txt before - have you?
user-agent: AhrefsBot
User-agent: trovitBot
User-agent: Nutch
User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow: /WebServices/
Disallow: /*?notfound=
Disallow: /?list=
Noindex: /?*list=
Noindex: /local/
Disallow: /local/
Noindex: /handle/
Disallow: /handle/
Noindex: /Handle/
Disallow: /Handle/
Noindex: /localsites/
Disallow: /localsites/
Noindex: /search/
Disallow: /search/
Noindex: /Search/
Disallow: /Search/
Disallow: ?
Any pointers?
-
Never seen this, and I doubt it's of any use: Noindex isn't among the statements any search engine recommends using in robots.txt. Since it isn't part of the robots.txt documentation, I don't think it has any impact on what search engine robots look at.
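You can see how a standards-following parser treats these lines with Python's standard-library robots.txt parser. This is just a sketch with a made-up hostname and a trimmed-down version of the rules above: the recognised Disallow directive is enforced, while the unofficial Noindex line is silently ignored.

```python
from urllib import robotparser

# Trimmed-down version of the rules from the question (hostname is illustrative)
rules = """
User-agent: *
Disallow: /search/
Noindex: /local/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# The documented Disallow directive is honoured...
print(parser.can_fetch("*", "http://example.com/search/foo"))  # False
# ...but the unrecognised Noindex line has no effect on crawling:
print(parser.can_fetch("*", "http://example.com/local/page"))  # True
```

Whether a given crawler does anything with Noindex is entirely up to that crawler; a parser built to the spec simply skips unknown directives.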
-
Best I could find was this:
"Unlike disallowed pages, noindexed pages don't end up in the index and therefore won't show in search results. Combine both in robots.txt to optimise your crawl efficiency: the noindex will stop the page showing in search results, and the disallow will stop it being crawled."
From: https://www.deepcrawl.com/blog/best-practice/robots-txt-noindex-the-best-kept-secret-in-seo/