Robots.txt advice
-
Hey guys,
Have you ever seen code like this in a robots.txt file? I have never seen a Noindex rule in a robots.txt file before - have you?
User-agent: AhrefsBot
User-agent: trovitBot
User-agent: Nutch
User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow: /WebServices/
Disallow: /*?notfound=
Disallow: /?list=
Noindex: /?*list=
Noindex: /local/
Disallow: /local/
Noindex: /handle/
Disallow: /handle/
Noindex: /Handle/
Disallow: /Handle/
Noindex: /localsites/
Disallow: /localsites/
Noindex: /search/
Disallow: /search/
Noindex: /Search/
Disallow: /Search/
Any pointers? -
Never seen this, and I doubt it does anything useful, as Noindex isn't among the directives any search engine recommends using. I don't think it would have any impact on what search engine robots look at, since it's not a statement in the robots.txt documentation.
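For what it's worth, standard robots.txt parsers simply skip lines they don't recognise. A minimal sketch with Python's urllib.robotparser (the example.com URLs are just placeholders), showing that a Noindex line has no effect on crawl decisions while the Disallow line does:

```python
# Standard robots.txt parsers ignore unknown directives such as "Noindex".
# urllib.robotparser implements the original robots.txt convention, so only
# recognised lines (User-agent, Disallow, Allow, ...) affect its decisions.
from urllib import robotparser

rules = """
User-agent: *
Disallow: /search/
Noindex: /local/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# /search/ is blocked by the recognised Disallow rule...
print(rp.can_fetch("*", "https://example.com/search/page"))  # False

# ...but /local/ is still fetchable: the Noindex line is silently ignored.
print(rp.can_fetch("*", "https://example.com/local/page"))   # True
```

So at best a Noindex line in robots.txt relies on a particular crawler choosing to honour it; generic parsers treat it as noise.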
-
Best I could find was this:
"Unlike disallowed pages, noindexed pages don't end up in the index and therefore won't show in search results. Combine both in robots.txt to optimise your crawl efficiency: the noindex will stop the page showing in search results, and the disallow will stop it being crawled."
From: https://www.deepcrawl.com/blog/best-practice/robots-txt-noindex-the-best-kept-secret-in-seo/